011 / 012 – FYP Progress

Chronophone – Speculative Chronoceptive Headset

The device provides its user with time information through a constant stream of sonic feedback, using the sense of hearing as a substitute for a sense of time. The transmitted information is processed in the auditory region of the brain; through prolonged exposure, one will be able to make sense of the time/sonic information, similar to the feeling of still being in the water after a long day of swimming. The device is worn on the head, with the sound transmitted through bone conduction to instil a better chronoception. This helps us have an internalised sense of time, instead of relying on watches, clocks, and alarms.

The concept behind this device is my take on a future line of wearable products that train us to tell time instinctively. The device also serves as an ultimate timing tool that can train our rhythm, timing of intervals, etc.

Why time?

Time is a very big part of our lives, controlling our days and our activities. We wear wrist watches and install clocks in our homes and offices so we always know what time it is. I ask: “How can sensory substitution take this further and improve the way we tell time?”

Circuitry for my device, made using Fritzing

Properties to play around with:

  • intervals
  • length of tones
  • number of reps
  • pitch tone
  • clicks
  • vibration?

Various combinations of feedback that I think will work well:

  • Constant interval; constant tone length; varying pitch tone; varying reps
  • Constant interval; varying tone length; varying pitch tone; varying reps
  • Constant interval; varying tone length; constant pitch tone; varying reps (like morse code)

Also, I think seconds should be the unit varying in reps and tone length (and maybe pitch?), minutes should vary in pitch and have intervals, and hours should vary in pitch but remain more of a constant background hum.
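To make one of the combinations above concrete (constant interval; constant tone length; varying pitch; varying reps), here is a rough Python sketch of how a time could be encoded into feedback parameters. Every number here is a placeholder I made up to illustrate the idea, not a final design value:

```python
def encode_time(hour, minute, second):
    """Encode a time as (pitch_hz, repetitions) pairs per unit,
    following the 'varying pitch, varying reps' combination.
    All ranges are placeholder values, not final ones."""
    return {
        "hour":   (50 + hour * 10, 1),        # low background hum; pitch encodes hour
        "minute": (200 + minute * 5, 1),      # mid tone; pitch encodes minute
        "second": (800, (second % 10) + 1),   # fixed-pitch clicks; reps encode seconds
    }

encode_time(9, 30, 7)
```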


Constant intervals between seconds, minutes, hours, days (S, M, H, D); constant tone length; varying tone

  • very distracting, and conveys information slowly
  • information becomes very washed out after prolonged exposure: the minute and hour tones aren’t very obvious

… … …

Week 11/12 (which week is it now??)

During the consult with prof last week, I proposed experimenting with TouchDesigner instead of a buzzer. Doing so allows me to stack the tones, which makes it easier to get a feel of what using the device will be like.

Just to recap, I think the ideal sounds will be:

  • Droning low hum for hours — signifies the length of an hour
  • Mid-low tone for minutes — something neutral and not too distracting
  • Clicks for seconds — Much like clock ticks which we are used to, and is not super disruptive.

With this, I set off to use the Clock and Audio Oscillator CHOPs to create the tones. (BTW, I’ll use an audio spectrum to determine the specific frequencies if I can edit the code before showing this to y’all, so if you see this message it means I haven’t done that.) (For now I think the unit is in beats per second? I think?)

  • For Seconds, I used Pulse at a frequency of 1bps, resulting in a single click every second. This isn’t ideal, as I can’t customise much in terms of the properties of the “click” sound, but for now we will use this.

  • For minutes, I used Sine waves (I was also considering Gaussian) at a frequency of 100 ~ 300, mapped across 0 to 59 minutes (within 1 hour). The effect was a nice tinnitus-like sound, which I think isn’t too bad since it sounds quite natural. I try to keep the tone as clean as possible so it doesn’t get too distracting, so Sine and Gaussian work best (but I chose Sine since it’s cleaner).
  • I think the only issue I have with this is that every time it transitions from 59min back to 0min, the sound changes drastically. I think this can be good and bad: good because people can immediately tell that it’s a new hour; bad because it’s too drastic.
  • Maybe I can use a lag (or I forgot which function?) to transition the sound smoothly
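For reference, the minute-to-frequency mapping (plus a lag-style smoothing step to soften the 59min-to-0min jump) could look something like this outside TouchDesigner. This is only a Python sketch: the 100 ~ 300 range comes from the setup above, while the smoothing factor is an arbitrary assumption:

```python
def minute_to_freq(minute, lo=100.0, hi=300.0):
    # linearly map minute 0..59 onto the frequency range
    return lo + (hi - lo) * (minute / 59.0)

def lag(current, target, factor=0.05):
    # move a fraction of the way toward the target on each update,
    # so the 59 -> 0 transition glides instead of snapping
    return current + (target - current) * factor

freq = minute_to_freq(59)            # top of the range at minute 59
for _ in range(10):                  # a few smoothing updates toward minute 0
    freq = lag(freq, minute_to_freq(0))
```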

  • For hours, it’s more complicated. I was trying to map the sounds from 0 to 11 hours, then from 13 to 23 hours. But that would mean the frequency of the droning sound increases from 12am till 12pm, then decreases after 1pm, which seems unintuitive.
  • So I used a bunch of Math and Logic CHOPs to work around it and set a more suitable cycle: the frequency increases from an almost inaudible drone at 4am till 3pm. This makes it better cos I think the droning should be most minimal when sleeping, and most obvious when awake.
  • From 4pm till 3am, the droning frequency decreases, just as the day goes into night.
  • The droning sound is a Triangle wave with frequency set between 20 to 100bps, which gives a nice machine-like whirring sound that, to me, feels good to hear, as it’s similar to machines whirring in the background. It’s kinda ambient to me, so I feel that it’s appropriate.
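The hour-to-drone mapping described above could be sketched like this. This is a hypothetical Python version of the Math/Logic CHOP chain: the 20 ~ 100 range matches the Triangle wave settings mentioned, but the exact shape of the curve is my own guess:

```python
def hour_to_drone(hour, lo=20.0, hi=100.0):
    """Map the 24h clock so the drone bottoms out around 4am
    and peaks around 3pm, rather than following 0..23 directly."""
    shifted = (hour - 4) % 24          # 0 at 4am
    if shifted <= 11:                  # 4am to 3pm: rising
        t = shifted / 11.0
    else:                              # 4pm to 3am: falling
        t = (24 - shifted) / 12.0
    return lo + (hi - lo) * t
```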
How the bone conduction headphone is worn
Shah’s bone conduction headphones
Buttons on the underside

Thoughts on this

I think what I can do next is to get an earpiece and wear it for a while, actively learning the frequencies and associating them with the actual time. I want to see what happens.

Any feedback on the sound? I want to finalise it a bit before testing it; anyway, it will take some time for the headphones to arrive, so I can refine the sound in the meantime.

Also, to develop this further, I can add more functions (I will talk more about that later).



So I think one big question is: what if we are in a place with loud noises? Will this device be useless? I borrowed Shah’s bone conduction headphones for a while and found that they’re not super effective there either. I think this is something I’ll need to experiment with and work on.


Maybe I could market my device as an implant rather than a wearable, which would make more sense. But I think that within the speculative setting, my device fits in as a wearable.

Also, I can safely say that I don’t need to use Arduino for this device at all anymore. If TouchDesigner or any desktop program works, why not just use it? Since it’s speculative, I don’t need to explain how it works beyond the Bluetooth pairing. It could be a manufactured internal timing chip linked to GPS timing, with its internal smart system, blah blah blah, future technology.

The most I see myself working on is creating buttons that can be used for timing and other functions within this device: connect them with a wifi module and batteries, connect to Adafruit.io, which connects to the main program (TouchDesigner?), Bluetooth-paired with the device. If this works within the exhibition, I’m happy.


I’ll be ordering some cheapo bone conduction headphones I found online and testing them out. If they work, I will reskin them with a sick cyberpunk casing instead of building a whole new thing. Since they’re Bluetooth, I can pair them with a laptop running everything during the exhibition, and users can wear them and walk around the room.

Sounds simple… I hope…

Further Thoughts

Anyway, I’m considering developing this device as much as I can, as I see its potential. From here, I can add more functions like timers or different sound modes, and I can add more of the sounds I found interesting within just the Audio Oscillator CHOP alone; I think there are more options to play with.

However, I don’t want to overpromise, so this is still just a thought.

Some other research

For another module, I had to write a short paper on anything related to design, so I wrote about my research topic and started from the root of my idea, inspired by Neil Harbisson, Moon Ribas, and Cyborg Arts / the Cyborg Foundation. After revisiting the research, I got clearer about my device, as I think their motto resonates with me and my concept.

I also discovered that Neil had already created something like what I’m working on now: a headpiece that tells time.

Image taken from https://thoughtworksarts.io/projects/time-sense/


The only difference is the interface: his uses heat around the head, which can be a little crude, since heat isn’t super precise in my opinion. Mine uses pitch, which is more complex, since I have to map appropriate pitches to the hours, minutes, and seconds. But seeing this makes me feel more validated: since someone has tried doing this before, it could probably work. And looking at how few wires I’ll probably use… (actually none?!) makes me happier 😀

Moon Ribas also did something called Kaleidoscope Vision, where she wore goggles that allowed her to see only in colour blobs. She also did a thing called SpeedBorg, where she created an earring that detects surrounding speed and returns vibration feedback. She was then able to learn how fast people around her walk. Her learning to use the goggles and reframe her perspective, plus her learning the speed of people around her and using it as art, gave me hope that my device can work if I test it by wearing it myself.

Image taken from https://www.lupiga.com/vijesti/katalonci-cudnim-zvukovima-odredili-boju-zagreba
Image taken from http://www.weltenschummler.com/tech-science/rp13-how-to-become-a-cyborg/

In short, I’m trying to create a device that turns people into cyborgs.

Just a small update to strengthen my concept.

Plans for next 2 weeks

  1. Order the parts I need
  2. (refine?) the sound
  3. Once done up, start wearing the device while syncing it with my laptop and actively learning the times
  4. Update Notion page (man I always get anxious when I don’t plan but it’s the least productive thing I ever do and I don’t really follow my plans anyway…)
  5. Start working on next product: Weather Sensograph… or the money sensor
  6. Presentation slides for second interim
    1. Refresher short intro on concept
    2. Progress documentation(?)
    3. Moodboard and sketches(?)
    4. Current prototype
    5. Timeline ahead

Small mental note

Anyway, I’m happy with my current understanding of my project. Although I’m still a bit dissatisfied that my points are all over the place, and it’s hard for me to pinpoint what exactly the crux of my concept is and what backs it up, I’m no longer struggling with the thinking part of the project… at least for now. I hope I’ll be able to make sense of everything when I write my report 🙁

009 / 010 – FYP Progress

What I have for now:

Ok, the past 2 weeks I was busy with other things, so I really apologise for the lack of progress 🙁 I’m finding it hard to cope with work-study and a side hustle, and I know it’s affecting my FYP. I’m making an effort to reduce these external commitments and focus on FYP.

I have not followed through with making more prototypes, sketching, thinking about ideas, or even researching.


My only real progress was making one Anchorwatch prototype, which I showed on Wednesday. This was done last week.

Thanks Shah for allowing me to use his filament and 3D printer!

oops the part broke
the case for the Arduino Nano, which doubles as a cap for the part that houses the batteries
strap to improvise the lack of fitting

After building this, I don’t think it will work, because the result feels too direct to me; it doesn’t feel impactful or convincing enough. As mentioned early in the semester, I’m doing this just for experimentation. Wearing it for a while didn’t make me feel confident about it. It’s wobbly, and its haptic feedback is weak. Even if it may work, it’s still a “copy” of Auger-Loizeau’s Subliminal Watch. This definitely needs more work to develop further, but I’m already thinking of a better idea.

Instead of continuing to develop Anchorwatch, I want to apply the same concept of internalising Chronoception through the “Sound Watch” as mentioned in my previous post. This is, to me, a clearer and more multi-dimensional device than the Anchorwatch.

I think this has more potential. Why:

  • I’m using bone conduction, which directly links the device to our body, giving much more obvious feedback.
  • The device feeds more information (hours, minutes, seconds), and potentially has the same effect as Anchorwatch, where the user can learn intervals, except that the user gets more out of it, with more information.

For now this is enough to convince me that this will be a better idea. I will use what prof mentioned about finding relevant research to back this up.

Some things I have to think about during my experimentation:

  • What do hours, minutes, and seconds sound like? How do I make them sound natural to us, with almost zero disruption?
    • Seconds should sound short, clicky. 60 different tones for 60 seconds.
    • Minutes should be more balanced, a subtle tone in the middle range. 60 tones that gradually change as the minutes pass.
    • Hours should sound long, humming, and always in the background.
  • What tech should I use? How do I build it? I need a proper blueprint and I need to look for the right components.

I should also consider the appearance. For quick reference, here’s the look of the device I’m going for:

Image taken from https://www.claytonbartonartist.com/?_escaped_fragment_=sci-fi-soldier-helmets.jpg%2Fzoom%2Fcx2j%2Fimage1qky
Image taken from https://www.designboom.com/technology/hiroto-ikeuchi-cyberpunk-wearable-technology-04-02-2018/?utm_source=designboom+daily&utm_medium=email&utm_campaign=surreal,+yet+functional
Image taken from https://www.deviantart.com/edsfox/art/White-fox-sketch-560197326
Image taken from https://twitter.com/archillect/status/695210207601893377/photo/1
Image taken from https://www.artstation.com/artwork/qNBgL

Other things…

I don’t have anything else that is super concrete yet. I’ll spend more time this week to work on it. :’)

3-Week Plan Ahead

  • Week 11
    • First thing I’ll do this coming weekend is to reorganise all the accumulated research and consultations to make sense of everything. So when Monday comes I can immediately start refreshed. I’ll also update my FYP page (timeline, keywords, concepts, etc) for my own good.
    • Start getting hands dirty again with Sound Watch prototype, including trying to get the appearance to look like a proper product. I’ll be using buzzers first instead of bone conduction to get the feel of the sound first.
    • Research the science that applies to Sound Watch and use it when describing my project next week.
    • Consult prof when I’m done (probably Friday)
  • Week 12
    • Start working on Weather Sensograph (or Third-I or Money Sense)
    • Going through the same process: do the appropriate research and not get overwhelmed by information
    • Start working on presentation slides for interim presentation
    • Consult prof when I’m done.
  • Week 13 (Friday 13th presentation)
    • Start working on 3rd prototype (doesn’t have to finish)
    • Finish all slides
    • Rethink about concept and progress to ensure things are sound and for clarity

008 – FYP Progress

What’s done

  • First prototype of Anchorwatch (renamed cos vibrawatch sounds weird) (very lo-fi)
  • 3D model of Anchorwatch (to be 3D printed)
  • 3D model of eye-to-eye periscope (to be 3D printed)
  • Very small amount of sketches to visualise Weather Sensograph, not enough concrete stuff to show everyone… :’)

Anchorwatch Prototype 1

Did a super ratchet prototype to test things out and quickly kickstart my work. I cut a paper plate into the shape of a watch and used masking tape to put the circuit together.

Circuit consists of:

  • 1x Arduino Nano
  • 2x 3.7v 1500mAh lipo batteries connected in series
  • 1x switch
  • 1x vibration motor
  • 1x 220ohm resistor (to reduce vibration strength)

Code: just a simple if-else statement with a millis timer. Also, ignore the values I set for minutes and seconds; I was just testing, and stuck with 5 seconds for the variable “minutes” (just to prevent confusion (like every second actually being 2 seconds!)).
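The if-else/millis logic isn’t shown here, but the timing check is roughly the following, sketched in Python rather than Arduino C so it can stand alone (the 5000ms interval is the test value mentioned above):

```python
def should_buzz(now_ms, last_ms, interval_ms=5000):
    # millis()-style elapsed-time check; in Arduino C the unsigned
    # subtraction also keeps this safe across millis() rollover
    return now_ms - last_ms >= interval_ms

def simulate(duration_ms, interval_ms=5000, step_ms=10):
    """Count how many times the motor would pulse over a run."""
    last, buzzes = 0, 0
    for now in range(0, duration_ms, step_ms):
        if should_buzz(now, last, interval_ms):
            last = now          # reset the timer, like last = millis()
            buzzes += 1
    return buzzes
```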

This is definitely not gonna work, even though it technically works. Nobody wants to wear this, and it will definitely fall apart soon. I’ve made a 3D model and will start 3D printing tomorrow to give the watch a stronger structure, so I can start wearing it properly.

Not sure how it will fit my hand; will try it out asap. Also, I realised I forgot to include a part for the switch, so I guess I’ll find a way……

Anyway, to recap the idea of this watch: it’s a watch that vibrates every 5 seconds (as of now; I will adjust accordingly to see what works during user testing, aka on myself), anchoring the user to the interval subliminally and allowing them to eventually time the interval without any aid or tools like a watch.

Eye-to-eye Periscope 3D model

I’m gonna 3D print this as well, so there is a proper, strong object to hold. I think I’ll just make a proper mockup of this so I don’t lose the idea, even though it’s just a sensory experiment.

More ideas! More things to start…

Weather Sensograph

So I’m starting on a weather-sensing device, which I’ll call Weather Sensograph for now. The idea is to help us humans have an innate weather-detecting sense, like how the Anchorwatch enhances our chronoception. This is done through a wifi-connected device that uses weather API data (or any other reliable source of data; pls tell me if you know one, cos I’ve only ever used APIs). The device takes information from the API and converts it into kinaesthetic feedback (the feedback I think will be appropriate… for now). This new sense will help us evaluate the weather better, on top of just relying on our senses of sight, smell, and hearing (which aren’t applicable when we are indoors, which we are most of the time).
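As a very rough sketch of the API-to-kinaesthetic conversion: everything below is hypothetical. The two readings mimic typical weather-API fields, and folding them into a single 0–255 vibration level (PWM-style) is just one arbitrary choice of mapping:

```python
def weather_to_feedback(rain_prob, wind_kph, max_wind_kph=60.0):
    """Fold two hypothetical weather readings into a 0-255 vibration
    intensity, with rain weighted more heavily than wind."""
    rain = min(max(rain_prob, 0.0), 1.0)              # clamp to 0..1
    wind = min(max(wind_kph / max_wind_kph, 0.0), 1.0)
    level = 0.7 * rain + 0.3 * wind                   # arbitrary weighting
    return int(round(level * 255))

weather_to_feedback(0.5, 30.0)   # e.g. 50% rain chance in a 30kph wind
```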

I explained the idea to Mark, and he made a good point: the device will help more if it can also detect information about the weather that we wouldn’t know without it, for example, where the rain is and when it will rain. I think giving access to this information would be a more interesting approach for this device, but I don’t have much idea of how I can do this as of now, as I’d need to know the direction of the approaching weather relative to the user’s standpoint, and where the user is facing, for this to work. So far, the APIs (IF I’M NOT WRONG) do not provide that info.

I think one way to do this is to have some kind of seismograph-type feedback for weather, to help us sense the weather’s intensity and direction. Mark suggests that I look up the hair-tension hygrometer. (THANK U MARK U R GR8 HALP)

I think it’s this thing? Image taken from https://manual.museum.wa.gov.au/book/export/html/89

Sound Watch

The idea of this is to change the way we tell time, with sound rather than visuals on a clock face. Recently I saw Aisen Caro Chacin’s Play-A-Grill and thought about bone conduction.

Play-A-Grill drawing. Image taken from http://www.aisencaro.com/play-a-grill.html

I’m thinking of mapping the hours, minutes, and seconds to different frequencies or sound cues, layering them, and using bone conduction to let us hear the sound. This will help us:

  • intuitively tell time just with sound cues
  • have constant knowledge of time, as the sound is always playing when one wears the device
  • help the visually impaired

Imagine the applications once everyone learns this new way of telling time. Clocks around the world could also be tuned to the frequencies and played out loud so everyone can hear them (although this would be devastating to animals that rely on the sense of hearing, as well as to the power supply).

Money Sensor

One of the more applicable ideas I have now is this. Money has become a necessity, much like food. We can feel hunger and thirst, but not how much money we have. This could be worth exploring, but I’ll work on the top few ideas first, then come to this one once I’m done.


Looking back at my Interactive Devices project, I think I can try to simplify it and make it part of this series of devices.

A common theme?

The ideas I have so far are only connected by one factor: an “I think they will be useful” statement. I haven’t linked them to the big-picture concept, which is the cyberpunk future. I think that will always be at the back of my mind, but like I said before, I’ll just start making all these ideas a reality first and see what I can do afterwards. Actually, these prototypes don’t take long to create, so I can just keep making and think along the way.

What’s next?

  • Finish 3D printing Anchorwatch v2, finish second prototype and start wearing it; evaluate and adjust it along the way.
  • Work on Weather Sensograph, and then Sound Watch, then the money sensor
  • Continue ideating and read journal articles when I’m burning out from making. Also when I’m free or available to multitask, I’ll watch videos and sci-fi films.
  • Continue thinking about concept along the way and try to tie things together cos now everything feels very not unified
  • Start planning to interview people: I have this idea to interview or have conversations with people who wear implants or are interested in the topic of cyborgs, futuristic concepts, speculative design.

Recess Week Updates

Progress Thus Far

  • Done research & reading on Cyberpunk genre
  • Not done moodboard (rescheduling to another time to focus on, as it’s not important now after reviewing)
  • Sorted out and filtered journal papers that are relevant
  • Made eye-to-eye periscope, working on vibrawatch prototype
  • Read Sensory Pathways for the Plastic Mind by Aisen Chacin
    • Realised I’m doing something very similar to what she is doing, but with different intentions. She wanted to critique the idea of sensory substitution devices as assistive technology, and wants to normalise the use of these devices. For me, I want to go beyond normalising: I want to pretend that it is already the norm, and imagine what people do with it.

Next focus

  • Finish vibrawatch prototype
  • Sketching of ideas so I can present them and visualise them
  • Play with creating said ideas and test them
  • Read when I have time
  • Focus on sensory experiments and exploration with current few ideas

I have changed my focus from research to hands-on work after talking to Bao and Zi Feng, who gave me a lot of good advice!


I went a bit deeper into researching the cyberpunk genre by reading Tokyo Cyberpunk: Posthumanism in Japanese Visual Culture, which, to be honest, was not very helpful, as it was a lot of text deeply analysing different films and animes that tackle different cyberpunk themes. What I found useful were the references to certain films and animes that I can search online for video clips or photos to refer to, in terms of the aesthetics.

After reading the book, I realised there is more to the genre than just its aesthetics (I mean, of course; I just didn’t really think about it). I went to watch some videos on the topic and came away with some insights:

  • The cyberpunk genre revolves around themes such as mega corporations, high tech low life, surveillance, the internet of things, identity in a posthuman world (man / cyborg / machine: where is the line drawn?), and humanity and the human condition in a tech-driven world.
  • The aesthetics are not the main point of the cyberpunk genre, as I previously thought. It is all about the themes; that’s why many films that are “cyberpunk” may not have the iconic aesthetics of neon holographic signs, mega cities, flying cars, etc. The cyberpunk genre can exist in the era we are living in now and does not necessarily have to be set in the future.
  • It might be more realistic to look at the current world and set the “look” accordingly, rather than referring to the 80’s vision of the future, aka what most cyberpunk depictions look like.

So why is this info useful to me?

I think it’s good that I actually tried to discover what cyberpunk really is, and at least got a brief idea of it. I can try referencing certain themes that apply to Singapore and see what works best. What will Singapore be like if it is set in a cyberpunk world? This affects not only the aesthetics of my installation, but also the theme I’ll be covering, which can narrow my scope a bit more. In the end, I hope the installation will be realistic and aesthetically convincing as well.

Also, I got my hands sort of dirty

I had the idea of a periscope that lets you see your own eye. This is more of a perception experiment than anything really important.

As a thought experiment, I was wondering what we would see if one eye sees the other, and vice versa. Turns out, nothing special. You see your eyes as though in a mirror, except the image is not flipped. I can’t think of an application, but it is interesting to play with. If you close one eye, you can see your closed eye with the other eye.

I showed this to Zi Feng and Bao, and Zi Feng mentioned that if the mirrors were the scale of the face, you could see your true reflection with this device, with only a line separating the middle. I think that’s an interesting application: a mirror that shows your actual appearance (how other people see you).

But yeah, it’s not anything super interesting. I just wanted to start working on something haha.

Anyway, I’m also working on the vibrawatch, but it’s still WIP.

What’s next?

I’m gonna start experimenting with making sensory prototypes. I’m also starting on a device that responds to weather immediately after I’m done with vibrawatch.