What I’ve been up to:
- Attended TTT 2020
- Compiling my projects
Summarising TTT 2020
I found an artwork called GLEI inc. Plasticization Clinic, a speculative pop-up store that resonated with my concept. Some keywords I took from it:
- Installation presenting information both factual and speculative
- speculative-fiction-based installation
- perform as a laboratory technician
- Participate in the fiction
I shared this with Fizah as well, since this project is closer to hers than mine, but my installation will step into this territory too.
A paper by Stavros Didakis, Augmentations of Perception++, was the one I was most keen to listen to. Unfortunately, the research is very similar to mine, so there wasn't much new to take away.
The talks on topics I'm interested in felt generic: research that already exists and is well known. There were also many other talks that don't link to what I'm intending to do but were interesting nonetheless. I took notes on some of them.
Here's the file, if anyone is interested:
What I’m doing…
Since the crit, I've been resting, so now I need to build up momentum again to work. To do so, I've started creating more detailed briefs for each of my devices so I can understand my project better while building it. So far, I've completed one brief (Chronophone), and I'm working on the overall concept brief and the money sensor one.
Internalising a sense of time, schedule, and rhythm so the future human can be more productive.
Extending chronoception to become better cyborgs, allowing us to sense the thing human beings care about most: time.
Chronophone is a wearable headpiece that translates time into sonic feedback, training the user to tell time using sound frequencies that vary through the day.
The model I'm making is a chronoception training device, available as a wearable so the user can learn it first before committing to an implant.
Time is an important part of human life, and we do not have an accurate sense of it except for our circadian rhythm and our judgement of the sun’s position during the day.
To help us tell time, we use tools (sundials, watches, clocks) or other cues like bells or scheduled sounds (e.g. the LRT outside my house, which starts running at 4 am every day). These external devices assist our perception but do not directly contribute to it. In this project, I plan to explore having time directly interfaced into our body through cybernetics, such that our body and brain become the tool for telling time, rather than needing to refer to an external device.
This is done in speculation of a future where cybernetically enhanced senses are normalised. In this future, we can use sensory substitution to create new stimuli, enabling us to quantify them and sense things we never could before. In this project, the sense of time is the main focus: how can we integrate chronoception into everyday life, going beyond watches or globally synced clocks?
The nature of the work is somewhat intrusive into our private lives, as the device will be constantly providing sonic feedback. One might question the need for such a device, and wonder whether future humans really need it at all.
How it works (Theoretically)
Neuroplasticity allows the brain to reorganise, change, and grow its neural networks. This helps us map new senses onto existing ones, enabling sensory substitution, where one stimulus (e.g. sound) can stand in for another sense (e.g. sight).
This of course requires training: the user is exposed to the constant new stimulus over long durations so the brain can rewire itself to it. The training has to specifically target the sense being substituted.
Theoretically, if we map the whole spectrum of hours, minutes, and seconds onto sound frequencies distinct from each other, we can train a user on these sounds and allow them to tell time this way.
How it works (Technically)
I use a real-time clock in TouchDesigner to extract the following data: hours, minutes, seconds, and days.
Then I created an Audio Oscillator CHOP and defined a specific frequency range for each value (except days, which I haven't used yet).
Using a series of Logic and Math CHOPs, I mapped the time values to the frequencies.
The output can be streamed or exported for future use.
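Outside of TouchDesigner, the same mapping can be sketched in plain Python. The frequency bands below are illustrative assumptions to show the idea of giving hours, minutes, and seconds non-overlapping ranges; they are not the actual values in my CHOP network:

```python
from datetime import datetime

# Illustrative, non-overlapping bands (assumptions, not my network's values):
#   hours   -> 200-660 Hz   (24 steps of 20 Hz)
#   minutes -> 700-1880 Hz  (60 steps of 20 Hz)
#   seconds -> 2000-3180 Hz (60 steps of 20 Hz)
def time_to_frequencies(now: datetime) -> dict:
    """Map a clock time to one oscillator frequency per time unit."""
    return {
        "hour": 200 + now.hour * 20,
        "minute": 700 + now.minute * 20,
        "second": 2000 + now.second * 20,
    }

freqs = time_to_frequencies(datetime(2020, 7, 1, 13, 45, 30))
print(freqs)  # {'hour': 460, 'minute': 1600, 'second': 2600}
```

Because each unit owns its own band, a trained listener could in principle pick out the hour, minute, and second tones separately even when they sound simultaneously.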
Materials
- TouchDesigner software
- Bone Conduction Headphones
- 3D Printed headset parts
- Head dummy
Steps to take
- Export 24h sound
- User testing — keep using the product
- Foam or clay mockup of model with model head dummy
- 3D model the headset to fit bone conducting model and head
- Print and test; iterate until it fits
- Plan interaction — buttons, functions, etc.
How do I make it more interactive?
If I still have to use a device, then it’s still going to be a tool. How do I make it such that it’s not a tool, but an actual part of the body?
- The answer is perhaps an implant, but I'm not making an implant, so I'll be making this more of a training device.
Integrate habits, goals, productivity, and routines?
I also did some smaller things:
So far for Chronophone, I've done a small test with a one-hour video of the minutes-and-seconds sound. I'm trying to export a better version as an MP3 instead of a video, so I can just listen to it. I'm also thinking of running it on an old phone, so I can still use my own phone while the video version plays (the video version is better, as I can see the time on it directly).
Next, I want to export a 24-hour version, and to borrow a head model from Galina so I can make a foam or modelling-clay mockup before moving on to 3D printing the headset.
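One way to render a long export like this outside TouchDesigner (a sketch of the idea, not what I've actually built) is to write the tone sequence straight to a WAV file with Python's standard library, reusing the illustrative seconds band from earlier. Here I render only 3 seconds to keep the run short; the same call with 86400 would give the full day:

```python
import math
import struct
import wave

RATE = 8000  # low sample rate keeps the sketch fast; a real export would use 44.1 kHz

def second_tone(sec: int) -> float:
    # Illustrative mapping (an assumption): 2000 Hz base + 20 Hz per second value
    return 2000 + sec * 20

def render(path: str, seconds: int) -> None:
    """Write one sine tone per clock second to a 16-bit mono WAV file."""
    with wave.open(path, "wb") as wf:
        wf.setnchannels(1)
        wf.setsampwidth(2)   # 16-bit samples
        wf.setframerate(RATE)
        frames = bytearray()
        for sec in range(seconds):
            freq = second_tone(sec % 60)
            for n in range(RATE):  # one second's worth of samples
                sample = 0.5 * math.sin(2 * math.pi * freq * n / RATE)
                frames += struct.pack("<h", int(sample * 32767))
        wf.writeframes(bytes(frames))

render("chronophone_test.wav", 3)  # a full day would be render(..., 86400)
```

A WAV this long gets large (a day at 44.1 kHz / 16-bit mono is roughly 7 GB), which is exactly why compressing to MP3 for the phone makes sense.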
I'm planning to go into full production mode next week, working on the headset model. Meanwhile, I'll continue writing the briefs, and maybe do some reading since I have so many backlogged tasks (not super important, as I think I have enough research tbh).