NEW YEAR STAGNATION

What’s going on for the past ??? month(s)?

3D model and 3D model testing

I’ve done a few iterations of 3D modelling the headset. It was a tedious process to begin with, as I don’t really have a sense of how the form should be built up through 3D modelling, even with all the physical models I’ve made as reference.

cold start with 3d modelling

I decided to download a trial of Rhino 7 and tried SubD, which is basically T-splines. I thought it would be good for modelling this since it’s more like sculpting.

soft edges and difficulty working with certain areas (i.e. the ear part)

Eventually I realised it’s easier to just model using the normal CAD method. Before that, I tried repurposing real products like earmuffs and visors from Daiso.

Using the band of a visor to hold the back of the headset
Took apart an earmuff to figure out what’s inside
Slipped the bone conduction headset inside the earmuffs, worked really well.

I realised that I can make everything really thin and can even try fabric instead of rigid materials. More on that in the next section.

This led me to realise I can use the shape of a hairband to create the form of my device.

Using a hairband as a gauge for the bend, I was able to successfully model a “hairband” form
using paper and my actual hair band to prototype
Printed results
model wireframe

I then upped the complexity and tried adding all the other parts of the original design. I only roughed out the form since this is still just a test print. I also created a low-poly version to 3D print, as the file size got too big for 3D printing.

Results:

I tried different materials as well. The black one is TPU, a flexible plastic; the white one is PLA, which is stiffer and more plasticky.

Conclusion: I think the prints are ok. I can go ahead to refine the 3D model. But before doing so, I have some changes to my approach that I will talk more about in the next section.

Change of approach

So I’m happy that I’ve begun making stuff. But I’m scrapping a lot of the previous headset design I’d thought about. This happened after I saw the earmuffs and realised it’s much easier to just put the headset inside a fabric sleeve with a wire inside, a.k.a. simplifying everything.

The old design will still be worked on. I’ll create separate parts and add magnets to let the different parts stick together. It will be on display instead and will only be interactive as a “display” model.

I don’t think I’ve mentioned this before, but I remembered Sputniko!’s Menstruation Machine and thought my devices could just look like that too. (I think I was also influenced by this work previously, actually.)

Anyway so next up on the to-do list specifically for Chronophone:

  1. Experiment with fabric version of Chronophone using earmuff design (without covering the ears)
  2. Refine 3D model of original design, to be displayed and marketed as “cyborg model” for those who have cybernetics already installed on their head
  3. Refine sounds and add variations so people can change and choose what they like

Concept recap

I think I’m finally happy with the whole Sensorial Futures project. Here’s my train of thought for the whole project (sorry I keep doing this cos I need to constantly remind myself):

Sensorial Futures is a speculative cybernetics augmentation pop-up store, commercialised for the mass public to upgrade their senses on the fly. The shop has a shiny neo-future interior with displays that let you try out the different devices, laid out like Apple devices in their stores.

In this particular pop-up, the sensory devices are catered to everyday people and the working population, dealing with things like time (productivity), money, and personal space (transport/mobile device use). These devices range from wearables, to implants, to cybernetics. Borrowing from cyberpunk themes, I want to make it such that the wearables are essential for certain jobs, and this pop-up store offers sensory upgrades to the most generic 9-5 jobbers. (Thanks prof. LP for giving me that idea way back 2 months ago.)

3 key devices are on display:

  1. Time-telling sound device that translates time to sound (and maybe colour). Available as a wearable (main interactive device), an implant (poster implying its existence, asking people to register for it), and a cybernetic (on display, magnetic, for cyborgs). How it works: real-time seconds, minutes, and hours are translated into sounds and frequencies, then relayed directly to the eardrums through bone conduction.
  2. Money-sensing, which is split into 2 different devices.
    1. A hunger/satiation device that makes you literally hungry for money, feeding into your instinctive desire to keep earning. This device will be worn on the belly (and maybe head), warming and constricting the belly while creating salivating responses. The wearable is attached to a computer-like device housing a game that simulates earning money. The wearable is only for the experience; the actual thing will be a supposed implant that releases synthetic hormones which make you feel hungry, and metabolises them when money is earned (this will be explained on a poster or some interactive screen).
    2. An intoxication/sobriety device that makes you more intoxicated the more money you spend. This will be worn on the head, controlling your vestibular sense in the ear while also inducing mild constriction of the head. There will be a gambling game as well, simulating losing money.
  3. Personal-space protecting sense that senses the surroundings of the person. This will be a wearable on the neck (maybe?), with sensors all around that allow the person to sense the 360 degrees around them. This won’t work IRL of course, looking at the technology we have, so I’m going to “cheat” using a Kinect to detect presence, which detects depth as well, which can also be relayed. The device translates proximity information into heat and touch, allowing one to feel the proximity of others or objects around them. The device will be wearable, and if I can do it, I’ll make a cybernetic version too that’s on display like the Chronophone. This device is for people who use their personal devices all the time (looking into the future, I won’t be surprised if we wear Google Glass-like devices as our personal mobile devices, which will be a major distraction and as such require this kind of sense).

The takeaway from the installation will be: do we really need these? Will we ever need these? In this project, I will explore a few themes, some vaguer than others, and I hope people will catch them and think about them. Meanwhile, other than trying to get hired, I want people to have fun with my installation while thinking about the future of senses. Or not even just the future of senses, but also the current ways they sense their environments and use their senses.

Timeline Changes

Yes. Too much time wasted, so here comes another update to the timeline, streamlined into more specific stuff since there’s only FOUR + MONTHS LEFT…………..

I’ll be working specifically on each device in 3-week intervals.

Jan week 1: wrap up chronophone prototype

Jan week 2: start on money sense

Jan week 3: finish prototype for money sense

Jan week 4: refine money sense

Feb week 1: start on proprio sense

Feb week 2: finish prototype for propriosense

Feb week 3: refine propriosense

Feb week 4: super refine chronophone

Mar week 1: super refine money sense

Mar week 2: super refine propriosense

Mar week 3: finalise all products

Mar week 4: gather installation materials, Work on fake products (misc stuff on display)

Apr week 1: start installation design

Apr week 2: mid installation logistics work

Apr week 3: finish installation

Apr week 4: add ons to installations and products

May week 1: collaterals and marketing

May week 2: presentation

 

So next week I’ll start with money sense, while putting chronophone on hold till end Feb.

Hope I’ll stick to it kthxbye

Weekly-ish Updates

Things done:

  • Finished reading Speculative Everything, should have done it earlier lol
    • Better understanding of what a speculative work should be like, and how it is approached
    • Some parts I don’t agree with, but I will still respect the ideas.
    • I will be applying some of the new knowledge into my thought process
  • Sketches of the form of Chronophone

  • Tried wire prototyping with the forms
    • Able to get the form in 3D well, I’ll continue this prototyping method for future model testing

  • Tried more user testing with the sound
    • Worn it for about 2 hours per day
    • Able to vaguely remember 3 types of frequencies: 0~10 mins, 30~40 mins, 50~60 mins
    • I think it works well enough (backed by research too) to say the approach works. Mark says I don’t really have to do all the tests since they’re not crucial when I show it to others, which I think is quite true.
  • Exported 24 hour sound, proving that it’s not too taxing on my computer to export more of such files.

Next steps:

Priority:

  • 3D model (having some difficulties here 🙁 )
  • Heeding valuable advice from prof and Mark and modifying my project so it’s better

Not so important for now:

  • think further on the sound — doesn’t have to be the tones, it can be more imaginative and customisable
  • I want to still think about the form as I don’t think this fits very well
  • Now I kinda don’t like that it’s referencing cyberpunk aesthetics and I want to try to form my own aesthetics (after reading Speculative Everything)

Focus now will still be the 3D model and getting the form out so at least I have something to work with, then I’ll work on the other parts. Doesn’t matter if I don’t use it in the end, I think.

Updates After Crit

Presentation link:

https://www.dropbox.com/s/1qlei0em471b3l0/FYP_BryanLeow_2011.pdf?dl=0

What I’ve been up to:

  • Attended TTT 2020
  • Compiling my projects

Summarising TTT 2020

I found an artwork called GLEI inc. Plasticization Clinic, a speculative pop up store which resonated with my concept. There are some keywords I can get from it:

  • Installation presenting information both factual and speculative
  • speculative-fiction-based installation
  • perform as a laboratory technician
  • Participate in the fiction

I shared this with Fizah as well since this project is closer to hers than mine, but my installation will step into this territory as well.

Image taken from https://www.laurenruizart.com/plasticization-clinic.html
Image taken from https://www.laurenruizart.com/plasticization-clinic.html

A paper by Stavros Didakis, Augmentations of Perception++, was the one I was most keen to listen to. Unfortunately, the research is very similar to mine, so there isn’t much to add.

I think the talks on topics I’m interested in were generic: research that already exists and is known. But there were also many other interesting talks that don’t link to what I’m intending to do, but are interesting nonetheless. I took notes on some of them.

File, if anyone is interested to see:

https://www.dropbox.com/s/n7s60br5gqk3qyp/TTT2020.pages?dl=0

What I’m doing…

Since the crit, I’ve been resting, so now I need to build up momentum again to work. To do so, I’ve started to create more detailed briefs for each of my devices so I’m able to understand my project better when I’m building it. So far, I’ve completed 1 brief (Chronophone), and I’m working on the overall concept one and the money sensor one.

Chronophone

Purpose

Internalising a sense of time, schedule, and rhythm so the future human can have increased productivity.

Extending chronoception to become better cyborgs, allowing us to sense something that all human beings care about most — time.

Short Description

Chronophone is a wearable headpiece that translates time into sonic feedback, training the user to tell time using sound frequencies that vary through the day.

The model I’m making is a chronoception training device, available as a wearable for users to learn with before committing to an implant.

Concept

Time is an important part of human life, and we do not have an accurate sense of it except for our circadian rhythm and our judgement of the sun’s position during the day.

To help us tell time, we use tools — sundials, watches, clocks — or other cues like bells or certain scheduled sounds (e.g. the LRT outside my house, which starts running at 4am every day). These external devices assist our perception, but do not directly contribute to it. In this project, I plan to explore having time directly interfaced with our body through cybernetics, such that our body and brain become the tool that tells time, rather than needing to refer to an external device.

This is done in speculation of a future where cybernetically enhanced senses are normalised. In this future, we are able to use sensory substitution to create new stimuli, enabling us to quantify stimuli and sense things we never could before. In this project, the sense of time is the main focus: we look at how we can integrate chronoception into the everyday parts of our lives, going beyond watches or globally synced clocks.

The nature of the work is somewhat intrusive on our private lives, as the device will be constantly providing us with sonic feedback. One will question the need for such a device, and wonder whether future humans will really need it.

How it works (Theoretically)

Neuroplasticity allows our brain to reorganise, change, and grow its neural networks. This helps us map new senses onto existing ones, enabling sensory substitution, where one stimulus (e.g. sound) can stand in for another sense (e.g. sight).

This of course requires training, where the user is exposed to long durations of the constant new stimulus in order for their brain to rewire itself to it. The training has to be specifically targeted at the sense that is to be replaced.

Theoretically, if we map the whole spectrum of hours, minutes, and seconds onto sound frequencies that are distinct from each other, we can train a user on those sounds and allow them to tell time this way.

How it works (Technically)

I use a real-time clock in TouchDesigner to extract the following data: hours, minutes, seconds, days.

Then I created an Audio Oscillator CHOP and assigned a specific frequency range to each of these (except days, which I have not used yet).

Using a series of Logic and Math CHOPs, I mapped the time values to the frequencies.

The resulting audio can be streamed live or exported for later use.
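For reference, here’s the same mapping sketched in plain Python. The real thing is the CHOP network described above, so this is only an illustration, and the frequency ranges are the ones I’ve been testing, still subject to change.

```python
# Illustration only: the actual mapping lives in a TouchDesigner CHOP network.
# Frequency ranges are placeholders borrowed from my current tests.
from datetime import datetime

def time_to_frequencies(now: datetime) -> dict:
    minutes_hz = 100 + (now.minute / 59) * (300 - 100)   # minutes: ~100-300 Hz sine
    hours_hz = 20 + (now.hour / 23) * (100 - 20)          # hours: low 20-100 Hz drone
                                                          # (simplified; the real patch
                                                          # shapes this around sleep/wake)
    return {
        "seconds": "1 click per second",                  # pulse, once per second
        "minutes_hz": round(minutes_hz, 1),
        "hours_hz": round(hours_hz, 1),
    }

print(time_to_frequencies(datetime.now()))
```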

Logistics required

  • TouchDesigner software
  • Bone Conduction Headphones
  • 3D Printed headset parts
  • Head dummy

Steps to take

  • Export 24h sound
  • User testing — keep using the product
  • Foam or clay mockup of model with model head dummy
  • 3D model the headset to fit the bone conduction headphones and the head
  • Print and test, iterate until it fits
  • Plan interaction — buttons, functions, etc.

Questions

  • How do I make it more interactive?

  • If I still have to use a device, then it’s still going to be a tool. How do I make it such that it’s not a tool, but an actual part of the body?

    • The answer is perhaps implant — but I’m not making an implant, so I’ll be making it more of a training device.
  • Integrate habits, goals, productivity, and routines?

Also I did some small stuff:

So far for Chronophone, I’ve tested a little bit with a 1 hour long video of the sound of minutes and seconds. I’m trying to export a better version where it’s just an MP3 instead of a video so I can just listen to it. I’m also thinking of using an old phone to run it so I can still use my phone if I’m playing the video version (which is better as I can see the time on it directly)

Plans

Next thing I want to do is to export a 24 hour version, and to borrow a head model from Galina so I can make a mockup using foam or modelling clay, before going to 3D print the headset.

I’m planning to go full production mode next week, working on the headset model. Meanwhile, I’ll continue to write the briefs, and maybe do some reading since I have so many backlogged tasks (not super important as I think I have enough research tbh)

011 / 012 – FYP Progress

Chronophone – Speculative Chronoceptive Headset

The device provides its user with time information through a constant stream of sonic feedback, using the sense of hearing as a substitute for a sense of time. The transmitted information gets processed in the hearing part of the brain, and through prolonged exposure, one will be able to make sense of the time/sonic information, similar to the feeling of still being in the water after a long day of swimming. The device is worn on the head, with the sound transmitted through bone conduction to instil a better chronoception. This helps us have an internalised sense of time, instead of having to rely on watches, clocks, and alarms.

The concept behind this device is my take on a new line of future wearable products that train us to tell time instinctively. The device is also an ultimate timing device that can train our rhythm, timing of intervals, etc.

Why time?

Time is a very big part of our lives, controlling our day and our activities. We wear wristwatches and install clocks in our homes and offices so we can know what time it is. I ask: “How can sensory substitution take this further and improve the way we tell time?”

Circuitry for my device, made using Fritzing

Properties to play around:

  • intervals
  • length of tones
  • number of reps
  • pitch tone
  • clicks
  • vibration?

Various combinations of feedback I think will be good:

  • Constant interval; constant tone length; varying pitch tone; varying reps
  • Constant interval; varying tone length; varying pitch tone; varying reps
  • Constant interval; varying tone length; constant pitch tone; varying reps (like morse code)

Also, I think seconds should be the one varying in reps & tone length (and maybe pitch?), minutes should vary in pitch and have intervals, and hours should vary in pitch but be more of a constant background hum.

Experiments:

Constant intervals between seconds, minutes, hours, days (S, M, H, D); constant tone length; varying tone

  • very distracting, and a slow way to convey the information
  • information gets very washed out after prolonged exposure: the minute and hour tones aren’t very obvious

… … …

Week 11/12 (which week is it now??)

During the consult with prof last week, I proposed experimenting with TouchDesigner instead of a buzzer. Doing so allows me to stack the tones, which makes it easier to get a feel of what using the device will be like.

Just to recap, I think the ideal sounds will be:

  • Droning low hum for hours — signifies the length of an hour
  • Mid-low tone for minutes — something neutral and not too distracting
  • Clicks for seconds — Much like clock ticks which we are used to, and is not super disruptive.

With this, I set off to use a clock and Audio Oscillator CHOP to create the tones. (BTW, I’ll use an audio spectrum to determine the specific frequencies if I can edit the network before showing this to y’all, so if you see this message it means I haven’t done that.) (For now I think the unit is in beats per second? I think?)

  • For seconds, I used a pulse at a frequency of 1 bps, resulting in a single click every second. This is not ideal as I can’t customise much in terms of the properties of the “click” sound, but for now we will use this.

  • For minutes, I used Sine waves (was thinking of Gaussian also) with a frequency of 100 ~ 300, mapped across 0 to 59 minutes (within 1 hour). The effect is a nice tinnitus-like sound which I think isn’t too bad since it sounds quite natural. I try to keep the tone as clean as possible so it doesn’t get too distracting, so Sine and Gaussian work best (but I chose Sine since it’s cleaner).
  • I think the only issue I have with this is that every time it transitions from 59 min to 0 min, the sound changes drastically. I think this can be good and bad — good because people can immediately tell that it’s a new hour, bad because it’s too drastic.
  • Maybe I can use a Lag CHOP (or whichever function it was) to transition the sound smoothly.

  • For hours, it’s more complicated. I was trying to map the sounds from 0 to 11 hours, then from 13 to 23 hours. But that would mean the frequency of the droning sound increases from 12am till 12pm, then decreases after 1pm, which seems unintuitive.
  • So I used a bunch of Math and Logic CHOPs to work around it and set a more suitable timing: the frequency increases from an almost inaudible drone at 4am up till 3pm. This makes it better cos I think the droning should be most minimal when sleeping, and most obvious when awake. (See the sketch after this list for the mapping logic.)
  • From 4pm till 3am, the droning frequency decreases, just as day goes into night.
  • The droning sound is a Triangle wave with its frequency set between 20 and 100, which gives a nice machine-like whirring sound that, to me, feels good to hear as it’s similar to machines whirring in the background. It’s kinda ambient to me so I feel that it’s appropriate.
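Here’s a plain-Python sketch of that hour-drone mapping, plus the kind of smoothing a Lag CHOP would do. It only illustrates the logic, it’s not the actual patch; the numbers are the ones mentioned above.

```python
# Sketch of the hour-drone logic: quietest at 4am, peaking around 3pm-4pm,
# then falling again. The real version is Math/Logic CHOPs in TouchDesigner.
import datetime

def hour_drone_freq(hour: int) -> float:
    hours_since_4am = (hour - 4) % 24      # shift the day so 4am = 0
    if hours_since_4am <= 11:              # 4am -> 3pm: rising
        t = hours_since_4am / 11
    else:                                  # 4pm -> 3am: falling
        t = (23 - hours_since_4am) / 12
    return 20 + t * (100 - 20)             # 20 Hz (asleep) up to 100 Hz (awake)

def lagged(current: float, target: float, amount: float = 0.05) -> float:
    """Rough stand-in for a Lag CHOP: ease towards the target each frame,
    e.g. to soften the 59 min -> 0 min jump in the minute tone."""
    return current + (target - current) * amount

now = datetime.datetime.now()
print(now.hour, "->", round(hour_drone_freq(now.hour), 1), "Hz")
```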
How the bone conduction headphone is worn
Shah’s bone conduction headphones
Buttons on the underside

Thoughts on this

I think what I can do next is to get an earpiece and wear it for a while, actively learning the frequencies and associating them with the actual time. I want to see what happens.

Any feedback on the sound? I want to finalise it a bit before testing it. Anyway, it will take some time for the headphones to arrive, so I can refine the sound in the meantime.

Also, to work further, I can add more functions (I will talk more on that later)

Application

Problem

So I think one big question is: what if we are in a place with loud noises? Will this device be useless? I borrowed Shah’s bone conduction headphones for a while and found that they’re not super effective either. I think this will be something I need to experiment with and work on.

Thoughts

Maybe I should market my device as an implant rather than a wearable, which would make more sense. But I think within the speculative setting, my device fits in.

Also, I can safely say that for this device I don’t need to use an Arduino at all anymore. If TouchDesigner or any desktop program works, why not just use it? Since it’s speculative, I don’t need to explain how it works outside of the Bluetooth pairing. It could be a manufactured internal timing chip linked to GPS timing, with its internal smart system blah blah blah future technology.

The most I see myself working on is creating buttons that can be used for timing and other functions on the device: connected to a WiFi module and batteries, talking to Adafruit IO, which then connects to the main program (TouchDesigner?), with the headphones Bluetooth-paired to the computer. If this works within the exhibition, I’m happy.
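As a rough sketch of the desktop end of that chain, something like this could sit next to TouchDesigner and listen for button presses relayed through Adafruit IO over MQTT. It assumes the paho-mqtt library, and the feed name “chronophone-buttons” is made up for illustration.

```python
# Minimal sketch: desktop side listening for button presses relayed through Adafruit IO.
# The feed name and the meaning of the payload are placeholders for illustration.
import paho.mqtt.client as mqtt

AIO_USER = "YOUR_ADAFRUIT_IO_USERNAME"
AIO_KEY  = "YOUR_ADAFRUIT_IO_KEY"
FEED     = f"{AIO_USER}/feeds/chronophone-buttons"   # hypothetical feed

def on_connect(client, userdata, flags, rc):
    client.subscribe(FEED)

def on_message(client, userdata, msg):
    # e.g. forward the button value to TouchDesigner (OSC/UDP) or toggle a timer mode
    print("button event:", msg.payload.decode())

client = mqtt.Client()                 # paho-mqtt 1.x style; 2.x wants a CallbackAPIVersion argument
client.username_pw_set(AIO_USER, AIO_KEY)
client.on_connect = on_connect
client.on_message = on_message
client.connect("io.adafruit.com", 1883)
client.loop_forever()
```

The microcontroller side would just publish a value to the same feed whenever a button is pressed.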

Building

I’ll be ordering some cheapo bone conduction headphones I found online and testing them out. If they work, I will reskin them with a sick cyberpunk casing instead of building a whole new thing. Since they are Bluetooth, I can pair them with a laptop and run everything within the exhibition, and users can wear them and walk around the room.

Sounds simple… I hope…

Further Thoughts

Anyway, I’m considering developing this device as much as I can, as I see the potential. From here, I can add more functions like timers or different sound modes, and I can add more sounds; I’ve already found interesting ones within the Audio Oscillator CHOP alone, and I think there are more options to play with.

However, I don’t want to overpromise either, so this is still just a thought.

Some other research

For another module, I had to write a short paper on anything related to design, so I wrote about my research topic and started from the root of my idea, inspired by Neil Harbisson, Moon Ribas, and Cyborg Arts / the Cyborg Foundation. After revisiting the research, I got clearer about my device, as their motto resonated with me and my concept.

I also discovered that Neil had already created something like what I’m working on now: a headpiece that tells time.

Image taken from https://thoughtworksarts.io/projects/time-sense/

https://thoughtworksarts.io/projects/time-sense/

The only difference is the interface: his uses heat around the head, which can be a little crude since heat isn’t super precise in my opinion. Mine uses pitch, which is more complex since I have to map appropriate pitches to the hours, minutes, and seconds. But seeing this makes me feel more validated: since someone has tried doing this before, it could probably work. And looking at how few wires I’ll probably use… (actually none?!) makes me happier 😀

Moon Ribas also did something called Kaleidoscope Vision, where she wore goggles that allowed her to see only colour blobs. She also did a thing called SpeedBorg, where she created an earring that detects surrounding speed and returns vibration feedback. She was then able to learn how fast people around her walk. Her learning to use the goggles and reframe her perspective, plus her being able to learn the speed of people around her and using it as art, gave me hope that my device can work if I test it by wearing it myself.

Image taken from https://www.lupiga.com/vijesti/katalonci-cudnim-zvukovima-odredili-boju-zagreba
Image taken from http://www.weltenschummler.com/tech-science/rp13-how-to-become-a-cyborg/

In short, I’m trying to create a device that turns people into cyborgs.

Just a small update to strengthen my concept.

Plans for next 2 weeks

  1. Order the parts I need
  2. (refine?) the sound
  3. Once done up, start wearing the device while syncing it with my laptop and actively learning the times
  4. Update Notion page (man I always get anxious when I don’t plan but it’s the least productive thing I ever do and I don’t really follow my plans anyway…)
  5. Start working on next product: Weather Sensograph… or the money sensor
  6. Presentation slides for second interim
    1. Refresher short intro on concept
    2. Progress documentation(?)
    3. Moodboard and sketches(?)
    4. Current prototype
    5. Timeline ahead

Small mental note

Anyway I’m happy with my current understanding of my project. Although still a bit dissatisfied that my points are still all over the place, and it’s hard for me to pinpoint what exactly is the crux of my concept and what backs it up, I’m no longer struggling with the thinking part of the project… At least for now. I hope I’ll be able to make sense of everything when I write my report 🙁

009 / 010 – FYP Progress

What I have for now:

Ok, the past 2 weeks I was busy with other things, so I really apologise for the lack of progress 🙁 I’m finding it hard to cope with work-study and a side hustle, and I know it’s affecting my FYP. I’m making an effort to reduce these external commitments and focus on FYP.

I have not followed through with making more prototypes, sketching, thinking about ideas, or even researching.

Anchorwatch

My only real progress was making 1 Anchorwatch prototype, which I showed on Wednesday. This was done last week.

Thanks Shah for allowing me to use his filament and 3D printer!

oops the part broke
the case for the Arduino Nano, which doubles as a cap for the part that houses the batteries
strap to improvise the lack of fitting

After building this, I think it won’t work because the result feels too direct to me: it doesn’t feel impactful, doesn’t feel convincing enough. As mentioned early in the semester, I’m doing this just for experimentation. Wearing this for a while didn’t make me feel confident about it. It’s wobbly and its haptic feedback is weak. Even if it may work, it’s still a “copy” of Auger-Loizeau’s Subliminal Watch. This definitely needs more work to develop further, but I’m thinking of a better idea already.

Instead of continuing to develop Anchorwatch, I want to apply the same concept of internalising Chronoception through the “Sound Watch” as mentioned in my previous post. This is, to me, a clearer and more multi-dimensional device than the Anchorwatch.

I think this has more potential. Why:

  • I’m using bone conduction, which directly links the device to our body, giving a much more obvious feedback.
  • The device feeds more information (hour, minutes, seconds), and potentially has the same effect as Anchorwatch where the user can learn intervals, except the user gets more out of this with more information.

For now this is enough to convince me that this will be a better idea. I will use what prof mentioned about finding relevant research to back this up.

Some things I have to think about during my experimentation:

  • What do hours, minutes, and seconds sound like? How do I make the sound feel natural to us, with almost zero disruption? (See the sketch after this list for one way the tones could be generated.)
    • Seconds should sound short and clicky. 60 different tones for 60 seconds.
    • Minutes should be more balanced, a subtle tone in the middle range. 60 tones that gradually change as the minutes pass.
    • Hours should sound long, humming, and always in the background.
  • What tech should I use? How do I build it? I need a proper blueprint and I need to look for the right components.
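To make “60 tones” a bit more concrete, here’s one possible way to pre-compute 60 gradually changing pitches. The base pitch and quarter-tone step are arbitrary placeholder choices, not final values.

```python
# One possible way to pre-compute "60 tones for 60 seconds/minutes":
# quarter-tone steps above an arbitrary base, so neighbouring values are
# audibly different but the whole set still sounds related.
BASE_HZ = 220.0          # arbitrary starting pitch
STEP = 2 ** (1 / 24)     # a quarter tone per step (placeholder choice)

tones = [round(BASE_HZ * STEP ** n, 1) for n in range(60)]
print(tones[0], tones[29], tones[59])   # first, middle, and last tone
```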

I should also consider the appearance. For quick reference, here’s the look of the device I’m going for:

Image taken from https://www.claytonbartonartist.com/?_escaped_fragment_=sci-fi-soldier-helmets.jpg%2Fzoom%2Fcx2j%2Fimage1qky
Image taken from https://www.designboom.com/technology/hiroto-ikeuchi-cyberpunk-wearable-technology-04-02-2018/?utm_source=designboom+daily&utm_medium=email&utm_campaign=surreal,+yet+functional
Image taken from https://www.deviantart.com/edsfox/art/White-fox-sketch-560197326
Image taken from https://twitter.com/archillect/status/695210207601893377/photo/1
Image taken from https://www.artstation.com/artwork/qNBgL

Other things…

I don’t have anything else that is super concrete yet. I’ll spend more time this week to work on it. :’)

3-Week Plan Ahead

  • Week 11
    • First thing I’ll do this coming weekend is to reorganise all the accumulated research and consultations to make sense of everything. So when Monday comes I can immediately start refreshed. I’ll also update my FYP page (timeline, keywords, concepts, etc) for my own good.
    • Start getting hands dirty again with Sound Watch prototype, including trying to get the appearance to look like a proper product. I’ll be using buzzers first instead of bone conduction to get the feel of the sound first.
    • Research the science that applies to Sound Watch and use it when describing my project next week.
    • Consult prof when I’m done (probably Friday)
  • Week 12
    • Start working on Weather Sensograph (or Third-I or Money Sense)
    • Going through the same process: do the appropriate research and not get overwhelmed by information
    • Start working on presentation slides for interim presentation
    • Consult prof when I’m done.
  • Week 13 (Friday 13th presentation)
    • Start working on 3rd prototype (doesn’t have to finish)
    • Finish all slides
    • Rethink about concept and progress to ensure things are sound and for clarity

008 – FYP Progress

What’s done

  • First prototype of Anchorwatch (renamed cos Vibrawatch sounds weird) (very lo-fi)
  • 3D model of Anchorwatch (to be 3D printed)
  • 3D model of eye-to-eye periscope (to be 3D printed)
  • Very small amount of sketches to visualise Weather Sensograph, not enough concrete stuff to show everyone… :’)

Anchorwatch Prototype 1

Did a super ratchet prototype to test things out and quickly kickstart my work. I cut a paper plate into the shape of a watch, and used masking tape to hold the circuit together.

Circuit consists of:

  • 1x Arduino Nano
  • 2x 3.7v 1500mAh lipo batteries connected in series
  • 1x switch
  • 1x vibration motor
  • 1x 220ohm resistor (to reduce vibration strength)

Code: just a simple if-else statement with a millis() timer. Also, ignore the values I set for minutes and seconds; I was just testing and stuck with just 5 seconds for the “minutes” variable (just to prevent confusion (like every second actually being 2 seconds!)).

This is definitely not gonna work, even though it works technically. Nobody wants to wear this, and it will definitely fall apart soon. I’ve made a 3D model and will start 3D printing tomorrow to make sure I have a stronger structure for the watch, so I can start wearing it properly.

Not sure how it will fit my hand, will try it out asap. Also, I realised I forgot to include a part for the switch so I guess I’ll find a way……

Anyway, to recap the idea of this watch: it’s a watch that vibrates every 5 seconds (as of now; I will adjust accordingly to see what works during user testing, a.k.a. on myself), anchoring the user to learn the interval subliminally, allowing them to eventually time the interval without any aid or tools like a watch.
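The Arduino sketch itself really is just an if-else around millis(). Restated in Python purely for illustration, the logic is something like this (the interval is the 5 s test value mentioned above; the pulse length is made up):

```python
# Illustration of the Anchorwatch timing logic; the real prototype runs this
# as an Arduino sketch driving a vibration motor.
import time

INTERVAL = 5.0        # seconds between vibration pulses (test value)
PULSE_LEN = 0.2       # how long each vibration lasts (placeholder)

last_pulse = time.monotonic()
while True:
    now = time.monotonic()
    if now - last_pulse >= INTERVAL:
        last_pulse = now
        print("bzzt")              # stand-in for switching the motor on
        time.sleep(PULSE_LEN)      # stand-in for the pulse duration, then motor off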

Eye-to-eye Periscope 3D model

I’m gonna 3D print this as well so there is a proper and strong object to hold. I think I’ll just make a proper mockup of this so I don’t lose the idea even though it’s just a sensory experiment.

More ideas! More things to start…

Weather Sensograph

So I’m starting on a weather-sensing device, which I’ll call Weather Sensograph for now. The idea is to help us humans have an innate weather-detecting sense, like how the Anchorwatch helps us enhance our chronoception. This is done through a WiFi-connected device that uses weather API data (or any other reliable source of data; please tell me if you know of one, cos I’ve only ever used APIs). The device takes information from the API and converts it into kinaesthetic feedback (the feedback I think will be appropriate… for now). This new sense will help us evaluate the weather better, on top of just relying on our senses of sight, smell, and hearing (which aren’t applicable when we are indoors, which is most of the time).

I explained the idea to Mark and he came up with a good point: the device will help more if it can also give information about the weather that we won’t know without the device, for example where the rain is and when it will rain. I think giving access to this information would be a more interesting approach to the device, but I don’t have much idea of how I can do this as of now, as I’d need to know the direction the weather is approaching from relative to the user and where the user is facing for this to work. So far, the APIs (IF I’M NOT WRONG) do not provide that info.

I think one way to do this is to have some kind of seismograph-type feedback to the weather, to help us sense the weather’s intensity and direction. Mark suggests that I look up the hair-tension hygrometer. (THANK U MARK U R GR8 HALP)
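Just to make the data flow concrete for myself, here’s a minimal sketch of the software side. fetch_weather() is a hypothetical stand-in for whichever weather API or data source I end up using, and the 0–255 feedback strength is an arbitrary placeholder scale (e.g. PWM for a vibration motor).

```python
# Sketch of the Weather Sensograph data flow: poll a weather source, map the
# rain intensity to a feedback strength. All names and values are placeholders.
import time

def fetch_weather() -> dict:
    # placeholder: the real version would query a weather API for the user's location
    return {"rain_mm_per_h": 2.5, "condition": "light rain"}

def rain_to_strength(rain_mm_per_h: float, max_rain: float = 20.0) -> int:
    """Clamp and scale rainfall rate into a 0-255 feedback strength."""
    scaled = min(max(rain_mm_per_h, 0.0), max_rain) / max_rain
    return int(scaled * 255)

while True:
    weather = fetch_weather()
    strength = rain_to_strength(weather["rain_mm_per_h"])
    print(weather["condition"], "-> feedback strength", strength)
    time.sleep(60)   # weather changes slowly; poll once a minute
```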

I think it’s this thing? Image taken from https://manual.museum.wa.gov.au/book/export/html/89

Sound Watch

The idea of this is to change the way we tell time: with sound rather than visuals on a clock face. Recently I saw Aisen Caro Chacin’s Play-A-Grill and thought about bone conduction.

Play-A-Grill drawing. Image taken from http://www.aisencaro.com/play-a-grill.html

I’m thinking of mapping the hours, minutes, and seconds to different frequencies or sound cues, layering them, and using bone conduction to let us hear the sound. This will help us:

  • intuitively tell time just with sound cues
  • constant knowledge of time as the sound is always playing when one wears the device
  • Help the visually impaired

Imagine the applications once everyone learns this new way of telling time. Clocks around the world could also be tuned to the frequencies and played out loud so everyone can hear them (although this would be devastating for animals that rely on hearing, as well as for the power supply).

Money Sensor

One of the more applicable ideas I have now is this. Money has become a necessity, much like food. We can feel hunger and thirst, but not how much money we have. This can be something worth exploring, but I’ll work on the top few ideas first, then come to this one once I’m done.

Third-I

Looking back at my Interactive Devices project, I think I can try to simplify it and make it part of this series of devices.

A common theme?

The ideas I have so far are only connected by 1 factor: an “I think they will be useful” statement. I haven’t linked them to the big-picture concept, which is the cyberpunk future. I think that will always be at the back of my mind, but like I said before, I’ll just start making all these ideas a reality first and see what I can do afterwards. Actually, these prototypes don’t take a long time to create, so I can just keep making and think along the way.

What’s next?

  • Finish 3D printing Anchorwatch v2, finish second prototype and start wearing it; evaluate and adjust it along the way.
  • Work on Weather Sensograph, and then Sound Watch, then the money sensor
  • Continue ideating and read journal articles when I’m burning out from making. Also when I’m free or available to multitask, I’ll watch videos and sci-fi films.
  • Continue thinking about concept along the way and try to tie things together cos now everything feels very not unified
  • Start planning to interview people: I have this idea to interview or have conversations with people who wear implants or are interested in the topic of cyborgs, futuristic concepts, speculative design.

Recess Week Updates

Progress Thus Far

  • Done research & reading on Cyberpunk genre
  • Not done moodboard (rescheduling to another time to focus on, as it’s not important now after reviewing)
  • Sorted out and filtered journal papers that are relevant
  • Made eye-to-eye periscope, working on vibrawatch prototype
  • Read Sensory Pathways for the Plastic Mind by Aisen Chacin
    • Realised I’m doing something very similar to what she is doing, but with different intentions. She wanted to critique the idea of sensory substitution devices as assistive technology, and wants to normalise the use of these devices. For me, I want to go beyond normalising: I want to pretend that it is already the norm, and imagine what people do with it.

Next focus

  • Finish vibrawatch prototype
  • Sketching of ideas so I can present them and visualise them
  • Play with creating said ideas and test them
  • Read when I have time
  • Focus on sensory experiments and exploration with current few ideas

I have changed my focus from research to hands-on work after talking to Bao and Zi Feng, who gave me a lot of good advice!

Cyberpunk

I went to research a bit deeper into the cyberpunk genre. This was done through reading Tokyo Cyberpunk: Posthumanism in Japanese Visual Culture which, to be honest, was not very helpful, as it was a lot of text deeply analysing different films or anime that tackle different cyberpunk themes. What I found useful were the references to certain films or anime that I can search online for video clips or photos to refer to. This is speaking in terms of aesthetics.

After reading the book, I realised there is more to the genre than just its aesthetics (I mean, of course, I just didn’t really think about it). I went to watch some videos on the topic and came up with some insights:

  • The cyberpunk genre revolves around themes such as mega corporations, high tech low life, surveillance, the internet of things, identity in a posthuman world (man / cyborg / machine, where is the line drawn?), and humanity and the human condition in a tech-driven world.
  • The aesthetics are not the main point of the cyberpunk genre, as I previously thought. It is all about the themes; that’s why many films that are “cyberpunk” may not have the iconic aesthetics of neon holographic signs, megacities, flying cars, etc. The cyberpunk genre can exist in the era we are living in now and does not necessarily have to be set in the future.
  • It might be more realistic to look at the current world and set the “look” accordingly, rather than referring to the 80s vision of the future, a.k.a. what most cyberpunk depictions look like.

So why is this info useful for me?

I think it’s good that I actually tried to discover what cyberpunk really is and at least got a brief idea of it. I can try referencing certain themes that apply to Singapore and see what works best. What will Singapore be like if it is set in a cyberpunk world? This does not only affect the aesthetics of my installation, it also affects the themes I’ll be covering, which can narrow my scope a bit more. In the end, I hope the installation will be realistic and aesthetically convincing as well.

Also, I got my hands sort of dirty

I had the idea of a periscope that lets you see your own eye. This is more of a perception experiment than anything really important.

As a thought experiment, I was wondering what we would see if one eye sees the other, and vice versa. Turns out, nothing special. You see your eyes as though in a mirror — except the mirror is not flipped. I can’t think of an application, but it is interesting to play with. If you close one eye, you can see your closed eye with the other eye.

I showed this to Zi Feng and Bao, and Zi Feng mentioned that if the mirrors were the scale of the face, you could see your true reflection with this device, with only a line separating the middle, which I think is an interesting application for a mirror that shows your actual appearance (how other people see you).

But yeah, it’s not anything super interesting. I just wanted to start working on something haha.

Anyway, I’m also working on the vibrawatch, but it’s still WIP.

What’s next?

I’m gonna start experimenting with making sensory prototypes. I’m also starting on a device that responds to weather immediately after I’m done with vibrawatch.

007 – FYP Progress

Before we start…

I want to apologise for the long post. I treat OSS as my platform for a filtered train of thought. I’ll be updating my thoughts on my own Notion page as well, but I prefer writing here so I can write something proper that can be posted.

Progress Thus Far:

  • Finished reading Cybercognition (not useful)
  • Finished reading Sensory Arts and Design (useful)
  • Finished reading Design Noir: The Secret Life of Electronic Objects (very useful)
  • Watched Akira (not very useful)

Also, I’m tracking my own progress on Notion so I guess I’m opening it up to prof and friends here to take a look:

My ever-changing timeline

My ever-evolving reference sheet

Feel free to take a look and give comments!

Not completed:

  1. Moodboard
  2. Tokyo Cyberpunk
  3. Sensory Pathways for the Plastic Mind Book
  4. Speculative Everything
  5. Journal Readings

SO…

Reading helped me get a deeper understanding of the topics I’m covering. I’m clearer on what I want and what I don’t. I won’t be reviewing the books in this post, cos to be honest, some of them weren’t helpful at all and were skimmed within an hour.

The only parts I found useful were a chapter in Sensory Arts and Design talking about extra senses, and many interesting concepts, approaches, and works from Design Noir. I really like Design Noir, as the approach to “noir” design taken by the book really resonated with me. I think it helped me decide how I will approach the different devices in my FYP.

I didn’t have enough time to read everything, which was quite expected. Anyway…

Throwback: 2 weeks ago I wrote that by the end of these 2 weeks, I will have:

  1. A good grasp of the concepts I’m trying to use and an understanding of how artists apply these concepts to their work. This is done through all the readings.
    • My answer: I did, and I’m happy 😀
  2. Made enough observations and gathered enough ideas to start creating a proper picture and narrative of what my work will be like. This is done through 2 weeks’ worth of observation and ideation.
    • My answer: I guess I did, but I’m not happy with how little I have thought of

So I can say my goals from the last 2 weeks were completed, even though the tasks are not.

Some thoughts probably induced by coffee

Balance out humanity with tech: why do we prefer drinking coffee to directly injecting caffeine? Isn’t that more sustainable? But drinking coffee adds so much value to our humanity because it is satisfying to drink, it encourages social behaviour, and the process is enriching to our souls.

What can my products do to enrich people’s souls? Or at least, retain their soul?

Complex processes help us feel productive and fulfilled. That’s why musical instruments or tactile objects make us feel good. It must be tangible.

When we can feel something, we feel connected to it. Manual car vs auto car. When someone else does it for us, it’s great for us in terms of productivity, but it lacks a certain value. That’s why we love analogue and not digital.

Then, how can we simulate an analogue response in a digital context?

It’s also relevant to think about when our minds do code-switching. When we are on a digital interface, we want what’s smoothest. When we are on a physical interface, we want to work for it (only if it adds value).

To sum it up: if it adds value to our life, we want to do it the manual way. If it doesn’t, we want someone else to do it for us.

There is a need for stimulation and simulation in order for sensory replacement to work. It can be subconscious, but it must include active input from the brain. This can be seen from me dreaming of Overwatch. It can also be seen in how we bring our hand up to check the time, or when we adjust our spectacles, or when we wake up to turn off an alarm. It. Must. Be. Conscious.

From this thought, I kinda know that my devices have to let their users actively do something, to work for the information they want to get. That said, Vibrawatch might not be cut out for this cos it’s very passive.

Updates to Concept

New Inspiration

As the weeks go by and as I continue reading, I realise that my focus has shifted from something more technical to something more conceptual. Instead of wanting to do something relating to sensory substitution and all the specific phenomena, I started thinking further about why I want to do that, and what I’m trying to get at. I started out curious about my alarm problem, then thought about how this can affect the way we use objects to integrate with our senses, to how we can imagine this type of technology fitting our needs in the future.

I realised that sensory substitution or addition is only a supporting part of my concept (although I still want it to be something all my devices have)

For now, I’m interested in this thing called “Notopia”.

Notopia is a term mentioned on the first page of Dunne & Raby’s book “Design Noir”. I can’t find it anywhere except in a few architectural articles that define it as:

“a consequence of the cold logic of market forces combined with a disinterested populace”, “Characterized by a “loss of identity and cultural vibrancy” and “a global pandemic of generic buildings,”

In the book, it is a state of the world where we are given the illusion that technology is the solution to every problem, ‘force-fed’ by ‘corporate futurologists’; where, as technology develops, human behaviour continues to be controlled and predictable, reinforcing the status quo instead of challenging it.

This idea reflects, in my opinion, the reality of consumerism now: products designed to fit our needs and, to put it crudely, to pacify us. We value convenience and ask for products that help our situation in the status quo. Realistically speaking, this works because we can’t suddenly change our behaviours.

But looking to the future, how can this affect us? Can not doing this affect us?

In the book, the solution to Notopia is to subvert the use of daily products by hacking and abusing other qualities of the products that may not be intuitive at first sight. Dunne and Raby created the “Placebo Project”: 8 devices that create a placebo effect, so people feel comforted around objects that give off electromagnetic waves (which many people think are harmful to us). This is done through the subversive use of everyday technologies, like lamps that switch on when near heat, or compasses embedded in tables that detect magnetic influences.

The project was done with a separate intention (to study “The Secret Life of Electronic Objects” through the interactions), but thinking about it, I think there is some relevance to re-imagining new ways of using existing objects.

I thought this concept is relevant to my project, but now I’m having doubts. So let me think this over, while I go through what I know for certain I want to do.

Properties of my concept

  • My current space is “cabinet of curiosities” styled room of a not-so-far future individual.
  • The inhabitant of the room is a Singaporean youth in the not-so-far future.
  • The aesthetics I’m going for is inspired by the Cyberpunk sci-fi genre as it fits the themes I’m covering (post-humanism). It will be the more “hyper-city” like type of Cyberpunk, rather than the grungy kind; more “utopian”.
  • There will be elements of Singapore hinted in the room that creates a sense of home, but also feels foreign.
  • The devices I’m making will be devices for the sensory-augmented humans of the future. They must work. They must be interactive (visitors can wear them).

Updated one-sentence description

[WIP]

Keywords I will be using

  • Sensory Substitution / Addition
  • Embodied / Embedded Cognition, Enactivism
  • Neuroplasticity
  • Speculative Design
  • Neuro Linguistic Programming
  • Anchors
  • Critical Design

Devices

  • I want my devices to be non-invasive
  • I want my devices to have sensory elements, either in augmentation or substitution that may or may not alter perception
  • My devices must be related to a current-day anchor, like an evolved form of it
  • My devices have to be interactive and wearable

Goals for next 2 weeks:

Recess

  • Read Cyberpunk Tokyo
  • Read journal article and scientific articles that may help expand (I know I’m doing too much reading and too little doing. I will watch myself)
  • Updating, refining of concept and clear arrangement of information.
  • Moodboard (really start on it!)

Rationale: I just feel like I might miss some information if I don’t finish reading what I set myself to read. But yes I will definitely watch myself now and not let myself go like the last 2 weeks.

To me, establishing a clear and good foundation is important. I want to sort everything out properly as I’m still feeling uncomfortable about some parts of my concept.

Week 8

  • I will have a clear idea of everything by now. So clear that I’m confident.
  • Create first prototype of whatever product I’m making
  • Start ordering things I need
  • Finishing up all research. (This doesn’t mean I’ll stop researching, just that all the backlogged research has to be completed.)

Goals by end of these 2 weeks

  • Talk to seniors, profs, etc. for feedback on my concept and ideas
  • (maybe) start thinking of interviewing people who know the subjects I want to cover, and also people who are using devices that substitute senses
  • I will get my hands dirty finally
  • I will be able to confidently tell people what I’m doing for FYP

Long-term goal (timeline)

When all my research and ideation is done, I will be going into prototyping. During October, I will be doing a lot of prototyping and testing, hoping to get conclusive results by the end of October to go into November AKA Phase 2 for me.

In phase 2, everything will be more urgent and critical. Real products will be made. Plans for grad show will be made.

In phase 3 (February), I should have most of the stuff I need ready and working. This is where I do all the refinement, writing, exhibition, competition, admin stuff, etc.

See my timeline on Notion for more details

006 – Week 6 Updates

Progress

Tasks completed:

  1. Devices That Alter Perception 2010 Book: Obtained relevant information and insights
  2. A Tour of the Senses Book: Obtained relevant information and insights
  3. See Yourself Sensing Book (Partially)

Not completed:

  1. Moodboard
  2. Tokyo Cyberpunk Book
  3. Sensory Pathways for the Plastic Mind Book

I would say I’m still on time with my readings, just got to keep going. I’m also making observations and writing notes down as I go.

Updates on Concept

Currently my concept is:

Conditioning from everyday activities creates new instincts for adaptation – through speculative design, I intend to playfully explore sensory augmentation interactive devices, wearables, and prostheses that inspire new adaptations to potential future scenarios stemming from current-day routines.

Building upon this, I want to build a set alongside my devices: a room of a future Singaporean young adult that houses the devices they use daily. The idea is to bring visitors into my vision of the future of sensory augmentation, kind of like looking at a slice of how future youngsters will live, and how our current actions have affected their lives and the technology they depend on (one big example is climate change).

I envision the installation to be a room that contains local elements. Some typical ones are chou chou pillows, a standing fan, a tear-off calendar, and many other humble, common items.

Chou chou pillow and blanket is a typical childhood item of Singaporeans
Tear off calendar is still popular in households

But all these come with some twists: the calendar may be a projection, the chou chou could have sensors in it, the room is filled with gadgets and futuristic objects, and neon-coloured lights are cast on the interior.

Futuristic Dystopian Apartment by Kamen Nikolov : Cyberpunk

All these are just thoughts for now. I know that cyberpunk is quite an overused genre and, to be honest, that’s not really how the future is headed (at least, not the aesthetics of the above image). So I will fine-tune it to be more realistic to our current timeline.

Why cyberpunk: there are rarely depictions of cyberpunk aesthetics in a Singapore context; usually it’s Japan, Hong Kong, or Korea. I want to explore that in our local context, taking cues from local references like HDBs, chou chou, the Merlion, food, objects, etc. Cyberpunk aesthetics are often linked to cyborgs, post-humanism, and robots of the future, and that entails the type of devices my concept brings about.

There is more to think about for this idea, cos I definitely need to design the set so it looks convincing. But as of now, it is like this.

Summary of this new part and why I think it adds value to my project:

I want to use this room to immerse my viewers in this new world I’m building (futuristic, somewhat cyberpunk, relatable to us), with the new devices I’m making that can convince them that they exist for a good reason. In this way, I’m still going with my “cabinet of curiosities” idea, except that it is in a themed room.

Update on Ideas I want to explore (in terms of devices):

  • Non-invasiveness: All my devices must be non-invasive. This is because it’s easier to present them, and I don’t get into trouble. Plus, I’m also not comfortable with making invasive devices.
  • Vibrawatch: The watch that vibrates to tell time
  • Personal space bubble that tells its user about their surroundings using tactile responses
  • Weather detection: Based on the old wives’ tale of people predicting rainy weather when they feel pain in their arthritic knees, a device attached to users’ joints that responds to weather changes based on API data
  • Money sensing: A device that translates bank transactions into a sensory organ
  • Self-Anchoring Device: as my concept is about anchors, I think there can be a device that lets its user create their own anchors
  • Seeing your eye with your other eye: Not related to my concept actually, but I’m just curious what happens if I build a device that connects one eye to the other using a tube and mirrors. What will we see lol
  • Gas Detection: Our nose can’t detect certain gases. A device that uses certain feedback to warn us of dangerous gas concentrations, especially relevant if the future is all pollution
  • Home-Body Integration: We are always fantasising about robots doing chores for us; what if we have devices that integrate us into the house, so we can manage our home from within it?
  • Anchor Alarm: An alarm that teaches us to anchor to waking up. Related to circadian rhythm, perhaps fixing it to be more modern than primitive (the idea is that circadian rhythm sometimes limits our lifestyle; can we create a new artificial circadian rhythm?)
  • Health Diagnosis: An organ that detects degrading health

Book Reviews

Review of Devices That Alter Perception 2010:

The book has a good amount of relevant concepts that I can reference and take inspiration from. However, as these concepts are from 10 years ago, I wonder if there are any newer sense-altering concepts being done nowadays. I am worried that this has been done so many times before that it becomes “old” and uninteresting. Still, this is very good information for me and I am thankful that this book exists. (Also great that the book endorses lo-fi prototypes that work.)

Some interesting artists and their concepts that I’m drawing reference from:

Ong Kian Peng: Objects For Our Sick Planet:

  • Flood helmet: Contextualises a future of flooding due to sea levels rising. The water level changes based on GPS location. Also adds weight → pressure on the wearer’s face.

Susanna Hertrich, Gesche Joost: Automatic Anchoring Armour

  • Bio feedback and mental conditioning.
  • Similar to anchoring technique used in Neuro Linguistic Programming (NLP)
  • Taps repeatedly in the same spot on the forearm to trigger positive emotions when feeling stressed
  • Anchoring, therapy, Instant therapy to cultivate sense of calm in anxious situations
  • Critique point, and also emotional pacemaker

Susanna Hertrich: Synthetic Empathy: Somaesthetic Body Actuation As A Means of Emotional Evocation

  • Novel means of emotional evocation: somaesthetic body actuation
  • Design to help us understand societal issues from critical point of view
  • Connect emotional reactions of bad news but actuated by computers and machine
  • novel ways of bodily actuation, low-tech prototyping as exploration for new ways of interaction
  • Uses sensory substitution as an add-on concept to more conceptual part of their work
  • Using low-tech prototypes to explore new ways of interaction

Devices That Alter…. A Potted Inquiry by Danielle Wilde (bolded because super relevant imo)

  • Artificial organs to improve our senses
  • Art to bring cyborg aesthetics and transhuman concerns into everyday life
  • Step away from extreme artists like Stelarc or Orlan: we see how our lives have already integrated cyborg aspects, like people wearing cochlear implants or prosthetics. EEG mind-controlled games also exist today.
  • Perception altering devices are more present in arts than in tech or irl
    • Lygia Clark: Sensorial Hoods (1967)
    • Walter Pichler: TV-Helmet (1967)
    • Haus-Rucker-Co: Environmental Transformers (1967, 1968)
  • Wild, entertaining, stimulating, provocative proposals aimed at prompting us to reflect on the future we would like to live in.
  • If we gain a body part, our neural map expands accordingly
  • If you can have wings, you would develop a winged brain. Our bodies change our brains. It is infinitely mouldable
  • But: Does our neural network map expand when we wear temporary devices that alter perception? How much wear or use is required to shift what our brain considers to be the norm?
  • “Why” is a question people always ask when confronted with wearable devices that alter our perception
  • So why do we continue along this pathway of creating devices that alter perception? What drives the artists and designers, engineers, and tinkerers? What use can the device possibly be?
    • Do we want something thats going to be “neat” for only 15 mins, or something that will permanently enrich our lives?
    • Are propositional devices enough to raise provocative questions?
    • Do we need to make the objects and experiences being proposed?
  • Good design doesn’t just look beautiful, it acts differently and makes people who use it act differently. How can we then discern what makes a DAP well designed?
    • Know the constraints within which you are designing
    • Be empathic – think seriously about the way you feel and the way that people around you might feel and use it as inspiration
    • Challenge precedence, break through the way things have been done
    • Prototype your idea, try it, put it out there, test it
  • Qns:
    • What constitutes usefulness in the context of DAPS?
    • Is design for debate, in relation to DAPs, effective, and thereby useful?
    • What defines a DAP?
    • How do we evaluate such devices?
    • How can we bring rigorous methodologies to their development, evaluation, and distribution (whether the final embodiments are prototypes, products, or scenarios)?
    • How can designers, artists, theorists, makers, engineers, technologists cross-fertilize in meaningful ways and thereby enrich their enquiry and ultimately the DAPs they are developing?

Tomoko Hayashi and Carson Reynolds – Empathy Mirrors

  • Using experiences as a mirror for emotions.
  • Twin cue

Alvaro Cassinelli – Earlids and Entacoustic Performance

  • Earlids as if it is like our eyelids
  • The body and its sensory organs always modulate the external sound field in one way or another
  • Unconscious processing of the environment in the body → becomes apparent to the user
  • User relearns the new artificial auditory sensory-motor contingencies
  • Interesting: How do our bodies react to different stimuli?

Review of A Tour of the Senses:

A Tour of the Senses is kind of like a textbook that explains different aspects of senses broken down into 3 main chapters: stimuli, sensation, perception. It breaks down all the different senses in different categories (Electromagnetic, Chemical, and Mechanical) in a clear and informative manner. The types of senses covered are:

  • Vision (Sight)
  • Olfactory (Smell)
  • Gustatory (Taste)
  • Auditory (Hearing)
  • Somatosensory (Touch, Temperature, Pressure, Pain)
  • Proprioception (Space, position, location)
  • Vestibular (Balance)
  • Nociception (Pain)

The book explains what each of the senses is and how it works, and then how we perceive through the use of these senses. The book also gives various examples of how we use those senses, as well as animal examples, like how snakes use the pit organ to sense infrared.

The book also mentions interesting points like the cochlear implant, transduction between stimuli and sensation, and how perception is different based on culture and education. This helped me to understand our senses better, which gave me more confidence in exploring those senses in my concept.