IM showcase video
IM showcase gallery
Sensorium questions the interstices between our senses, connections we often take for granted, through the sensory phenomenon known as “synesthesia”: a condition where stimulation of one sensory or cognitive pathway leads to automatic and involuntary experiences in other senses. In short, an experience of a single sense is simultaneously perceived by other senses, resulting in confusing outcomes for the individual with this condition.
In this installation, we aimed to draw attention to this gap by recreating the condition, allowing visitors to experience a similar dissociation between their senses through a sense of unfamiliarity with objects they are used to interacting with. This “sensory overload” is created through the unfamiliar and unexpected sounds heard, and the coloured “ink drops” seen dripping into the tank, when visitors interact with the objects on the stools.
Concept – artist references
We were also inspired by artworks that engage different senses and emit sounds through interaction. Some of our references are as follows:
Lenses by Hush
An installation that converts light-sensor readings and refractions into sound.
Another installation that plays on the idea of engaging different senses in the form of Synesthesia.
Objects used and sounds that corresponded:
- Box with Lid (with flowers on the inside) – Thunder
- Chair – Cat meowing
- Light switch – Toilet flushing
Characteristics of interface
As Sensorium is ultimately a participatory-driven installation, the viewers are an integral aspect of this installation. Thus, on the continuums of interactivity, we would place “Sensorium” close to the zone of High Interactivity, where the viewer’s actions and feedback ultimately determine the outcome of their experience with the installation.
For each object, we set a different condition in our code based on the values read from the LDR (sensorValue). For instance, for sitting on the chair, our requirement was that only if sensorValue <= 50 would the condition be true, turning the solenoid and sounds on. This meant that only when someone sits on our chair and the light value dips below 50 will the condition be true. For the other two, opening the box and flipping the light switch, our condition was sensorValue >= 100, because only if the LDR senses light (i.e. the box opening, or the light turning on) will the condition be true and the solenoid and sounds turn on.
We used what Lei taught us about Serial.println( ) in our code! Each object sent a different value via Serial.println(_), as a form of communication between Arduino and Processing. Basically, if an object’s sensorValue >= 100, the Arduino calls Serial.println(L). This value “L” is then sent over to Processing, which plays the chosen sound. We used Processing instead of an MP3 Shield because Processing could simultaneously take in the three different values from Arduino and play all three different sounds at the same time. We made use of the minim library in Processing to play the sounds easily!
some challenges and how we overcame them
To add on, we also ended up choosing objects with more dynamic interactions (opening a box, flicking a light switch), rather than just the action of picking up and putting down.
INDIVIDUAL ROLES AND REFLECTIONS
Jonathan – Head of Setup, Logistics and Concept
I think there weren’t huge challenges faced in this project, but rather many small glitches and problems that occurred throughout the process, such as the technical aspect of getting the sounds to work. I was initially supposed to handle that area, and we settled on using an MP3 shield to play the sounds for the installation. However, the MP3 shield unfortunately did not work due to some faults in the hardware and software. We decided to use Processing to solve the issue in the end, as Nasya had found a method to utilise it for our project.
There were many hiccups in the setups as well, from getting the LDR to work during the setups to how the droppers would run out of ink quite often. But generally I think we were able to work around the limitations and created a very interesting and fun experience for our viewers. We were cracking our heads to come up with a strong dissociation between the objects and sounds, but realised that an association could always be created regardless, and that is actually a human condition as well: the tendency to draw connections and create associations. The experience created by the objects and sounds added a dimension of humour that we didn’t think it would bring, and I thought that was quite interesting. 🙂
Daryl – Head of Hardware, Arduino Technician, Aesthetic Advisor
When we first started the project, we were bent on creating a big and extraordinary audio-visual experience in relation to synesthesia. However, as we progressed through the project, we learned that synesthesia is more of a day-to-day experience which synesthetes have. Thus we worked towards the idea of giving everyday objects a response in dissonance with the objects in question, which we finally chose: a chair, a box and a light switch.
We encountered many little hiccups during the conceptualisation of Sensorium: circuitry issues (we almost fried Nasya’s MacBook), programming issues, and a lot of debugging, be it in the software (Arduino, Processing) or hardware (droppers, solenoids, mechanisms; we had to find the correct inks to use too).
We completed the project in the nick of time, and we were so happy that it all came together at the end. When our audience was testing and playing with Sensorium, creating the sounds and ink clouds simultaneously, it almost blew our minds. It actually turned out better than we expected. I feel that Sensorium has fulfilled its purpose: to create dissonance between everyday objects and their expected responses, and thus portray what a synesthete could potentially experience in his or her daily life.
Nasya – Head Programming, Processing Maestro, Arduino Extraordinaire
Overall, the project felt like one very smooth journey! Each member owned their role, and as such Sensorium was pieced together very nicely. I was quite amazed at how far 13 weeks got us, from knowing nothing about Arduino to being able to code according to what our project required. I remember it was initially very hard to code due to sheer unfamiliarity, but as the weeks passed, it was easy to grow more accustomed to the coding language and get Arduino and even Processing done. Here’s some work-in-progress! You can really see the improvements coding-wise 🙂 A lot of the final code was adapted and improvised to suit our needs, based on earlier code that we learnt from class and from the Arduino Projects Book!
Here’s our code from the initial buzzer and LDR adapted to fit the solenoid!
Here’s the code from towards the end, when we realised we needed a way to play the sound together with the solenoid movement, thereby replacing the buzzer. We managed to get it to work with Processing and were ultra excited!
It was great that there was a progression, a growth in our code: we did not suddenly write it all out overnight, but rather looked through our code weekly and tweaked it to suit our project needs. Overall, because of consistent work, we managed to do the project well!
Lastly as a bonus, here’s the behind-the-scenes/ inside-the-box of Sensorium.
Thank you for reading 🙂
Pang gang lo~