A brief introduction to our project, Sensorium:
“Sensorium” is an installation aimed at letting people experience synesthesia – the interstice between the senses.
How our project works:
Three ordinary, everyday objects will be placed on the table around a central fish tank. Participants interact with the objects by picking them up, which in turn causes 1) a dissociated sound to play and 2) a colour of ink to drop. Each sound and colour is specific to the object (and deliberately unrelated to it!).
The final outcome is a layering of sounds in the background and a murky tank, which represents synesthesia as an overload of the senses.
1. Numerical Representation
This means that a new media object can be described by digital code and mathematical functions, and subjected to algorithmic manipulation.
For our project Sensorium, we have digital code that translates light into values through a photoresistor (LDR). As the participant lifts the object, the LDR senses a value and sends it to the Arduino. The Arduino reads the value and does two things! If the light value is greater than our fixed threshold (in our case 430), 1) it sends the information over to Processing, which emits a sound, and 2) it sends a signal to turn on our solenoid, which causes ink to drip into the tank.
2. Modularity
Each piece is stored independently, allowing it to be manipulated and modified separately and at the same time.
Modularity is found in the Arduino code we write. It is made up of different instructions and functions – digitalWrite(), delay() and so on – that run when put together, and each can be modified independently.
Hardware-wise, our project is modular because we can change the input – to a switch, a pressure sensor, etc. We chose an LDR, but even if we swapped the input, the outcome would still be the same (ink and sounds).
We can also change the objects themselves (e.g. sitting on a chair vs. picking up an apple) and the outcome would still stay the same.
3. Automation
The new media object is able to generate certain effects without human interference: you can leave the installation there and it still performs on its own. In other words, the programme has some intelligence of its own.
Once our code is loaded onto the Arduino, we can just sit back and enjoy, because the Arduino takes care of everything – from producing the sound to making sure the ink drops! (Thank you, void loop()!)
4. Variability
This builds upon automation: new media objects are never fixed, and different contexts and different spaces affect the outcome.
A new media object is not cast in stone. If you go to a photograph exhibition, for example, nothing changes – old media doesn't change, but new media does. New media changes over time.
Our project offers a fair degree of variability to the viewer. First, the order in which the objects are picked up changes the order – and hence the result – of the sounds heard. It also changes the order in which the ink drops, causing the colours to mix slightly differently each time. This creates variability, albeit limited. In addition, the sound plays for longer or shorter depending on how long the object is held, and the ink drips more or less accordingly.
5. Transcoding
Transcoding is a method or process that changes one object into another, or a movement of data between formats. There are two layers to a new media object: a cultural layer and a computer layer. We take concepts from the cultural layer (what we encounter in our everyday lives) and computerise them – for example, scanning an image turns a photograph into pixels. Once these cultural artefacts go through computerisation, their meaning changes: there is a blend of the human and the digital.
For instance, we take a physical object from everyday life, like a hi-fi, and transcode it into a digital form like Winamp – the connection between the physical object and the computer interface makes it familiar to us.
We use everyday objects in our project, and people react to them culturally: when someone sees a chair they are inclined to sit on it, and when they see a block they want to pick it up. This cultural response is then translated by the computer, which produces different results through our code – the sounds played and the ink dripping.