The objective of this project is to make people aware of how dependent we are on our sense of sight. To that end, participants will walk around without their sight, relying only on their senses of hearing and touch.
They will be blindfolded and placed in a space with four speakers, one in each corner, all playing the same sound file.
To guide the blindfolded person, a friend will use a gyroscope as a controller for the sound from the speakers; the blindfolded person heads towards wherever the sound, steered by the friend, appears to come from.
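Since the actual patch is built graphically in Max/MSP, the steering logic can only be sketched here in ordinary code. A minimal sketch of how a gyroscope heading could be turned into per-speaker gains, assuming the four speakers sit at the corners (45°, 135°, 225°, 315°) and using equal-power panning (function and speaker layout are my assumptions, not the project's actual patch):

```python
import math

def quad_gains(angle_deg):
    """Equal-power gains for 4 corner speakers, given a source direction.

    angle_deg: the direction the sound should come from, e.g. a gyroscope
    heading in degrees. Returns one gain per speaker; total power is 1,
    so the perceived loudness stays constant as the sound 'moves'.
    Speaker positions (45, 135, 225, 315 degrees) are an assumption.
    """
    speaker_angles = [45.0, 135.0, 225.0, 315.0]
    gains = []
    for sa in speaker_angles:
        # angular distance between source and speaker, wrapped to [0, 180]
        d = abs((angle_deg - sa + 180.0) % 360.0 - 180.0)
        # only speakers within 90 degrees contribute, with a cosine taper
        g = math.cos(math.radians(d)) if d < 90.0 else 0.0
        gains.append(g)
    # normalise so the squared gains sum to 1 (equal-power panning)
    norm = math.sqrt(sum(g * g for g in gains)) or 1.0
    return [g / norm for g in gains]
```

For example, a heading of 45° sends everything to the first corner, while 90° (between two corners) splits the signal evenly between the two nearest speakers.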
Objects will be attached at certain points in the space to give the blindfolded person instructions on how to move through it, e.g. crawl or sidestep.
Adapt/incorporate ideas from Don Ritter's Intersection.
As of now, using ambisonic patches with a Teabox in Max/MSP, I was able to 'move' the sound around the four speakers; two sensors – a bend sensor and a distance sensor – are used to toggle between different sound files.
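The sensor-to-sound mapping lives inside the Max/MSP patch, but its logic can be sketched as a simple threshold rule. The filenames and threshold values below are placeholders of my own, to be tuned against the real bend and distance sensor readings:

```python
def pick_sound(bend_value, distance_cm,
               bend_threshold=0.5, near_cm=50.0):
    """Map the two Teabox sensor readings to one of several sound files.

    bend_value:  normalised bend-sensor reading in [0, 1] (assumed range)
    distance_cm: distance-sensor reading in centimetres
    The thresholds and filenames are illustrative placeholders.
    """
    if bend_value > bend_threshold and distance_cm < near_cm:
        return "sound_c.wav"   # bent AND something near: third cue
    if bend_value > bend_threshold:
        return "sound_b.wav"   # bent only: second cue
    return "sound_a.wav"       # default cue
```

In the patch itself this would correspond to comparison objects gating which file plays, rather than literal Python.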
Instead of being controlled by someone, I will try having the 'directions' generated by random numbers, so that the experience is less controlled.
Furthermore, with a motor and a sensor attached to it, the setup could act like a 'sentry': when it comes into contact with anyone while rotating on its axis, it triggers another sound, which in turn prompts the participant to perform another action.
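The sentry behaviour above amounts to edge detection on a contact sensor while the motor spins. A minimal sketch, assuming a polled contact sensor and a sound-playing callback (both stand-ins for whatever the Teabox and Max/MSP actually provide); the sound is only triggered on the moment of contact, not continuously while touched:

```python
import time

def run_sentry(read_contact, play_sound, steps, poll_s=0.0):
    """Poll a contact sensor and trigger a cue on each new touch.

    read_contact: callable returning True while someone touches the arm
    play_sound:   callable that plays the cue for the next action
    steps:        how many polls to run (the real loop would run forever)
    Returns the number of triggers, for checking the edge detection.
    """
    triggers = 0
    touching = False
    for _ in range(steps):
        hit = read_contact()
        if hit and not touching:           # rising edge: a new contact
            play_sound("next_action.wav")  # placeholder cue name
            triggers += 1
        touching = hit
        if poll_s:
            time.sleep(poll_s)
    return triggers
```

The rising-edge check matters: without it, a participant leaning on the rotating arm would retrigger the cue on every poll.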