This is the video documentation for our final project presentation. We were glad that it evoked responses from the people watching the interaction between the installation and its participant; many were taking videos and photos of the participant's actions. This was the ideal scenario we wanted to achieve: interaction between the installation and the participant, and a second layer of interaction between the bystanders and the participants.
Memories of Sound is an interactive sound installation that explores the relevance of sound in triggering human memories and experiences. We hope to understand and explore the connection that humans have with sound.
The users will wear colored bands on their hands, which serve as the trigger points of the interaction. The movements of the colored bands are detected by an external camera linked to the Max 7 patch, which prompts changes in the audio output supplied to the users through headphones. The change in audio allows the users to experience different sound contexts. We hope that this will trigger their personal interpretation of the sounds available to them.
We tried testing our interactive installation at the concrete wall. Initially we used light sticks as the light source: we took screenshots of the light sticks, sampled their color values in Photoshop, and input those values into Max so it could detect the colors. However, once we were in the open area, the camera could not detect the light sticks, because the ambient light outdoors affected the readings. In the end, we used colored paper instead.
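The matching logic behind this can be sketched as follows. This is an illustrative Python sketch, not our actual Max patch: it assumes a simple Euclidean distance in RGB space with a hypothetical tolerance, and the sample values are made up for the example. It also shows why bright ambient light broke detection outdoors: washed-out readings drift far from the sampled target.

```python
import math

def matches_color(pixel, target, tolerance=60):
    """Return True if `pixel` is within `tolerance` (Euclidean RGB
    distance) of the sampled `target` color. Tolerance is an assumption."""
    distance = math.sqrt(sum((p - t) ** 2 for p, t in zip(pixel, target)))
    return distance <= tolerance

# Hypothetical target sampled from a Photoshop screenshot of a light stick.
LIGHT_STICK_GREEN = (80, 220, 120)

# Indoors the stick reads close to the sampled value and matches:
print(matches_color((85, 210, 125), LIGHT_STICK_GREEN))   # True
# Outdoors, ambient light washes the reading toward white and it fails:
print(matches_color((230, 250, 235), LIGHT_STICK_GREEN))  # False
```

Colored paper worked better outdoors because its readings stay closer to the sampled value under ambient light than a glowing stick does.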
It was fun testing out the installation ourselves. Through the testing we determined the optimum distance a person had to stand from the camera for the patch to work. We also tested different combinations of sounds to understand how they would affect the user trying out the installation. We just hope it will not rain on the day of the final presentation.
The following are screenshots of the patch we built for our interactive installation.
The following is the video documentation of us trying out the Max patch. We used light sticks and several colored materials to test out the patch.
The main issue with our patch was that the webcam did not detect the colors at times, which was frustrating because we could not tell whether the fault lay with the patch or with the camera. To resolve this, we borrowed a more reliable external webcam, which made the color detection more consistent.
For our final project, we decided to shift away from the idea of an interactive installation depicting the experience of walking through a cave in a dark room. We feel that the cave experience is clichéd and does not add meaning to our project of using sound to create an experience.
Instead, we decided to experiment with using sounds to create a narrative experience for users. For example, the sound of wind can mean a storm is approaching, or it can mean a breeze from the sea. If another layer of sound is added to the context, it can take on an entirely different meaning. We feel it will be interesting to see how people react to different sounds and how they feel about them.
Therefore, we decided to experiment with different sounds to create different contexts for the users. We will blindfold the users experiencing our installation so that the impact of the sound is enhanced. The users will also wear headphones to block out distraction from the surrounding sound, allowing them to experience only the sound and nothing else.
After looking around, we decided to use the space at the concrete wall in ADM for our installation. The uniformly colored wall serves as an excellent background for the color tracking in our Max patch, and the open space should attract interest from passers-by while users are interacting with the installation. We also tested the volume needed in the headphones to cancel out the surrounding sound.
As our project requires colored mediums and body movements to trigger audio output as the form of interaction for our planned installation, we tested some patches to explore various possibilities.
Triggering audio output by recognizing the position of a held object (a yellow object) on screen: a different audio output is produced when the object is at a different position.
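The position-to-sound mapping in that test can be sketched roughly as follows. This is an illustrative Python sketch of the idea, not the Max patch itself; the frame width, zone count, and clip names are all assumptions for the example.

```python
# Assumed camera frame width in pixels (illustrative).
FRAME_WIDTH = 640
# Hypothetical audio clips, one per screen zone.
SOUNDS = ["wind.wav", "waves.wav", "rain.wav"]

def sound_for_position(x, frame_width=FRAME_WIDTH, sounds=SOUNDS):
    """Divide the frame into equal vertical zones and return the clip
    assigned to the zone containing horizontal position `x`."""
    zone = min(int(x / frame_width * len(sounds)), len(sounds) - 1)
    return sounds[zone]

print(sound_for_position(50))    # left of frame  -> wind.wav
print(sound_for_position(320))   # middle         -> waves.wav
print(sound_for_position(600))   # right of frame -> rain.wav
```

Moving the tracked object across the frame therefore switches between clips, which is the behavior we observed in the patch test.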
Now that we have solved the issue of triggering audio output, we will be creating and exploring different audio clips suitable for our interactive installation.
We feel that the audio we assembled should give users enough information to figure out the context of the space by themselves, instead of being told outright. We hope this form of information output will spark interest and encourage users to explore the space around them.
After much discussion and refinement, we have decided to create a cave experience through audio in a dark room setting. A confined dark room is a perfect metaphor for a cave. Each participant will be given light sticks to place on both ankles. These light sources serve as trackers for the cameras placed in the room, triggering cave-related sounds as the participant walks around the space. We feel it will be interesting to explore how people interact with a place when the element of sight is cut off from them. We are also planning to set up additional cameras so that people outside the installation space can view the interaction happening inside, allowing them to participate as well.
Four players will interact with each other as well as with the computers to create sound in a dark room. There will be a total of four computers, each with four colors corresponding to a certain instrument or beat. Players have to move nearer to a computer to trigger faster beats, and further away for slower beats. A player can switch lanes to trigger a different instrument on another computer.
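The distance-to-tempo mechanic above can be sketched as a simple linear mapping. This is a hedged illustration of the idea, not an implementation from our patch; the play range in metres and the tempo bounds are assumptions chosen for the example.

```python
# Assumed play range and tempo bounds (illustrative values only).
MIN_DIST, MAX_DIST = 0.5, 4.0   # metres from the computer
FAST_BPM, SLOW_BPM = 160, 60    # tempo at the nearest / farthest points

def beat_bpm(distance):
    """Linearly interpolate tempo from distance: near -> fast, far -> slow."""
    d = max(MIN_DIST, min(MAX_DIST, distance))     # clamp to the play range
    frac = (d - MIN_DIST) / (MAX_DIST - MIN_DIST)  # 0.0 at near, 1.0 at far
    return FAST_BPM + frac * (SLOW_BPM - FAST_BPM)

print(beat_bpm(0.5))   # standing right at the computer -> 160.0
print(beat_bpm(4.0))   # at the far edge of the range   -> 60.0
```

Clamping keeps the tempo stable if a player steps outside the expected range, and the same mapping would run independently per computer.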
As a starting point for our project, we will be researching patches that can track colors to trigger audio output.
This is one of the videos we found that talks about tracking a specific color to trigger sound.
*Update* We tested out the patch, and it works. The readings were not very accurate because there was a lot of surrounding light affecting them. Once we have a more refined patch, we will test it in a place with limited light.
28 March 2017
Once we get the color tracking system to work (hopefully), we will be exploring and experimenting with different combinations of sounds.
The following are some of the sounds we are looking into.
We feel that using everyday sounds to create music is an exciting area we can explore.
Chinese classical music is another aspect that we feel is an intriguing area that we can work on.
For our project, we are looking into experimenting with sound. We feel that sound is an interesting element that we constantly overlook because of its abundant presence in our lives. As sound is a non-visual element, it gives us a favorable opportunity to use senses other than sight, which we depend on so heavily, to feel and appreciate the spaces around us. In the process, we hope to gain a fresh understanding of how our bodies react to sound. We therefore want to use this opportunity to create an interaction with sound that is free of any outside distraction, and to find out about the significance of sound and its purpose in our daily lives.
The following are some videos that give a rough idea of the type of interaction we are looking into and hope to achieve.