Team: Darryl Lim, Issac Ting, Nathanael Goh


For our project we wanted to take our audio visualisation to the next level, and in our research we came across The V Motion Project.

Similar to our initial project and The V Motion Project, we wanted to use the movements of our whole body to trigger certain sounds and visuals displayed on a large screen or backdrop.

We then came across a video on Vimeo called Sonos Playground Deconstructed – Museum of the Moving Image.

We felt that this approach was more intuitive and immersive in terms of the space, as gestures could make the visuals move up, down, or forward, or stop. With the base visuals and audio in place, viewers and participants can create their own mix of the audio visualisation using their gestures.



For this project, our system uses gestures to manipulate the audio and visuals that will be projected onto our chosen location. The main gestures are moving the hands up, down, or forward in midair, and holding them still to stop.
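To illustrate, the gesture set above could be recognised along these lines. This is a minimal sketch, assuming a motion sensor (e.g. a Kinect-style skeleton tracker, which the post does not specify) that reports a hand position each frame as (x, y, z) coordinates in metres; the function name and thresholds are hypothetical.

```python
def classify_gesture(prev, curr, threshold=0.05):
    """Return 'up', 'down', 'forward', or 'stop' from two hand positions.

    prev, curr: (x, y, z) tuples; z decreases as the hand moves
    toward the sensor. threshold is the minimum per-frame movement
    (in metres) counted as a deliberate gesture -- a tunable guess.
    """
    dy = curr[1] - prev[1]   # vertical movement between frames
    dz = curr[2] - prev[2]   # movement toward/away from the sensor
    if abs(dy) < threshold and abs(dz) < threshold:
        return "stop"        # hand held still in midair
    if abs(dy) >= abs(dz):   # vertical motion dominates
        return "up" if dy > 0 else "down"
    # a push toward the sensor reads as "forward"; drifting back is ignored
    return "forward" if dz < 0 else "stop"
```

In practice the raw sensor data would be jittery, so some smoothing (e.g. averaging positions over several frames) would likely be needed before classifying.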



For the next three weeks, we have split the project into three sections, each of which will take one week.

In the first week we will focus on building the base visuals and the visuals that will be triggered by our gestures. We will also work on the interface that captures the gestures and maps them to the correct visuals.

In the second week, we will link the visuals to the interface and fine-tune the two components. We will also add the base audio and the audio triggered by each gesture.
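The linking step described above could be sketched as a simple dispatch table from each recognised gesture to its audio and visual trigger. All of the clip and effect names here are placeholders, since the post does not name the actual assets; the callbacks stand in for whatever playback and rendering layer we end up using.

```python
# Hypothetical gesture -> trigger mapping; asset names are placeholders.
TRIGGERS = {
    "up":      {"visual": "rise_particles", "audio": "pad_swell.wav"},
    "down":    {"visual": "fall_particles", "audio": "bass_drop.wav"},
    "forward": {"visual": "zoom_in",        "audio": "kick_loop.wav"},
    "stop":    {"visual": "freeze_frame",   "audio": None},  # base audio keeps playing
}

def on_gesture(gesture, play_audio, show_visual):
    """Dispatch a recognised gesture to the injected audio/visual callbacks.

    Returns True if the gesture was handled, False if it was unknown.
    """
    trigger = TRIGGERS.get(gesture)
    if trigger is None:
        return False                 # unrecognised gesture: ignore it
    show_visual(trigger["visual"])   # always update the projection
    if trigger["audio"] is not None:
        play_audio(trigger["audio"])  # layer the gesture's sound over the base audio
    return True
```

Keeping the mapping in one table should make the second-week fine-tuning easier, since swapping a sound or visual for a gesture only touches one line.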

In the third and final week, we will set everything up at our location and resolve any issues that arise with our interface on site.



As of now, we intend to situate our project in the narrow level-2 corridor near the handicapped lift in ADM, with the projection facing away from the window.

We aim to set up the corridor in a manner similar to the video above, draping translucent cloth from the ceiling so that the projection lands on the cloth and also passes through it onto the back wall.

So while participants engage with the projection through our interface, we can also observe how they interact with the cloth setting.
