My reference for this emulation is “The V Motion Project”, a collaboration between many creative people working together to create a machine that turns motion into music. The client for the project, Frucor (makers of the V energy drink), together with their agency Colenso BBDO, kitted out a warehouse space for the project to grow in and gathered a group of talented people from a number of creative fields.
The interface plays a key role in illustrating the idea of the instrument, while the main highlight is the dancer controlling the audio. Design elements such as real-time tracking and samples being drawn on screen as they are played all add to the authenticity of the performance.
Live visuals were designed and performed, and a music video was produced from the results. It was clear that the technology was real and was actually being played live; since the visuals were all generated in real time, the music video is essentially a true document of the night!
For my emulation project, I will illustrate parts of the interaction involved in the artists’ work. The emulation uses the Kinect together with Synapse, which reads skeleton data from the Kinect and sends it on to Max/MSP as OSC messages.
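As a rough sketch of that pipeline, Synapse broadcasts joint positions as OSC messages on addresses such as /righthand, with x, y, z arguments, and a receiver only has to fold each message into a skeleton state. The address names and value types below are illustrative assumptions, not a full Synapse spec:

```python
def update_skeleton(state, address, args):
    """Fold one Synapse-style OSC message into a skeleton dict.

    Synapse reports joint positions on addresses like '/righthand',
    with (x, y, z) arguments. The exact address set and coordinate
    ranges here are assumptions for illustration.
    """
    joint = address.lstrip("/")
    if len(args) == 3:  # ignore malformed or non-position messages
        state[joint] = tuple(float(v) for v in args)
    return state
```

In the actual patch, Max/MSP’s UDP/OSC receiving objects play this role and route each joint address to the relevant part of the patch.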
FYI: tap/push your right hand forward to toggle play/stop; rotating your right hand clockwise plays the soundtrack loop forward, counter-clockwise plays the segments backward; and holding/staying in one position triggers a beat-repeat effect.
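One way to classify those rotation and hold gestures from a short trail of right-hand positions is to sum the cross products of successive displacement vectors: a positive total means counter-clockwise motion (in y-up coordinates), a negative total clockwise, and a near-zero total means the hand is holding still. This is my own minimal sketch, not the detection logic from the original project:

```python
def rotation_direction(points, eps=1e-6):
    """Classify a 2-D hand trail as 'cw', 'ccw', or 'hold'.

    Sums the z-component of the cross product of successive
    displacement vectors; positive = counter-clockwise (y-up coords).
    """
    total = 0.0
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        ax, ay = x1 - x0, y1 - y0
        bx, by = x2 - x1, y2 - y1
        total += ax * by - ay * bx  # signed turning amount
    if abs(total) < eps:
        return "hold"  # barely moving: trigger the beat-repeat effect
    return "ccw" if total > 0 else "cw"
```

A real patch would run this over a sliding window of recent hand positions, with the threshold tuned against jitter in the Kinect data.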
Using jit.hue, the right-hand position retrieved from Synapse is mapped to a hue angle, changing the colour of the music waveform for the current position/cursor. The currently playing slice and guides showing the tempo are also overlaid on the Kinect silhouette. To add to the visuals, the audio output is averaged and the resulting signal drives an OpenGL Jitter visualisation ☺
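The two mappings in that paragraph can be sketched in a few lines: a hand x-position is scaled into a 0–360° hue angle of the kind jit.hue rotates by, and a window of audio samples is averaged to an RMS level that can scale the OpenGL visuals. The tracked position range here is an assumption for illustration:

```python
def hand_to_hue(x, x_min=-0.6, x_max=0.6):
    """Map a hand x-position (assumed tracked range) to a hue angle in degrees."""
    t = (x - x_min) / (x_max - x_min)
    t = max(0.0, min(1.0, t))  # clamp to the assumed tracked range
    return t * 360.0           # full sweep of the hue circle

def rms_level(samples):
    """Root-mean-square of an audio window, used to scale the visuals."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5
```

In the patch itself these mappings are scale/average objects feeding jit.hue and the OpenGL render chain; the Python above is only to show the arithmetic.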
Wendy Ng (IEM/4)