Final Project Milestone

Project Basis: Live visuals + sound + movement

Interactive II

Team members: Valerie, Gladys, Siew Hua

 

Our project focuses on creating sound and visuals synced to body (mainly hand) movements. Using a Kinect as the input, Synapse detects and tracks the body's joints and passes the movement data to Max and Ableton Live to create dubstep tunes. As for the visuals, projected particles trail the gestures of the users.
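
To make the data flow concrete, here is a minimal sketch (in Python, using python-osc) of how the joint positions that Synapse broadcasts over OSC could be read. In the installation itself we would receive these messages inside Max (e.g. with udpreceive); the ports, address names and the keep-alive polling below are based on Synapse's usual defaults and should be treated as assumptions to verify against our setup.

# Minimal sketch: listen for the hand-joint positions Synapse sends over OSC.
# Ports and addresses are Synapse's assumed defaults - adjust if yours differ.
import threading, time
from pythonosc import dispatcher, osc_server, udp_client

LISTEN = ("127.0.0.1", 12345)   # port Synapse streams joint data to (assumed)
SYNAPSE = ("127.0.0.1", 12346)  # port Synapse listens on for requests (assumed)

def on_hand(address, x, y, z):
    # x, y, z: body-relative position of the tracked hand.
    print(f"{address}: {x:.0f} {y:.0f} {z:.0f}")

disp = dispatcher.Dispatcher()
disp.map("/righthand_pos_body", on_hand)
disp.map("/lefthand_pos_body", on_hand)

def keep_alive():
    # Synapse only streams a joint while this request keeps arriving
    # (roughly every 2-3 seconds); 1 = body-relative coordinates.
    client = udp_client.SimpleUDPClient(*SYNAPSE)
    while True:
        client.send_message("/righthand_trackjointpos", 1)
        client.send_message("/lefthand_trackjointpos", 1)
        time.sleep(2)

threading.Thread(target=keep_alive, daemon=True).start()
osc_server.ThreadingOSCUDPServer(LISTEN, disp).serve_forever()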

 

Our interest lies in exploring movement and the visuals it produces as an outcome. We would also like to explore the musical side of Max/MSP. These interests motivate us to create an interactive and aesthetic final work that engages the audience in an overall immersive experience.

 

End Goal

Interactive Installation

  • Interactivity: Live music creation based on hand movements, visuals reacting to movement
  • Create abstract artwork and music
  • Goal: Explore the relationship between movement, music-making and artwork through space and spontaneity

 

21 March

Synapse with Kinect; linking Max with Kinect (detection of joints)

Explore particle system in Max

 

28 March

Create sounds with Ableton
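
A rough Python sketch of one possible movement-to-sound mapping is below: hand height is quantized to a pentatonic scale and sent to Ableton Live as MIDI notes through mido. In practice this mapping will most likely live in Max and reach Live over a (virtual) MIDI port; the scale, coordinate range and port choice here are placeholder assumptions, not the final design.

# Sketch only: map hand height to pitches and send them to Live as MIDI.
import time
import mido  # needs a MIDI backend such as python-rtmidi

SCALE = [0, 3, 5, 7, 10]   # minor pentatonic intervals (placeholder choice)
ROOT = 48                  # starting pitch, an arbitrary low C

def hand_y_to_note(y, y_min=-400.0, y_max=400.0):
    # Clamp and normalise the hand height, then quantize to three octaves
    # of the scale. The y-range is a guess at Synapse's body coordinates.
    t = min(max((y - y_min) / (y_max - y_min), 0.0), 1.0)
    step = int(t * (len(SCALE) * 3 - 1))
    return ROOT + 12 * (step // len(SCALE)) + SCALE[step % len(SCALE)]

out = mido.open_output()   # pick Live's input port via mido.get_output_names()
for y in (-350, -100, 50, 200, 380):   # stand-in values for live joint data
    note = hand_y_to_note(y)
    out.send(mido.Message('note_on', note=note, velocity=100))
    time.sleep(0.2)
    out.send(mido.Message('note_off', note=note))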

Link particle system to Kinect – rough physics
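
For the "rough physics", the intended behaviour is sketched below in plain Python so the update rule is explicit: every particle accelerates toward the tracked hand and its velocity is damped, so the swarm trails the gesture. The real version will be built with Jitter in Max; the particle count and constants are placeholders.

# Rough-physics sketch: particles are attracted to the hand position.
import random

class Particle:
    def __init__(self):
        self.x, self.y = random.uniform(-1, 1), random.uniform(-1, 1)
        self.vx = self.vy = 0.0

    def step(self, hand_x, hand_y, attract=0.02, damping=0.95):
        # Accelerate toward the hand, damp, then integrate the position.
        self.vx = (self.vx + attract * (hand_x - self.x)) * damping
        self.vy = (self.vy + attract * (hand_y - self.y)) * damping
        self.x += self.vx
        self.y += self.vy

particles = [Particle() for _ in range(200)]
hand = (0.5, -0.2)            # stand-in for a normalised hand position
for _ in range(60):           # one second of updates at 60 fps
    for p in particles:
        p.step(*hand)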

Sync sound and visuals

 

4 April

Toggle visuals for the Max-Kinect patch; refine them further

Linking of the sound and visuals for the installation

 

11 April

Setting up of the installation

Finalization stage and refining of artwork

Testing of interactivity, final checks

 

18 April

Submission

Note on gen~: to change a value from outside the patch, make a new parameter – declare it with param (e.g. a new xy parameter).