Extracting coordinates from Synapse and linking them to Max MSP
Changing from mouse control to Kinect coordinates
Particles initially tracked only the X and Y axes of the window, but were debugged to follow joint coordinates and move freely around the screen
Fine-tuning particles to follow joints more smoothly
All joints initially mapped to the left hand, leaving the particles connected to only one joint
Converting jit.gl.mesh into a matrix
jit.xfade initially displayed patches in separate windows, but we managed to sync the individual particle systems into one window
Failed to interpolate Synapse's Kinect depth mode for use as a background mask layer, but created an alternative background visualiser instead
Audio-visual reactivity – connecting Ableton to Max
For Max MSP, we managed to overcome several challenges that popped up from time to time and obstructed our progress. The first blocker was being unable to connect the particle coordinates to the Kinect movement instead of the ‘prepend mouse’ object; we solved that by removing an argument from the object. Another major milestone was projecting every joint’s particles into one window, where initially each would open in a separate window. We are currently fine-tuning the aesthetics and the triggers for the audio.
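The coordinate work described above can be sketched outside the patch. The snippet below is a hedged stand-in for what the Max patch does when it scales Synapse joint data into window pixels and smooths the jitter (similar in spirit to Max's [slide] object). The input range of -600 to 600 and the 1280-pixel window width are assumptions for illustration, not values from our patch:

```python
def scale_joint(value, in_lo=-600.0, in_hi=600.0, out_lo=0.0, out_hi=1280.0):
    """Linearly map a Synapse joint coordinate into window pixels.

    The input range (-600..600) is an assumed working range for hand
    positions; a real patch would calibrate this per setup.
    """
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))  # clamp so the particle stays on screen
    return out_lo + t * (out_hi - out_lo)

def smooth(current, target, k=0.2):
    """One-pole smoothing: each frame moves a fraction k toward the target,
    which tames jittery joint data much like Max's [slide] object."""
    return current + k * (target - current)
```

Called once per frame per joint, `smooth` is what makes the particles trail the hand rather than snap to it.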
Getting base tracks to work with command dials
Improved audio tracks
Fine tuning movements and audio react
For Ableton Live, learning a new piece of music software in such a short period was a significant challenge. The project also requires music-making and DJ skills, since cutting tracks and beat matching are key to making it work. Genre also played a major part, affecting the entire project through its varying tempo, sounds, and methods of triggering layers (instrumental backing track, melody, beats, etc.). We overcame our first hurdle of finding fitting tracks and music cuts; we are now working on seamlessness and matching suitable sounds to each action triggered by the player.
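Triggering the audio layers from movement can be sketched as simple zoning of the hand's normalized height: divide the 0..1 range into bands, one per layer. The layer names and band count below are illustrative, not taken from our Live set, and the MIDI CC mapping is a generic 0-127 scaling of the kind a Live macro dial accepts:

```python
LAYERS = ["backing", "melody", "beats"]  # illustrative layer names

def layer_for_height(y_norm, layers=LAYERS):
    """Pick which audio layer a normalized hand height (0..1) triggers."""
    y = max(0.0, min(1.0, y_norm))
    index = min(len(layers) - 1, int(y * len(layers)))
    return layers[index]

def to_midi_cc(y_norm):
    """Map the same height onto a 0-127 MIDI CC value for a Live dial."""
    return round(max(0.0, min(1.0, y_norm)) * 127)
```

Clamping before zoning keeps an over-reaching hand pinned to the top or bottom layer instead of raising an index error.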
Producing suitable background tracks and music cuts
Our project focuses on creating sound and visuals synced to body (mainly hand) movements. Using a Kinect as the input, body movements are detected by Synapse and communicated to Ableton Live to create dubstep tunes. As for the visuals, particles are projected trailing the gestures of the users.
Our interest lies in exploring movement and its effect on visuals as an outcome. We would also like to explore the music aspect of Max MSP. These interests motivate us to create an interactive, aesthetic final work that engages users in an overall immersive experience.
Interactivity: Live music creation based on hand movements, visuals reacting to movement
Create abstract artwork and music
Goal: Explore the relation of movement, creating music and artwork through space and spontaneity
Synapse with Kinect, linking MAX with Kinect (detection of joints)
Explore particle system in Max
Create sounds with Ableton
Link particle system to Kinect – rough physics
Sync sound and visuals
Toggle visuals for MAX-Kinect, more refined
Linking of the sound and visuals for the installation
Setting up of the installation
Finalization stage and refining of artwork
Testing of interactivity, proof-checking
Does a change in the gen object require a new parameter? If so, use param to create a new xy parameter.