Extracting coordinates from Synapse and linking them to Max MSP
Changing from mouse control to Kinect coordinates
Particles initially tracked only the X and Y axes of the window, but were debugged to follow joint coordinates and move freely around the screen
Fine-tuning particles to follow joints more smoothly
All joints were initially mapped to the left hand, leaving the particles connected to only one joint
Converting jit.gl.mesh into a matrix
jit.xfade initially displayed the patches in separate windows, but we managed to sync the individual particles into one window
Failed to interpolate Synapse's Kinect depth mode as a background mask layer, but created an alternative background visualiser instead
Audio-visual reactivity: connecting Ableton to Max
For Max MSP, we managed to overcome several challenges that popped up from time to time and obstructed our progress. The first blocker was being unable to connect the particle coordinates to the Kinect movement instead of 'prepend mouse'. We managed to solve that by removing an argument in the code. Another major milestone was projecting all the joints' particles into one window, where initially each would open in a separate window. Right now, we are fine-tuning the aesthetics and the triggers for the audio.
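To illustrate the switch away from 'prepend mouse', here is a minimal Python sketch of routing Synapse-style OSC joint messages (addresses such as `/righthand_pos_body` follow Synapse's naming; the routing function itself is our own illustration, not the actual Max patch):

```python
# Sketch: route Synapse-style OSC joint messages into a dict of joint
# positions that a particle emitter could read, instead of mouse coordinates.
# The function and dict layout are illustrative assumptions.

def route_joint(address, args, joints):
    """Store (x, y, z) for the joint named in an OSC address."""
    # e.g. "/lefthand_pos_body" -> joint "lefthand", space "body"
    parts = address.strip("/").split("_")
    if len(parts) == 3 and parts[1] == "pos" and len(args) == 3:
        joint, space = parts[0], parts[2]
        joints.setdefault(joint, {})[space] = tuple(float(v) for v in args)
        return True
    return False  # not a joint-position message; ignore it

joints = {}
route_joint("/righthand_pos_body", [120.0, -40.0, 300.0], joints)
route_joint("/head_pos_body", [0.0, 250.0, 280.0], joints)
```

Keeping positions per joint (rather than collapsing everything onto one joint) is exactly what fixed the "all particles stuck to the left hand" problem described above.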
Ableton Live
Getting the base tracks to work with command dials
Improving the audio tracks
Fine-tuning movements and audio reactivity
For Ableton Live, it was a considerable challenge to experiment with new music software in such a short time period. Essentially, this project also requires music-making and DJ skills, since cutting tracks and beat matching are key to making it work. Genre also played a major part, affecting the entire project with its varying tempo, sounds, and methods of triggering its layers (instrumental backing track, melody, beats, etc.). We overcame our first hurdle of finding fitting tracks and music cuts. Right now we are working on seamlessness and matching the suitable sounds to each action triggered by the player.
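One common way to trigger Ableton layers from Max is to MIDI-map each clip to a note. The sketch below is purely illustrative: the action names and note numbers are placeholders, not our actual mapping.

```python
# Hypothetical mapping from detected player actions to MIDI notes that
# Ableton Live clips could be MIDI-mapped to. Names and notes are
# placeholder assumptions for illustration.

ACTION_NOTES = {
    "raise_left_hand": 60,   # C3 -> e.g. melody layer clip
    "raise_right_hand": 62,  # D3 -> e.g. beats layer clip
    "head_nod": 64,          # E3 -> e.g. instrumental backing track clip
}

def action_to_note(action):
    """Return the MIDI note mapped to an action, or None if unmapped."""
    return ACTION_NOTES.get(action)
```

Keeping this mapping in one table makes it easy to re-map layers when the genre (and therefore the layer structure) changes.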
Main Challenges
Producing suitable background tracks and music cuts
Users move their heads from left to right to control the rate of twerking
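The head-to-rate mapping above amounts to a linear range mapping, like Max's [scale] object. A small sketch, where the input/output ranges are assumptions:

```python
# Sketch of mapping the head's X coordinate to a playback rate, similar to
# Max's [scale] object. The ranges (640-pixel-wide window, 0.5x-2x rate)
# are illustrative assumptions.

def scale(value, in_lo, in_hi, out_lo, out_hi, clamp=True):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi]."""
    t = (value - in_lo) / (in_hi - in_lo)
    if clamp:
        t = max(0.0, min(1.0, t))  # keep the rate inside the output range
    return out_lo + t * (out_hi - out_lo)

# Head at screen centre (x = 320 of 640) -> a mid-range rate
rate = scale(320, 0, 640, 0.5, 2.0)
```

Clamping matters here: if the tracker briefly reports a head position outside the window, the rate stays within its intended bounds instead of jumping wildly.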
Glitches
I wonder if anyone else also faced this issue, but the face-tracker video glitches harder under certain lighting conditions.
Previously, after tweaking the scale values, the video ran smoothly. But after setting up my laptop in a different environment, the twerking video's frames look like they are staggering.
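One common way to reduce frame-to-frame staggering from a noisy tracker is to smooth the incoming values before using them, comparable to Max's [slide] object. This is a general technique sketched here, not necessarily what was done in the project; the smoothing factor is something to tune by eye.

```python
# Sketch of exponential (one-pole low-pass) smoothing to tame jittery
# tracker readings. The factor 0.2 is an illustrative assumption: lower
# values smooth more but respond more slowly.

def smooth(previous, current, factor=0.2):
    """Move a fraction of the way from the previous value toward the new one."""
    return previous + factor * (current - previous)

x = 100.0
for raw in [100.0, 180.0, 95.0, 175.0]:  # jittery tracker readings
    x = smooth(x, raw)
# x now lags the raw jumps, so the visuals move gradually instead of staggering
```

The trade-off is latency: heavier smoothing makes the visuals calmer but also makes them respond later to genuine movement.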
Improvements
It is better to use a video with linear movement. I'll definitely keep that in mind and experiment more with the mechanics to minimise glitches in our final project.