Milestone #1 – Sound & Motion Control (Mark & Chris)

Our first step is to sync Leap Motion's gestures with Ableton Live. One approach was through Max for Live input, which lets the user trigger individual tracks and control the volume.
That patch is connected to a second patcher via udpsend and udpreceive over the same port, which selects the tracks and clips to be played in Live. The velocity output of notein is then routed into a random number generator that powers the fractal noise, as shown below, thus generating the particle visualization.
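The UDP relay between the two patches can be sketched outside of Max. The example below is a minimal Python sketch, not the actual patch: the port number is hypothetical, plain sockets stand in for Max's udpsend/udpreceive, and Python's random module stands in for the patch's random number generator seeded by note velocity.

```python
import random
import socket

PORT = 7400  # hypothetical port; the two Max patches share one UDP port

def send_velocity(velocity, host="127.0.0.1", port=PORT):
    """Stand-in for [udpsend]: transmit a note velocity to the visual patch."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(str(velocity).encode(), (host, port))

def velocity_to_noise(velocity, n=4):
    """Stand-in for the receiving patch: the velocity (0-127) drives
    a random number generator, like the one powering the fractal noise."""
    rng = random.Random(velocity)            # velocity seeds the randomness
    return [rng.random() for _ in range(n)]  # noise values in [0, 1)
```

The same velocity always yields the same noise sequence here; the real patch keeps generating fresh values as notes stream in.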

To combine Leap Motion and Ableton Live, we went through a program called GECO MIDI, which maps hand gestures onto directional controls that are fed through Ableton's key mapping. These keys can control virtually anything in Live.
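The core of that mapping is scaling a continuous hand coordinate into the 0-127 range MIDI expects. This is a hedged sketch of the idea, not GECO's actual code; the palm-height range in millimetres is an assumed calibration.

```python
def hand_to_cc(y, y_min=100.0, y_max=400.0):
    """Map a palm height in mm (hypothetical calibrated range) to a
    7-bit MIDI CC value (0-127), the kind of signal a tool like GECO
    hands to Ableton's mapping."""
    y = max(y_min, min(y_max, y))              # clamp into the calibrated range
    return round((y - y_min) / (y_max - y_min) * 127)
```

Hands below or above the calibrated range simply pin the control to its minimum or maximum, which keeps the mapped parameter from jumping when tracking glitches.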

Interactive Hologram (Mark and Chris)

Given the recent trend toward VR and AR, Mark and I wanted to achieve interaction with a hologram.

Leap Motion – hand gestures
Max MSP – 3D interaction and sound
Vvvv – 3D graphics
Mac monitor & acrylic sheet – hologram

How are we going to achieve this?

Leap Motion transforms hand gestures into coordinates, which are then fed into Max MSP. Those coordinates influence a 3D model/object's rotation, scale, and movement. Sound input will also be coordinated and handled in Max. Last but not least, we plan to explore Vvvv for 3D graphics generation to incorporate into the hologram.