Monthly Archives: April 2017

Interactive II Final submission (18th Apr)

This is the final result of weeks' worth of work. We've titled this work Psychedelia.

Notes: Users interact by moving colors in front of the camera, as you can see in the setup. The color affects the type of music that is played, and the music affects the live camera footage. All the audio you hear in the documentation video was captured live.
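For anyone curious how the loop hangs together, here is a minimal Python/OpenCV sketch of the idea: measure the average amount of each colour in the frame, derive audio parameters from it, and feed something back into the visuals. The installation itself was built in Max/MSP and Jitter, so the function names and parameter ranges below are purely illustrative.

```python
# Illustrative only: the installation runs in Max/MSP + Jitter; this sketch
# just mirrors the colour -> music -> visuals loop. choose_music/distort and
# their ranges are stand-ins, not the actual patch logic.
import cv2

def choose_music(mean_bgr):
    """Derive audio parameters from the average amount of each colour."""
    blue, green, red = mean_bgr
    return {"pitch_shift": red / 255.0,      # red drives the pitch shift
            "note_mix": green / 255.0,       # green drives saw/cycle notes
            "drum_loop": int(blue // 86)}    # blue picks one of three loops

def distort(frame, params):
    """Feed the music back into the visuals as a simple rotation."""
    h, w = frame.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2, h / 2), params["pitch_shift"] * 45, 1.0)
    return cv2.warpAffine(frame, m, (w, h))

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    params = choose_music(frame.reshape(-1, 3).mean(axis=0))
    cv2.imshow("Psychedelia sketch", distort(frame, params))
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```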

 

 

Additional links:

21st Mar   28th Mar   31st Mar   4th Apr   11th Apr   14th Apr

Interactive II Progress (14th Apr)

Position and scale

 

Rotation

Secondary webcam add-on

We improved on the previous visuals by introducing rotation and scaling to the images. We've also added a secondary camera so that users can see how they are affecting the image. Finally, we edited the audio in Premiere so that the loops are shorter yet still seamless. The following clips are the final audio.
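As a rough illustration of what the Jitter patch is doing here, a sketch like the following (Python/OpenCV, with the camera indices and the angle/scale drivers as assumptions) rotates and scales the main feed and insets the secondary webcam so people can see themselves:

```python
# A minimal sketch of the rotation/scale treatment plus the secondary
# "mirror" camera. The real patch does this in Jitter; OpenCV, the camera
# indices (0, 1) and the angle/scale drivers here are assumptions.
import cv2

main_cam = cv2.VideoCapture(0)
second_cam = cv2.VideoCapture(1)

angle = 0.0
while True:
    ok_main, frame = main_cam.read()
    ok_second, mirror = second_cam.read()
    if not ok_main:
        break

    # Rotate and scale the processed image around its centre.
    # Placeholder drivers below; the patch ties these to the audio instead.
    h, w = frame.shape[:2]
    scale = 1.0 + 0.2 * ((angle % 90) / 90)
    m = cv2.getRotationMatrix2D((w / 2, h / 2), angle, scale)
    out = cv2.warpAffine(frame, m, (w, h))
    angle = (angle + 1.0) % 360

    # Inset the secondary camera so users can see how they affect the image.
    if ok_second:
        out[0:h // 4, 0:w // 4] = cv2.resize(mirror, (w // 4, h // 4))

    cv2.imshow("output", out)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

main_cam.release()
second_cam.release()
cv2.destroyAllWindows()
```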

Additional links:

21st Mar   28th Mar   31st Mar   4th Apr   11th Apr   14th Apr

Final submission

Interactive II Progress (11th Apr)

RGB values based on position and area

RGB separation results

Amount of red changes the pitch shift

Amount of green changes the saw and cycle notes

Amount of blue changes the type of drum loop being played

After getting some feedback about our work, we decided to change a few things. First of all, we decided to use the amount of color on the screen, rather than the user's face, to change the ambient music. The reason for this is that cv.jit.faces works only intermittently and requires the user to maintain full frontal eye contact with the camera. Instead, we decided to use cv.jit.blob.centroids.
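As a rough Python/OpenCV stand-in for what cv.jit.blob.centroids gives the patch, the sketch below measures how much of each colour is in frame and where its centre of mass sits; the dominance threshold of 40 is just an assumption.

```python
# Rough equivalent of the colour-amount and centroid data the patch reads;
# the real detection happens in cv.jit inside Max. Threshold is an assumption.
import cv2
import numpy as np

def colour_amount_and_centroid(frame_bgr, channel):
    """Return (fraction of pixels dominated by `channel`, centroid or None)."""
    arr = frame_bgr.astype(np.int16)
    idx = {"blue": 0, "green": 1, "red": 2}[channel]
    target = arr[:, :, idx]
    others = np.delete(arr, idx, axis=2).max(axis=2)
    mask = (target - others > 40).astype(np.uint8)

    amount = float(mask.mean())                    # 0..1 share of the frame
    m = cv2.moments(mask, binaryImage=True)        # centre of mass of the mask
    centroid = None
    if m["m00"] > 0:
        centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return amount, centroid

# e.g. amount_red, red_centre = colour_amount_and_centroid(frame, "red")
```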

We've also decided to change the individual lengths of the music tracks and make them more distinct, so that users are aware that they're affecting the installation.

The amount of green changes the saw (a low-pitched ambient sound) and the higher-pitched cycle notes; the amount of red changes the pitch shift, and its lateral position plays a different soundtrack. Finally, the lateral position of the blue region changes the type of drum loop that is played.
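In code terms, the mapping above boils down to something like the sketch below. The ranges and the assumption of three soundtracks and three drum loops are ours for illustration; the real values live inside the Max patch.

```python
# Sketch of the colour-to-sound mapping described above; ranges and the
# three-track / three-loop assumption are illustrative only.
def map_colours_to_sound(red_amt, red_x, green_amt, blue_x, frame_width):
    """*_amt values are 0..1 colour amounts; *_x are centroid x positions in pixels."""
    pick = lambda x: min(2, int(3 * x / frame_width))   # 0, 1 or 2
    return {
        # green drives the saw (low ambient) and the higher-pitched cycle notes
        "saw_level": green_amt,
        "cycle_level": green_amt,
        # red drives the pitch shift; its lateral position picks a soundtrack
        "pitch_shift_semitones": -12 + 24 * red_amt,
        "soundtrack_index": pick(red_x),
        # the blue region's lateral position picks a drum loop
        "drum_loop_index": pick(blue_x),
    }

# e.g. map_colours_to_sound(0.4, 320, 0.7, 500, frame_width=640)
```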

 

This was the result of the following patch.

Additional links:

21st Mar   28th Mar   31st Mar   4th Apr   11th Apr   14th Apr

Final submission

Interactive II Progress (4th Apr)

Overall patch at work

position patch

 

face proximity

We've decided to add cv.jit.faces into the patch to detect the position and scale of the user's face in relation to the screen, which affects the type of music that will be played. We've also improved the patch so that the overall contrast and saturation of the displayed image increase depending on how close the person is to the screen, rather than being driven by a randomiser.
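For reference, a Python/OpenCV approximation of that behaviour, with a Haar cascade standing in for cv.jit.faces and the gain ranges assumed, could look like this:

```python
# Stand-in for the cv.jit.faces part of the patch: detect a face, then boost
# contrast and saturation in proportion to how large (close) it is.
# The Haar cascade and the gain ranges are assumptions.
import cv2
import numpy as np

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def enhance_by_proximity(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, 1.1, 5)
    if len(faces) == 0:
        return frame_bgr

    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # largest face
    frame_area = frame_bgr.shape[0] * frame_bgr.shape[1]
    closeness = min(1.0, (w * h) / (frame_area * 0.25))  # 1.0 when very close

    # Contrast via a gain on the BGR image, saturation via the HSV S channel.
    out = cv2.convertScaleAbs(frame_bgr, alpha=1.0 + closeness, beta=0)
    hsv = cv2.cvtColor(out, cv2.COLOR_BGR2HSV).astype(np.float32)
    hsv[:, :, 1] = np.clip(hsv[:, :, 1] * (1.0 + closeness), 0, 255)
    return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
```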

Also, the lateral position of the user's face changes the type of music that is played. I took some slightly altered SYNTHI 100 music and edited the clip so that it loops seamlessly.
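The loop itself was cut by hand in Premiere; purely as a programmatic sketch of the same idea, a short crossfade of the clip's tail into its head (using numpy and soundfile, with a 0.5 s fade assumed) makes the loop point inaudible:

```python
# Not how the clip was actually made (that was done in Premiere) -- just a
# sketch of the crossfade trick that makes a clip loop seamlessly.
import numpy as np
import soundfile as sf

def make_seamless_loop(in_path, out_path, fade_seconds=0.5):
    audio, sr = sf.read(in_path)          # shape: (samples,) or (samples, ch)
    n = int(fade_seconds * sr)
    fade_in = np.linspace(0.0, 1.0, n)
    fade_out = 1.0 - fade_in
    if audio.ndim == 2:                   # broadcast the fades over channels
        fade_in = fade_in[:, None]
        fade_out = fade_out[:, None]

    head, body, tail = audio[:n], audio[n:-n], audio[-n:]
    # Mix the tail into the head, then drop the tail: the end of the file
    # now flows straight back into its crossfaded beginning when looped.
    looped = np.concatenate([head * fade_in + tail * fade_out, body])
    sf.write(out_path, looped, sr)

# e.g. make_seamless_loop("synthi100_clip.wav", "synthi100_loop.wav")
```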

 

 

QR code playlist

We also tried displaying a set of high-contrast QR codes, since they appear as grids and mesh well with the grid rendering. The high-contrast colors also meant that the shapes were highly responsive to the distortions.

Additional links:

21st Mar   28th Mar   31st Mar   4th Apr   11th Apr   14th Apr

Final submission