This is the final result of a week’s worth of work. We’ve titled the piece Psychedelia.
Notes: Users interact by moving colors in front of the camera, as you can see in the setup. The color affects the type of music that is played, and the music affects the live camera footage. All the audio you hear in the documentation video was captured live.
We improved on the previous visuals by introducing rotation and scaling to the images. We also added a secondary camera so the user can see how they are affecting the installation, and we edited the audio in Premiere so that the loops are shorter but still seamless. The following clips are the final audio.
Amount of blue changes the type of drum loop being played
After getting some feedback on our work, we decided to change a few things. First of all, we decided to use the amount of color on the screen, rather than the user’s face, to change the ambient music. The reason was that cv.jit.faces works intermittently and required the user to maintain full frontal eye contact. Instead, we decided to use cv.jit.blob.centroids.
We also decided to change the individual lengths of the music tracks and make them distinct, so that users are aware they’re affecting the installation.
The amount of green would change the saw (a low-pitched ambient sound) and the higher-pitched cycle notes; the amount of red would change the pitch, and its lateral position would play a different soundtrack. Finally, the lateral position of the blue region would change the type of drum loop played.
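The actual mappings live in our Max/MSP patch, but the logic behind them is simple. A minimal Python sketch of the idea, with illustrative function names, value ranges, and loop counts of our own choosing, might look like this:

```python
# Sketch of the colour-to-sound mapping described above. The real patch
# does this in Max/MSP with cv.jit objects; everything here is illustrative.

def channel_amounts(pixels):
    """pixels: list of (r, g, b) tuples, each channel 0..255.
    Returns the mean amount of each channel, normalised to 0..1."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / (255 * n)
    g = sum(p[1] for p in pixels) / (255 * n)
    b = sum(p[2] for p in pixels) / (255 * n)
    return r, g, b

def choose_drum_loop(blue_centroid_x, frame_width, n_loops=4):
    """Lateral position of the blue region selects one of n_loops drum loops."""
    idx = int(blue_centroid_x / frame_width * n_loops)
    return min(idx, n_loops - 1)  # clamp the right edge of the frame

def saw_level(green_amount):
    """Amount of green (0..1) scales the level of the low-pitched saw."""
    return max(0.0, min(1.0, green_amount))
```

For example, a blue blob centred at x = 310 in a 640-pixel-wide frame lands in the second of four loop slots.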
We decided to add cv.jit.faces to the patch to detect the position and scale of the user’s face relative to the screen and use that to affect the type of music played. We improved the patch so that the overall contrast and saturation of the displayed image increase the closer the person is to the screen, rather than being driven by a randomiser.
The lateral position of the user’s face also changes the type of music that is played. I took some slightly altered SYNTHI 100 music and edited the clip so that it loops seamlessly.
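The face-driven mapping works the same way: a wider detected face means a closer viewer, so the image gets a stronger boost; the horizontal face position indexes into a list of tracks. A Python sketch with our own illustrative helper names and ranges:

```python
# Illustrative sketch of the face-driven mapping. The patch itself uses
# cv.jit.faces in Max/MSP; these helpers only mirror the logic.

def proximity_boost(face_width, frame_width, base=1.0, max_boost=1.0):
    """The wider the detected face relative to the frame, the closer the
    person, and the more contrast/saturation is boosted."""
    closeness = max(0.0, min(1.0, face_width / frame_width))
    return base + max_boost * closeness

def pick_track(face_center_x, frame_width, tracks):
    """Lateral face position selects which music clip loops."""
    idx = int(face_center_x / frame_width * len(tracks))
    return tracks[min(idx, len(tracks) - 1)]
```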
QR code playlist
We also tried displaying a set of highly contrasting QR codes, since they appear as grids and mesh well with the grid rendering. The high-contrast colors also meant that the shapes were highly responsive to distortions.
We found this YouTube tutorial on how to turn images into a mesh and affect it with audio input. It takes the brightest points and extrudes them outward toward the viewer. See here.
Mesh
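The core of the brightness-to-extrusion idea is just a per-pixel mapping from brightness to a z offset. In the patch this happens on the GPU (we assume the tutorial uses something like jit.gl.mesh); this pure-Python version only illustrates the mapping:

```python
# Rough sketch of brightness-driven extrusion: brighter pixels get a
# larger z offset, pushing that part of the mesh toward the viewer.

def extrude(gray, depth=1.0):
    """gray: 2D list of brightness values in 0..255.
    Returns a 2D list of z offsets in 0..depth."""
    return [[depth * v / 255.0 for v in row] for row in gray]
```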
We altered this by taking the image from the webcam instead of using still images as in the YouTube tutorial. For the webcam image, we got it to randomise brightness, contrast, and saturation every 5 seconds.
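In Max this is a metro firing every 5 seconds into random values for the image's brightness/contrast/saturation attributes (jit.brcosa-style). A sketch of the per-tick randomisation, with value ranges that are our own illustrative guesses:

```python
import random

# Sketch of the 5-second randomiser: each tick picks a fresh
# brightness/contrast/saturation triple. Ranges are illustrative.

def random_bcs(rng=random):
    return {
        "brightness": rng.uniform(0.5, 1.5),
        "contrast": rng.uniform(0.5, 2.0),
        "saturation": rng.uniform(0.0, 2.0),
    }
```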
We also had to deal with the positioning and orientation of the webcam input, because we wanted it to mirror the viewer, so we had to make the mesh orient the same way as the matrix we get from the webcam. A matrix can be reoriented with the jit.dimmap object, but for the mesh we had to play around with its position values in the jit.world object.
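The mirroring itself is just a horizontal flip of the matrix, which is what jit.dimmap does for us in the patch. In plain Python the same operation looks like this:

```python
# Mirror the webcam matrix horizontally so the image behaves like a
# mirror for the viewer (the equivalent of flipping one dim in jit.dimmap).

def mirror_horizontal(matrix):
    """Reverse each row of a 2D matrix (list of lists)."""
    return [list(reversed(row)) for row in matrix]
```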
We also got it to display lines and grids based on the peak amplitude and pitch of the sounds being generated.
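The amplitude-to-grid mapping is again a simple scaling; how exactly the patch quantises it is our own choice here, so treat this as an illustrative sketch rather than the patch's actual numbers:

```python
# Sketch of the audio-reactive grid: louder peaks produce denser grids.

def grid_lines(peak_amplitude, max_lines=32):
    """peak_amplitude in 0..1 (clamped); returns a line count 0..max_lines."""
    amp = max(0.0, min(1.0, peak_amplitude))
    return int(round(amp * max_lines))
```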
Randomizer for saturation, brightness and contrast