Recent Posts
Interactive II Progress (11th Apr)
After getting some feedback about our work, we’ve decided to change a few things. First, we’ve decided to use the amount of color on the screen, rather than the user’s face, to change the ambient music. We made this change because cv.jit.faces works intermittently and requires the user to keep their face fully frontal to the camera. Instead, Read more →
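The patch itself lives in Max/Jitter, but the "amount of color on the screen" idea can be sketched outside Max. A minimal Python/NumPy version (the function name and the 0.25 chroma threshold are our own choices, not taken from the patch) might measure the fraction of noticeably colorful pixels in a frame:

```python
import numpy as np

def colour_amount(frame_rgb):
    """Return the fraction of pixels that are noticeably colorful.

    A pixel counts as 'colorful' when the spread between its largest
    and smallest channel value is large relative to full scale - a
    rough stand-in for saturation that avoids a full HSV conversion.
    """
    frame = frame_rgb.astype(np.float32) / 255.0
    spread = frame.max(axis=-1) - frame.min(axis=-1)  # per-pixel chroma
    return float((spread > 0.25).mean())

# A grey frame has no color; a pure red frame is fully colorful.
grey = np.full((4, 4, 3), 128, dtype=np.uint8)
red = np.zeros((4, 4, 3), dtype=np.uint8)
red[..., 0] = 255
```

The returned fraction (0.0 to 1.0) could then be mapped to whatever parameter of the ambient music the patch exposes.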
[Interactive 2] Process - Milestones (4 Apr)
“Cattitude” by Putri Dina and Hannah Kwah
Milestones Project Proposal | 28th Mar | 4th Apr | 11th Apr | 12th Apr | 15th Apr | 16th Apr | 17th Apr | Final Documentation
Our group was Read more →
Interactive 2: Documentation and Progress VI
MEMORIES OF SOUND
Bao Song Yu & Zhou Yang
The following are screenshots of the patch we built for our interactive installation.
The following is the video documentation of us trying out the Max patch. We used light sticks and several colored materials to test out the patch.
https://www.youtube.com/watch?v=x7FPp4hPTIQ
The Read more →
Interactive II Progress (4th Apr)
We’ve decided to add cv.jit.faces to the patch to detect the user’s face position and scale relative to the screen, which will affect the type of music played. We’ve also improved the patch so that the overall contrast and saturation of the displayed image increase the closer the person is to the screen, rather Read more →
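A rough sketch of the face-scale-to-contrast idea above, outside Max (the linear mapping and the gain range are assumptions of ours; the actual patch uses cv.jit.faces in Max):

```python
def proximity_gain(face_area, frame_area, lo=1.0, hi=2.0):
    """Map detected face size (a proxy for distance) to an image gain.

    The closer the person, the larger the face bounding box relative
    to the frame, and the stronger the contrast/saturation boost.
    The ratio is clamped so spurious detections stay within bounds.
    """
    ratio = max(0.0, min(1.0, face_area / frame_area))
    return lo + (hi - lo) * ratio
```

A face filling the whole frame yields the maximum boost; no face (area 0) leaves the image untouched.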
Interactive 2: Documentation and Progress V
MEMORIES OF SOUND
Bao Song Yu & Zhou Yang
For our final project, we decided to shift away from the idea of creating an interactive installation depicting the experience of walking through a cave in a dark room. We feel that the cave experience is clichéd and does not add meaning to our project of using Read more →
Interactive II Progress (31st Mar)
We found this YouTube tutorial on how to turn images into a mesh and affect it with audio input. It takes the brightest points of the image and extrudes them outward, toward the viewer. see here
We altered this by taking the image from the webcam instead of using still images as in the YouTube tutorial. For the webcam image, we got it to randomise brightness, contrast and Read more →
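The randomised brightness/contrast step could look something like this outside Max. A hedged Python/NumPy sketch (the parameter ranges are illustrative, not taken from the patch):

```python
import numpy as np

def jitter_frame(frame, rng):
    """Apply a random brightness offset and contrast scale to a frame.

    Loosely mirrors what the Max patch does per frame: pick a random
    contrast in [0.5, 1.5] and a brightness offset in [-40, 40],
    apply both, then clip back to valid 8-bit pixel values.
    """
    contrast = rng.uniform(0.5, 1.5)
    brightness = rng.uniform(-40, 40)
    out = frame.astype(np.float32) * contrast + brightness
    return np.clip(out, 0, 255).astype(np.uint8)

rng = np.random.default_rng(0)
noisy = jitter_frame(np.full((2, 2, 3), 100, dtype=np.uint8), rng)
```

Each incoming webcam frame would be passed through this before being fed to the mesh.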
Interactive 2: Documentation and Progress IV
Since our planned interactive installation uses colored objects and body movements to trigger audio outputs as its form of interaction, we tested out some patches to explore the possibilities.
Triggering audio outputs by recognizing the position of a held object (a yellow object) on the screen. Different audio Read more →
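Outside Max, the position-of-a-yellow-object idea might be sketched like this in Python/NumPy (the RGB thresholds and the three-zone split are hypothetical choices of ours, not values from the patch):

```python
import numpy as np

def yellow_centroid(frame_rgb):
    """Return the (x, y) centroid of yellow-ish pixels, or None."""
    r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
    mask = (r > 150) & (g > 150) & (b < 100)  # crude yellow threshold
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def zone(x, width, n_zones=3):
    """Map a horizontal position to one of n_zones screen regions,
    so each region can trigger a different audio output."""
    return min(int(x / width * n_zones), n_zones - 1)
```

The zone index would then select which sound to play, much like the patch maps object position to different audio outputs.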
Interactive II Progress (28th Mar)
Winzaw and I looked at some tutorials we found on YouTube and tried to mash a few of them together.
The patch records the user’s voice for 8 seconds and plays it back twice over in a robotic tone; this audio playback then drives the particles to generate visual feedback. After that, the process repeats automatically. We had to create a randomizer to randomize Read more →
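One common trick for a robotic, pitched-up playback is reading the recorded buffer back faster than it was recorded. This Python/NumPy sketch shows only that resampling trick; it is our assumption about how such an effect might be done, not the patch's actual method:

```python
import numpy as np

def robotise(samples, factor=2.0):
    """Crude pitch-up: resample the buffer by reading it faster.

    Reading every `factor`-th sample raises the pitch and shortens
    the clip - e.g. factor=2.0 plays an octave up at half length.
    """
    idx = np.arange(0, len(samples), factor).astype(int)
    return samples[idx]
```

A real patch would typically loop this output (here, twice) while an envelope follower on the playback drives the particle system.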
[Interactive 2] Milestone - 28 Mar
“Cattitude” by Putri Dina and Hannah Kwah
Milestones Project Proposal | 28th Mar | 4th Apr | 11th Apr | 12th Apr | 15th Apr | 16th Apr | 17th Apr | Final Documentation
Particle Videos
Resources
EditorX: http://infusionsystems.com/catalog/product_info.php/products_id/403
Touch v1.5: http://infusionsystems.com/catalog/product_info.php/products_id/135
Max Read more →
Interactive 2: Documentation and Progress III
After much discussion and refinement, we have decided to create a cave experience through the use of audio in a dark room setting. A confined dark room is a perfect metaphor for a cave. Each participant will be given light sticks to place on both ankles. These light sources serve as trackers for the cameras put in Read more →
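Tracking light sticks in a dark room essentially reduces to finding the brightest pixels in each camera frame. A naive Python/NumPy sketch (the threshold value is illustrative, and separating the two ankle blobs is skipped here for brevity):

```python
import numpy as np

def bright_centroid(gray_frame, threshold=200):
    """Return the (x, y) centroid of pixels at or above `threshold`.

    In a dark room the light sticks are by far the brightest pixels,
    so a plain threshold suffices; a real tracker would also need to
    split the mask into one blob per ankle.
    """
    ys, xs = np.nonzero(gray_frame >= threshold)
    if xs.size == 0:
        return None  # no light stick visible in this frame
    return float(xs.mean()), float(ys.mean())
```

The per-frame centroid would then feed whatever audio parameters the installation maps movement onto.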