Emulation Pair Work Project: Pearlynn & Esmond
Our reference video, “The Piano Stairs – TheFunTheory.com”
This fun, interactive staircase mimics a piano and plays the corresponding note when people climb the stairs. Each note corresponds to a single step.
We took the idea of people walking to create music and mapped the major scale (Do, Re, Mi, Fa, …) onto 8 sections. When the Read more →
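The step-to-note mapping described above can be sketched outside of Max; here is a minimal Python illustration (the function name and the choice of middle C as the root are our own assumptions, not part of the original patch):

```python
# Hypothetical sketch, not the actual patch: mapping the 8 stair
# sections to the MIDI notes of one octave of a major scale.

MAJOR_SCALE_STEPS = [0, 2, 4, 5, 7, 9, 11, 12]  # semitone offsets: Do..Do

def step_to_midi(step, root=60):
    """Return the MIDI note for stair section `step` (0-7); root 60 = middle C."""
    return root + MAJOR_SCALE_STEPS[step]

notes = [step_to_midi(s) for s in range(8)]  # → [60, 62, 64, 65, 67, 69, 71, 72]
```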
By: Faye and Peng Cheng
Mouth and mustache patches: jit.lcd draws the image onscreen. We removed the toggle and metro from the jit.lcd, since it does not clear the image after each bang is sent (which caused multiple images to be overlaid).
Overlaying images on top of the carrot: jit.op @min for the Mustache and Read more →
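For readers unfamiliar with jit.op @min: it takes the per-pixel minimum of two images, so darker strokes (like a dark mustache on a white background) survive over a lighter image. A pure-Python sketch of that idea, using made-up 2×2 grayscale values rather than real Jitter matrices:

```python
def op_min(image_a, image_b):
    # Per-pixel minimum, the same idea as Max's jit.op @op min:
    # the darker pixel wins, so a dark mustache stroke on a white
    # background shows up over the lighter carrot image.
    return [[min(a, b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(image_a, image_b)]

carrot   = [[200, 200], [200, 200]]   # light image
mustache = [[255,  10], [255, 255]]   # white background, one dark stroke
composited = op_min(carrot, mustache)  # → [[200, 10], [200, 200]]
```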
The reference for our emulation is Maurice Benayoun’s “Tunnels around the World” (2013).
We liked and wanted to achieve the zooming effect of the images as a transition into another scene. For our emulation, we used videos instead of images, and face detection instead of motion tracking as the trigger. We also added sound Read more →
Student Project Team:
- Jonathan Ming Chun Yu (IEM/4)
- Li Yihan (IEM/4)
This is our Student Emulation Project, which allows users to play a pre-recorded sound bite/piece of music when the “cap” is removed from the bottle. We applied the motion-sensing concept we learnt in class to detect the caps. Hope you enjoy it!
Description Of Emulation:
By assigning different musical melodies Read more →
Here’s our emulation attempt at letting people draw on their screens – temporarily. The patch is included in the last part of the video.
Our inspiration came from :
The video documents an Interactive Art Installation by students studying Medialogy in Aalborg University. In this course, students study the technology behind design in film, games and animation. They used a color tracking patch to create a Read more →
The reference of the emulation project is Graffiti Research Lab.
Initially, we used findbounds to track the colour of the laser light, but it did not work well for laser light, although it could be used for objects.
Tracking: jit.blobs.centroid. The threshold value is the blob size used to track the laser pointer. If the value is bigger than a certain number, it Read more →
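The size-threshold idea can be illustrated in plain Python; this is only a sketch of filtering blobs by size and returning the survivors’ centroids, not the actual jit.blobs.centroid patch (the blob tuples and the `min_size` value are assumptions):

```python
def track_laser(blobs, min_size=12):
    """Keep only blobs of at least `min_size` pixels, echoing the
    size threshold on jit.blobs.centroid; return their centroids."""
    return [(x, y) for (x, y, size) in blobs if size >= min_size]

# Each blob is (x, y, size); only the laser-sized blob survives.
blobs = [(10, 20, 4), (120, 80, 30)]
track_laser(blobs)  # → [(120, 80)]
```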
Greetings to the good people of Interactive 1,
Our reference for our Emulation was the classic iPod commercial in which different people dance while listening to their iPods through their earpieces. It is a very simple and effective commercial for Apple, showing only their silhouettes and the iPod.
For our emulation project, we aim to achieve the same effect and to Read more →
We are in zen mode for our emulation. See it, feel it, listen to it and relax.
Our emulation allows users to play the different bell sounds we selected just by waving their hands while standing/sitting in front of the screen. It is easily expandable to more sounds, but for the purpose of this project we kept it to 7.
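One simple way to map a waving hand to one of the 7 bells is to split the camera frame into 7 vertical zones; a hypothetical Python sketch (the frame width and function name are our assumptions, not details of the actual patch):

```python
def bell_for_hand(x, width=640, n_bells=7):
    """Map a hand's horizontal pixel position to one of `n_bells`
    bell indices by dividing the frame into equal vertical zones."""
    zone = int(x / width * n_bells)
    return min(zone, n_bells - 1)  # clamp the right-hand edge

bell_for_hand(0)    # → 0 (leftmost bell)
bell_for_hand(320)  # → 3 (middle bell)
bell_for_hand(639)  # → 6 (rightmost bell)
```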
This will be our emulation.
Phua Wei Jun, Kwok Ming Sheng, Lester Leong
1. Red Strawberry: Chinese Bamboo Flute
Characteristics of the instrument: high-pitched, lucid and vivid, with vivacious, leaping rhythms.
2. Orange Carrot: drum? A bongo maybe?
Anthem of Carrot Day:
3. Yellow Banana: reminds me of the Caribbean sea, but something a bit heavier, best described by the tone of a bass trumpet; as an instrument, the trumpet is warm and rich, breathy and burnished.
4. Green Read more →
This is our second concept. Our goal is to create bubbles/particles and let them float around within the screen. We will add a sound effect: when we move the bubbles, the tone of the sound will change. Reference:
Chou Yi Ting, Chen Danning, Kamarul, Josephine
Proposed emulation project that involves Kinect, MAX/MSP and Synapse (a programme that gets the input data from Kinect and sends it to MAX/MSP).
My goal is to create an interactive electronic instrument window that turns motion into music… something like this (but smaller scale) http://youtu.be/YERtJ-5wlhM
Preferred presentation on 11th Nov, solo.
Jonathan and I will be emulating the Bobblogue 2000 from Tangible Media Group
Dates preferred for presentation: 11th Nov.
Group 4. Wei Jun, Ming Sheng & Lester.
A display of 5 x 5 screens with a camera placed near the center.
Using face/eye tracking, it detects the direction of the viewer’s face/eyes, and the screen in that direction
will play a recording of the viewer.
At certain intervals, face/eye tracking will detect again, and another screen will play the same recording, but Read more →
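The direction-to-screen mapping could work roughly like this hypothetical Python sketch (the yaw/pitch inputs, the field of view, and the function name are all assumptions, not part of the proposal):

```python
def screen_for_gaze(yaw, pitch, grid=5, fov=60.0):
    """Map a viewer's head yaw/pitch in degrees (0 = facing the
    centre camera) to a (col, row) index in the 5 x 5 screen grid."""
    def axis(angle):
        half = fov / 2
        t = (max(-half, min(half, angle)) + half) / fov  # normalise to 0..1
        return min(int(t * grid), grid - 1)              # clamp the edge
    return axis(yaw), axis(pitch)

screen_for_gaze(0, 0)     # → (2, 2), the centre screen
screen_for_gaze(-30, 30)  # → (0, 4), a corner screen
```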
Group 1 : E.C.V.P
Esmond, Cindy, Vivian, Pearlynn
Our group will be coming up with an interactive projection that transforms the staircase into a giant fish tank.
We will be making use of motion tracking, so that when a person moves through, the fish follow.
But when the person stops, the fish swim away.
This will encourage people to use the stairs more instead of the lift, Read more →
Our concept is based on the movement of the viewer. Using face detection, the viewer, when stationary, will see his/her face covered with fuzziness/dark glitches. To get rid of the fuzziness and see his face clearly, he must move around so that the “fuzziness” doesn’t consume him. This idea is based on the game ‘Tilt To Read more →
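The stillness-to-fuzziness relationship might be sketched as a simple function of motion magnitude; a hypothetical illustration only (the motion measure and the threshold value are assumptions, not part of the concept as written):

```python
def glitch_amount(motion, threshold=0.15):
    """The stiller the viewer, the heavier the 'fuzziness':
    glitch strength fades from 1 to 0 as motion rises to the threshold."""
    return max(0.0, 1.0 - motion / threshold)

glitch_amount(0.0)   # → 1.0 (stationary: fully consumed by the fuzz)
glitch_amount(0.15)  # → 0.0 (moving enough: face seen clearly)
```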
Jonathan, Yihan, Wendy
Presentation Date: 18th November 2014
Translating the intensity level of the game into music/visuals allows us to make the game itself more exciting.
We will be expanding on the game of Twister by incorporating a 4×6 grid of spots for the players to place their hands and feet on. There will be 4 different colours Read more →
Yi Ting, Kamarulzaman, Josephine, Danning
The main concept of our group project is ‘face blur’.
Faces are duplicated at short intervals, about one face every 1/5 second, and each new face is overlapped on the previous ones. The body is motionless, but the head needs to turn. So when Read more →
G2: Suhui, Peng Cheng, Faye & Jazlyn
We will be mixing dubstep/sounds with the Mavericks version of the face tracker.
Upon recognising a person’s face within the sensor range, the computer will begin to play a steady, unchanging beat. This encourages people to look at it. As their facial expression changes, Max generates various beats or sounds mapped Read more →