This is an experimental project that tricks people into sensing a ghost through sight and sound when they interact with the “air”. They can only see the ghost on a screen showing the feed from a top-down camera; they cannot see the “real” ghost right beside them. This makes them sceptical of both the environment and the feedback.

The patch is programmed to detect motion regions: when a person steps into a specific region, a corresponding piece of feedback is triggered, e.g. a ghost whisper, a box moving, a mannequin moving, or the ghost appearing in a different position depending on where you stand. All of this feedback is driven by where the person is standing.

Standing position affects where the ghost appears because, when a specific region is triggered, dumpframe selects the corresponding part of the video and shows that particular moving image.
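As a rough illustration of this region-trigger logic outside Max (the project itself is a Max/Jitter patch), here is a minimal Python sketch using frame differencing; the region names and coordinates are invented for the example:

```python
import numpy as np

# Hypothetical regions: (name, x, y, width, height) in camera pixels.
REGIONS = [("whisper", 0, 0, 40, 40), ("box", 60, 0, 40, 40)]

def triggered_regions(prev_frame, frame, threshold=10.0):
    """Return names of regions whose mean frame difference exceeds threshold."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    hits = []
    for name, x, y, w, h in REGIONS:
        if diff[y:y + h, x:x + w].mean() > threshold:
            hits.append(name)
    return hits

# A person "steps into" the whisper region: pixels there change a lot.
prev = np.zeros((100, 100), dtype=np.uint8)
cur = prev.copy()
cur[10:30, 10:30] = 200
print(triggered_regions(prev, cur))  # -> ['whisper']
```

Each triggered region name would then select which pre-recorded clip to play.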

Alphablend: a pre-recorded video is overlaid on another video that is being recorded live. The advantage is that the colour can be adjusted to exactly the extent I want.
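The idea behind the blend can be sketched in a few lines of Python (a stand-in for the Jitter object, with the blend weight as an assumed parameter):

```python
import numpy as np

def alpha_blend(live, recorded, alpha=0.5):
    """Blend a live frame with a pre-recorded frame; alpha weights the recording."""
    mixed = (1.0 - alpha) * live.astype(np.float32) + alpha * recorded.astype(np.float32)
    return mixed.clip(0, 255).astype(np.uint8)

live = np.full((2, 2, 3), 200, dtype=np.uint8)   # bright live frame
ghost = np.full((2, 2, 3), 40, dtype=np.uint8)   # dark pre-recorded "ghost" frame
print(alpha_blend(live, ghost, alpha=0.25)[0, 0])  # -> [160 160 160]
```

Raising alpha makes the pre-recorded ghost footage more visible over the live image, which is where the colour adjustment comes from.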

In this project, a Phidget and a top-down camera are also used to drive the ghost’s feedback.

Done by Chou Yi Ting

Interactive II – Phidgets – 1. Interactive book 2. A moving skeleton hand

I did two parts in this exercise: 1. an interactive book and 2. a moving skeleton hand.

1. Interactive book

The idea is an interactive book that can open and close by itself when you flip your hand in the air. A bend sensor controls the feedback: you wave your hand at the device as if you were flipping through the book, but in the air. A motor is installed on the spine of the book.

2. A moving skeleton hand

I used region motion to trigger the skeleton hand to move. It can be installed in a room so that when people walk past, the hand moves by itself and scares them.


In this project we extract hip-hop elements, such as hip-hop music, scratching, b-boying, and gestures, and use them as instruments. The gestures are programmed to produce specific sounds and clips. We recorded ourselves beatboxing, along with some a cappella, and then put the recordings into the project.


Stepping on and off the touch sensor produces the main beat of the song.



Adjusting the hat, the way a breakdancer adjusts their hat during a performance, produces a sound.


A slicing gesture makes a scratching sound, imitating how hip-hop breakdancers move.


Pulling the zip up and down again imitates a hip-hop dancer; the zip’s slider changes the pitch of the sound.
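One plausible way to map a slider to pitch, sketched in Python (the rate range and the exponential mapping are assumptions for illustration, not the actual patch):

```python
import numpy as np

def slider_to_rate(position, low=0.5, high=2.0):
    """Map a normalised slider position (0..1) to a playback rate.
    1.0 is the original pitch; 2.0 is an octave up, 0.5 an octave down."""
    return low * (high / low) ** position

def repitch(samples, rate):
    """Read the sample buffer at `rate` times normal speed (pitch shifts with speed)."""
    idx = np.arange(0, len(samples), rate)
    return np.interp(idx, np.arange(len(samples)), samples)

tone = np.sin(2 * np.pi * 440 * np.arange(4410) / 44100)  # 0.1 s of A4 at 44.1 kHz
fast = repitch(tone, slider_to_rate(1.0))  # slider fully up: one octave higher
print(slider_to_rate(0.5))  # -> 1.0 (slider centred keeps the original pitch)
print(len(fast))            # -> 2205 (half the samples: higher pitch, shorter clip)
```

The exponential mapping keeps equal slider distances sounding like equal musical intervals.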

At the end of the day, we tried to make the project more interesting and personal by adding our own voices.

Semester Project – Jingle Balls

Kamarul, Chou Yi Ting, Josephine Cheah, Chen Danning


Knowing our semester was going to end near the festive season, we wanted to create a project with a festive concept, making it colourful and musical.

When we began working on our project, we wanted to create an interactive environment. We wanted to immerse a person into a world of movable particles that would move based on the person’s movements. We looked at various ways that we could do this such as box2D and jit.phys.

After doing some research, we came across the jit.phys patches and were inspired by Cycling ’74’s Physics Patch-a-Day. We watched their videos and wanted to create an interactive project set in a 3D physics world.


Initially, we studied their patch and wanted to create musical interactions with the audience by integrating region tracking into it. The region tracking would trigger a bang, sending the balls up into the air and creating a sound.

While we were working on our project, we also explored other methods of interaction and came across a patch that let us use the pictslider to pick up an individual ball and move it around. From here, we decided to increase immersion by letting a person move around the world as a ball themselves, interacting with the other balls.

We took this patch and linked it to collision events, with different collisions producing different sounds. We experimented with different sounds and colours, eventually producing our final product. When a ball touches the sides or hits another ball, it produces a sound similar to Christmas chimes.
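The collision-to-sound mapping can be sketched as a simple lookup (the note numbers and collision categories here are invented for illustration; the patch itself routes collision messages to sound players):

```python
# Hypothetical mapping of collision types to chime notes (MIDI note numbers).
WALL_NOTE = 84             # high chime when a ball hits the walls
BALL_NOTES = [72, 76, 79]  # notes cycled through for ball-to-ball hits

def collision_note(kind, ball_index=0):
    """Pick a chime note for a collision event."""
    if kind == "wall":
        return WALL_NOTE
    return BALL_NOTES[ball_index % len(BALL_NOTES)]

print(collision_note("wall"))     # -> 84
print(collision_note("ball", 4))  # -> 76
```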


Looking at the different colours, we felt that the balls looked like Christmas ornaments. Combined with the production of sound, we named our project Jingle Balls, after the song Jingle Bells.


Emulation – Laser Graffiti

The emulation project is based on the work of the Graffiti Research Lab.

Initially, we used findbounds to track the colour of the laser light, but it did not work well for a laser; it is better suited to tracking objects.

Tracking: jit.blobs.centroid. The threshold value is the blob size used to track the laser pointer’s dot. If the threshold is set above a certain value, the dot of light cannot be tracked; blobs with a smaller area than the threshold value are rejected.
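The blob-size threshold can be sketched in Python as follows (a stand-in for the Jitter object; the area limits and frame sizes are invented for the example):

```python
import numpy as np

def laser_centroid(mask, min_area=3, max_area=200):
    """Return the centroid (x, y) of the bright blob, or None if its area
    falls outside the accepted range (mirrors a blob-size threshold)."""
    ys, xs = np.nonzero(mask)
    area = len(xs)
    if not (min_area <= area <= max_area):
        return None  # too small (noise) or too large (not the laser dot)
    return float(xs.mean()), float(ys.mean())

frame = np.zeros((120, 160), dtype=np.uint8)
frame[50:54, 80:84] = 255           # a 4x4-pixel laser dot
print(laser_centroid(frame > 128))  # -> (81.5, 51.5)
```

The returned centroid is what gets drawn at the projector to leave the graffiti trail.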



Calibration: we propped up the computer’s camera and used jit.rota to zoom in by increasing the x and y zoom values.


Projection: we used paintoval, adjusting the stroke size and colour. The output is sent to the Mac’s mirrored screen, so we could control the patch from the main screen while watching the output played through the projector.




Most importantly, jit.blobs.centroid is very sensitive but also the most accurate, so the environment must be completely dark. The dimensions of the projection must match the dimensions of the image the camera receives.


By Danning and Yi Ting