INTERACTIVE II: SEMESTER PROJECT – Phantom

This is an experimental project that tricks people into sensing a ghost through what they see and hear when they interact with the “air”. They can only see the ghost on the screen, projected from the top-down camera; they can’t see the “real” ghost right beside them. This makes them sceptical of both the environment and the feedback.

https://www.youtube.com/watch?v=y7PTzvySZ4M

The patch is programmed to detect motion regions: when a person steps into a specific region, feedback such as a ghost whisper, a moving box, a moving mannequin, or the ghost appearing in a different position is triggered, all depending on where the person stands.
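The original patch was built in a visual programming environment, so the following is only a rough illustration of the same idea in Python with OpenCV: the top-down camera frame is split into named floor regions, and simple frame differencing decides which region a visitor is standing in. The region coordinates, motion threshold, and trigger names here are all invented for the example.

```python
import cv2

# Hypothetical floor regions in camera coordinates: (x, y, w, h) -> trigger name
REGIONS = {
    (0, 0, 160, 240):   "ghost_whisper",
    (160, 0, 160, 240): "box_move",
    (320, 0, 160, 240): "mannequin_move",
}
MOTION_THRESHOLD = 500  # changed pixels needed to count as "someone is standing here"

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)               # frame difference = motion
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    for (x, y, w, h), trigger in REGIONS.items():
        if cv2.countNonZero(mask[y:y + h, x:x + w]) > MOTION_THRESHOLD:
            print("trigger:", trigger)           # the real patch fires sound/servo/video here
    prev = gray
    cv2.imshow("motion", mask)
    if cv2.waitKey(30) == 27:                    # Esc to quit
        break
cap.release()
```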

Standing position affects where the ghost appears because, when a specific region is triggered, dumpframe selects the corresponding part of the video and shows that specific moving image.
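One way to picture this is as a lookup from the active region to a frame range inside the pre-recorded ghost footage. A minimal sketch, again with OpenCV; the file name and frame ranges are placeholders, not the ones used in the piece.

```python
import cv2

# Hypothetical mapping: active region -> (start_frame, end_frame) in the ghost video
CLIPS = {
    "ghost_whisper":  (0, 120),
    "box_move":       (120, 300),
    "mannequin_move": (300, 480),
}

def play_clip(video_path, region):
    """Jump to the segment of the pre-recorded video that belongs to this region."""
    start, end = CLIPS[region]
    cap = cv2.VideoCapture(video_path)
    cap.set(cv2.CAP_PROP_POS_FRAMES, start)   # seek to the segment's first frame
    frame_no = start
    while frame_no < end:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("ghost", frame)
        if cv2.waitKey(33) == 27:             # roughly 30 fps playback, Esc to stop
            break
        frame_no += 1
    cap.release()

play_clip("ghost.mov", "box_move")            # "ghost.mov" is a placeholder file name
```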

Alphablend: the pre-recorded video is overlaid on another video that is being recorded live. The good thing is that the colour can be adjusted as much as I want.
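Conceptually the blend is a weighted sum of the live camera frame and the pre-recorded frame, with a per-channel tint applied before mixing. A small OpenCV sketch of that idea; the file name, blend weight, and tint values are placeholders rather than the settings used in the actual piece.

```python
import cv2
import numpy as np

live = cv2.VideoCapture(0)             # camera recording the room right now
ghost = cv2.VideoCapture("ghost.mov")  # pre-recorded ghost footage (placeholder name)

ALPHA = 0.6                            # weight of the live image in the mix
TINT = np.array([1.1, 0.9, 0.9])       # per-channel (B, G, R) colour adjustment

while True:
    ok1, cam = live.read()
    ok2, pre = ghost.read()
    if not (ok1 and ok2):
        break
    pre = cv2.resize(pre, (cam.shape[1], cam.shape[0]))
    pre = np.clip(pre * TINT, 0, 255).astype(np.uint8)       # shift the ghost's colour
    mixed = cv2.addWeighted(cam, ALPHA, pre, 1 - ALPHA, 0)   # overlay the two videos
    cv2.imshow("phantom", mixed)
    if cv2.waitKey(30) == 27:
        break
live.release()
ghost.release()
```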

https://www.youtube.com/watch?v=cwGZaqwpCGs

In this project, a Phidget and a top-down camera are also used to drive the ghost’s feedback.

Done by Chou Yi Ting

Interactive II – Phidgets – 1. Interactive book 2. A moving skeleton hand

I did 2 parts in this exercise: 1. Interactive book  2. A moving skeleton hand

https://www.youtube.com/watch?v=Bi7luu70PYM

1. Interactive book

The idea is an interactive book that opens and closes by itself when you flip your hand in the air. A bend sensor controls the feedback: you wave your hand at the device as if you were flipping the book, but in mid-air. The motor is installed on the spine of the book.
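The exercise itself was patched together with a Phidget interface; as an approximation of the same behaviour, here is a minimal sketch using the Phidget22 Python library, assuming a flex sensor on analog channel 0 and an RC servo driving the spine. The channel number, threshold, and servo angles are assumptions for illustration only.

```python
import time
from Phidget22.Devices.VoltageRatioInput import VoltageRatioInput
from Phidget22.Devices.RCServo import RCServo

OPEN_ANGLE, CLOSED_ANGLE = 150.0, 30.0   # assumed servo positions for open/closed cover
FLEX_THRESHOLD = 0.5                      # assumed sensor ratio that counts as a "wave"

servo = RCServo()
servo.openWaitForAttachment(5000)
servo.setEngaged(True)

def on_flex(sensor, ratio):
    # When the bend sensor is waved past the threshold, flip the book cover.
    servo.setTargetPosition(OPEN_ANGLE if ratio > FLEX_THRESHOLD else CLOSED_ANGLE)

flex = VoltageRatioInput()
flex.setChannel(0)                        # assumed analog input channel
flex.setOnVoltageRatioChangeHandler(on_flex)
flex.openWaitForAttachment(5000)

try:
    while True:
        time.sleep(0.1)                   # keep the script alive; the callback does the work
finally:
    flex.close()
    servo.close()
```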

2. A moving skeleton hand

I used region motion to trigger the skeleton hand to shift. It can be installed in a room so that when people walk past, the hand moves by itself and scares them.
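This piece glues the two earlier ideas together: a motion check on the camera image decides when someone walks past, and a Phidget-driven servo twitches the hand. A self-contained sketch of that glue, with made-up thresholds and servo angles.

```python
import cv2
from Phidget22.Devices.RCServo import RCServo

servo = RCServo()
servo.openWaitForAttachment(5000)
servo.setEngaged(True)

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:                                   # Ctrl+C to stop
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 2000:         # someone walked past: jerk the hand
        servo.setTargetPosition(120.0)
    else:
        servo.setTargetPosition(60.0)         # rest position when the room is quiet
    prev = gray
```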

INTERACTIVE II – HIP HOP – TEABOX (with Kamarul)

In this project we extracted hip-hop elements such as the music, scratching, b-boying, and the gestures, and used them as instruments. Each gesture is programmed to produce a specific sound or clip. We recorded ourselves beatboxing and singing some a cappella, then put the recordings into the project.

https://www.youtube.com/watch?v=6jCzLQ9ZfrQ


Stepping up and down on the touch sensor produces the main beat of the song.


Adjusting the hat, the way a breakdancer adjusts their hat during a performance, produces a sound.


A slicing gesture produces the scratching sound, imitating how hip-hop break dancers move.


Pulling the zip up and down again imitates a hip-hop dancer; the zip/slider changes the pitch of the sound.
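Since the Teabox essentially streams raw sensor values, the whole mapping boils down to a few thresholds plus one continuous control. As a rough sketch of that logic in Python, the example below sends OSC messages to whatever engine is actually playing the samples; the channel layout, thresholds, OSC addresses, and the read_sensors() placeholder are all hypothetical.

```python
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)      # sound engine assumed to listen here

STEP_CH, HAT_CH, SLICE_CH, ZIP_CH = 0, 1, 2, 3   # hypothetical sensor channel layout
THRESH = 0.5

def read_sensors():
    """Placeholder: return the latest sensor values, normalised to 0..1."""
    return [0.0, 0.0, 0.0, 0.0]

prev = [0.0] * 4
while True:
    vals = read_sensors()
    if vals[STEP_CH] > THRESH >= prev[STEP_CH]:
        client.send_message("/beat", 1)          # step on the touch sensor: main beat
    if vals[HAT_CH] > THRESH >= prev[HAT_CH]:
        client.send_message("/hat", 1)           # hat-adjust gesture: one-shot sample
    if vals[SLICE_CH] > THRESH >= prev[SLICE_CH]:
        client.send_message("/scratch", 1)       # slicing gesture: scratch sample
    client.send_message("/pitch", vals[ZIP_CH])  # zip position maps to pitch
    prev = vals
    time.sleep(0.01)
```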

In the end, we tried to make it more interesting by adding our own voices, to give the project a more personal touch.