In this project, we set up a booth and invited people to have their fortune read by a virtual fortune teller.
Little did they know that we were out to scare them and capture their reactions.
Conceptually, this project originally set out to capture paparazzi-style photographs of people whenever they entered a room. The aim was to question the idea of privacy, and we wanted to post those photographs on Twitter. Fortune Taker was born after much discussion on how to make the project stronger.
We used a strobe light to light up the participants' faces and to add an element of surprise.
The iSight camera was used to capture photographs and video of their reactions.
The Teabox sensor was used to trigger the video when the participant leaned back in their chair.
Headphones helped the participants immerse themselves better.
Two separate computers and patches were needed to make this project a success.
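The lean-back trigger can be sketched as a simple threshold check with debouncing, so the video fires once per lean rather than repeatedly. This is a hypothetical illustration, not our actual patch: the 0.0–1.0 sensor range and the 0.6 threshold are assumptions.

```python
# Hypothetical sketch of the lean-back trigger, assuming the Teabox
# sensor reports a numeric pressure value between 0.0 and 1.0.
LEAN_THRESHOLD = 0.6  # assumed value above which we call it a "lean back"

def make_trigger(threshold=LEAN_THRESHOLD):
    """Return a callback that fires once per lean, with simple debouncing."""
    armed = {"value": True}

    def on_sensor_reading(value, play_video):
        if value >= threshold and armed["value"]:
            armed["value"] = False   # don't retrigger while still leaning
            play_video()
        elif value < threshold:
            armed["value"] = True    # re-arm once the participant sits up

    return on_sensor_reading

# Example: two distinct leans fire the video exactly twice.
events = []
trigger = make_trigger()
for reading in [0.1, 0.7, 0.8, 0.2, 0.9]:
    trigger(reading, lambda: events.append("play"))
# events is now ["play", "play"]
```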
Always wanted to control your friend’s webcam position?!
With Spycam, you can spy on your friend's private space, invading the parts of it he doesn't want you to see!
When two people are video chatting or Skyping, the camera view is fixed and still. It would be interesting if both parties could control each other's webcams and explore the other side.
Initially, my idea with the motor was to create an eye that follows a person's face as they move, using face tracking. Then I thought to myself: why not make it a real eye? The closest one can get to seeing the other side is through the other person's webcam. Hence, this idea was explored with the help of an iSight and some tape!
This adds another layer, as the device raises the moral question of whether one's private space is really private when webcamming over the Internet.
With the Teabox's gyroscope attached, the user's left-to-right motion is tracked and mapped to the iSight, controlling the camera's position.
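The mapping itself is just a linear rescale from the gyroscope's tilt range to the camera's pan range. A minimal sketch, assuming a tilt reading of -1.0 (full left) to 1.0 (full right) and a motor accepting pan angles of 0–180 degrees; both ranges are assumptions for illustration:

```python
# Hypothetical gyroscope-to-camera mapping (assumed ranges, not the real patch).
PAN_MIN, PAN_MAX = 0.0, 180.0  # assumed pan range of the motor holding the iSight

def tilt_to_pan(tilt):
    """Map a tilt reading (-1.0 = full left, 1.0 = full right) to a pan angle."""
    tilt = max(-1.0, min(1.0, tilt))                 # clamp noisy readings
    return PAN_MIN + (tilt + 1.0) / 2.0 * (PAN_MAX - PAN_MIN)

# tilt_to_pan(0.0) -> 90.0, i.e. the camera sits centred when the user is still.
```

Clamping first means an out-of-range spike from the sensor can never drive the motor past its limits.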
In this project we extract hip-hop elements such as hip-hop music, scratching, and b-boying, and use the gestures as instruments. Each gesture is programmed to produce a specific sound or clip. We recorded ourselves beatboxing and singing some a cappella, then put the recordings into our project.
Stepping up and down on the touch sensor produces the main beat of the song.
Adjusting the hat, the way a breakdancer does during a performance, produces a sound.
A slicing gesture makes a scratching sound, imitating how hip-hop break dancers move.
Pulling the zip up and down again imitates a hip-hop dancer; the zip/slider changes the pitch of the sound.
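The gestures above boil down to a lookup table from gesture events to clips, plus a continuous pitch control from the slider. A hypothetical sketch, assuming the gesture names, clip filenames, and a 0–1 slider range, none of which come from the actual patch:

```python
# Hypothetical gesture-to-sound mapping; names and filenames are assumptions.
GESTURE_TO_CLIP = {
    "step":  "main_beat.wav",    # touch sensor under the foot
    "hat":   "hat_adjust.wav",   # adjusting the hat
    "slice": "scratch.wav",      # slicing gesture
}

def handle_gesture(gesture, slider=0.5):
    """Return the clip to play and a pitch factor derived from the zip slider."""
    clip = GESTURE_TO_CLIP.get(gesture)        # None for unknown gestures
    pitch = 0.5 + slider                       # slider 0..1 -> pitch 0.5x..1.5x
    return clip, pitch

# handle_gesture("slice", 1.0) -> ("scratch.wav", 1.5)
```

Keeping the mapping in one table makes it easy to swap in new recordings without touching the trigger logic.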
At the end of the day, we tried to make it interesting by adding our own voices to make the project more personal.
This is the first video we made, and we will be making another soon.
Our reference for our Emulation was the classic iPod commercial in which different people dance while listening to their iPods through earphones. It is a very simple and effective commercial for Apple, showing only their silhouettes and the iPod.
For our emulation project, we aimed to achieve the same effect, and to make things more fun we decided to add another element: the music only plays while we are dancing, and it stops when we stop moving. This makes it more interactive than just a silhouette.
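One common way to implement a "music only while dancing" rule is to compare consecutive camera frames and play only when the difference is large enough. A minimal sketch, assuming grayscale frames as flat lists of pixel values and an assumed motion threshold; this is an illustration of the idea, not our actual patch:

```python
# Hypothetical motion gate: play music only when consecutive frames differ enough.
MOTION_THRESHOLD = 10.0  # assumed mean per-pixel difference that counts as dancing

def mean_abs_diff(prev_frame, frame):
    """Average absolute difference between two equal-sized grayscale frames."""
    total = sum(abs(a - b) for a, b in zip(prev_frame, frame))
    return total / len(frame)

def should_play(prev_frame, frame, threshold=MOTION_THRESHOLD):
    """True when there is enough motion between consecutive frames."""
    return mean_abs_diff(prev_frame, frame) >= threshold

# A big jump between frames (dancing) passes the gate; near-identical
# frames (standing still) silence the music.
```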
Here’s the video of our documentation. I hope you enjoy it and you guys can try playing it tomorrow!