Time

The time now is 7:39 PM. Writing that down is probably unnecessary, since the post gets timestamped automatically. Technology is advanced these days. Imagine if we were back in the old days. We think about it a lot, past and present.

How about time travelling? If I could travel back in time to my younger self, what should I have done to change this present? Would it matter much?


Semester End Project: Butterfly Effect

Video download link: https://www.dropbox.com/s/gy3qr9q6i3u6btn/The%20butterfly%20effect.mov?dl=0

The Butterfly Effect

We chose to emulate a classic example of an interactive art installation based on motion tracking.

Once a person is detected on camera, projected images of butterflies hover around the person on screen.

When the person leaves, the butterflies disappear as well.

We call this the Butterfly Effect, after the scientific theory that a single occurrence, no matter how small, can change the course of the universe forever.

How it works:

Our patch has two main sections: one built around boids2d and the other around cv.jit.centroids.

(Patch screenshots: Boids section and Centroids section)

The boids section enables us to generate boids and input parameters that modify how they follow a specific point. This point is calculated by the centroids section, which can take in either a recorded movie or a live feed from the camera.
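
Since the patch itself is graphical, here is a rough Python sketch of the attraction behaviour boids2d gives us for free: each boid steers toward a shared target point, standing in for the centroid that cv.jit.centroids reports. The class name and constants are our own illustration, not Max objects, and real boids2d also handles flocking rules we omit here.

```python
import math
import random

ATTRACT_STRENGTH = 0.05   # how strongly a boid turns toward the target (illustrative)
MAX_SPEED = 4.0           # pixels per frame (illustrative)

class Boid:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx = random.uniform(-1, 1)
        self.vy = random.uniform(-1, 1)

    def update(self, target_x, target_y):
        # Steer the velocity toward the attraction point.
        self.vx += (target_x - self.x) * ATTRACT_STRENGTH
        self.vy += (target_y - self.y) * ATTRACT_STRENGTH
        # Clamp the speed so the butterflies hover instead of overshooting.
        speed = math.hypot(self.vx, self.vy)
        if speed > MAX_SPEED:
            self.vx *= MAX_SPEED / speed
            self.vy *= MAX_SPEED / speed
        self.x += self.vx
        self.y += self.vy

# One frame of the loop: every boid moves toward the person's centroid.
boids = [Boid(random.uniform(0, 320), random.uniform(0, 240)) for _ in range(10)]
centroid = (160.0, 120.0)  # placeholder for the cv.jit.centroids output
for b in boids:
    b.update(*centroid)
```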

(Screenshot: butterflies disappear when the person leaves)

The centroids section also enables us to control when the butterflies appear. In our case, we wanted the butterflies to appear only when a person is detected. Hence, if the output of cv.jit.centroids is larger than average (meaning that a person is detected), a toggle is activated to generate the image of the butterflies.
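
In code, that gating logic is just a threshold test. A minimal Python sketch, where the threshold value is made up for illustration:

```python
PRESENCE_THRESHOLD = 500.0  # hypothetical mass above the empty-scene average

def update_toggle(centroid_mass: float) -> bool:
    """Return True (draw butterflies) only while someone is detected."""
    return centroid_mass > PRESENCE_THRESHOLD

# Example: the mass jumps when a person enters, so the toggle flips on.
for mass in [12.0, 15.0, 840.0, 910.0, 9.0]:
    print(update_toggle(mass))  # False False True True False
```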

Scaling of butterflies

Another big section of our patch handles the calculation of the size of the boids. This had to match the size of our camera/movie input and also the size of our butterfly image. Each butterfly's coordinates were taken, and the x and y values were both used to set the size of each image.
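
We do not reproduce the exact formula from the patch here, but one plausible mapping, sketched in Python, sizes each butterfly from its normalised x and y position within the input frame. All constants below are illustrative.

```python
INPUT_W, INPUT_H = 320, 240  # size of the camera/movie input
BASE_SIZE = 40               # butterfly image size in pixels (illustrative)

def butterfly_size(x: float, y: float) -> int:
    # Normalise the boid position to the input frame, then use both
    # coordinates to vary the rendered size smoothly across the frame.
    sx = x / INPUT_W
    sy = y / INPUT_H
    scale = 0.5 + 0.5 * (sx + sy) / 2.0  # between 0.5 and 1.0 across the frame
    return max(8, int(BASE_SIZE * scale))

print(butterfly_size(0, 0))      # 20, smallest at the top-left corner
print(butterfly_size(320, 240))  # 40, largest at the bottom-right corner
```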

Rotate_Patch (click image to see gif in action)

The butterfly image itself had to go through some transformations. The first was its rotation. Each individual butterfly has its own angle of rotation, calculated from its current position relative to the position of the person. Every update of each butterfly and of the person was sent into the calculation in time with the overall metronome. The angle depends on four different scenarios of positioning, so we had to create a system where the correct scenario is chosen every time.
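
The four scenarios correspond to the four quadrants of the vector from a butterfly to the person. In a textual language the case selection collapses into a single atan2 call; a small Python sketch of the same angle calculation (angles in degrees, 0 pointing right, standard maths convention):

```python
import math

def rotation_angle(bx: float, by: float, px: float, py: float) -> float:
    """Angle a butterfly at (bx, by) should face to point at the person at (px, py)."""
    return math.degrees(math.atan2(py - by, px - bx))

print(rotation_angle(0, 0, 1, 1))    # 45.0   (person up-right)
print(rotation_angle(0, 0, -1, 1))   # 135.0  (person up-left)
print(rotation_angle(0, 0, -1, -1))  # -135.0 (person down-left)
print(rotation_angle(0, 0, 1, -1))   # -45.0  (person down-right)
```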

Hue_Frame_Patch (click image to see gif in action)

We also wanted to make each butterfly a different colour. However, we only managed to make each butterfly change colour on every frame. The effect it created, however, was very similar to what we intended. This was done mainly using jit.hue, with the counter's carry count switching between 4 different hue angles. We had to limit the hue variation to reds and yellows because, when alphablended into the final image, the blues and greens ended up looking very transparent and nearly invisible.
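
A Python sketch of that colour logic: step through four fixed hue-rotation angles the way the patch uses the counter's carry count to drive jit.hue. The specific angles below are illustrative, kept in the red-yellow range.

```python
HUE_ANGLES = [0, 20, 40, 60]  # degrees of hue rotation, reds through yellows

def hue_for_carry(carry_count: int) -> int:
    """Pick a hue angle from the counter's carry count, cycling through four."""
    return HUE_ANGLES[carry_count % len(HUE_ANGLES)]

print([hue_for_carry(n) for n in range(6)])  # [0, 20, 40, 60, 0, 20]
```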

To produce the final image, we combined all the butterfly boids in a single window and used jit.alphablend to composite them with the live feed/movie input, showing the participant the butterflies following him around only while he is in the capture area.
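
Conceptually, the compositing step does something like the following numpy sketch of alpha blending; this is an illustration of the idea, not the actual jit.alphablend internals.

```python
import numpy as np

# Images are float arrays in [0, 1]: butterflies (H, W, 4), feed (H, W, 3).
def alpha_blend(butterflies_rgba: np.ndarray, feed_rgb: np.ndarray) -> np.ndarray:
    alpha = butterflies_rgba[..., 3:4]  # (H, W, 1) opacity of the butterfly layer
    fg = butterflies_rgba[..., :3]      # (H, W, 3) butterfly colours
    return alpha * fg + (1.0 - alpha) * feed_rgb

# If the alpha effectively tracks pixel brightness (our reading of the patch),
# dark blues and greens get low opacity, which is why they looked transparent.
```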


Difficulties faced:

We initially wanted to project nothing onto the wall, so that when the viewer walked across the camera's field of vision, the butterflies would appear on the wall and follow him. This follows our initial idea of projecting fishes on the floor as the viewer walked around. However, in the end, by combining the moving butterflies with the real-time camera feed, we could produce the image of the butterflies following the viewer and display it for the viewer to see.


Limitations:

The angle of vision captured by the camera was a big limitation because the area in which a person could walk was very small.

Also, when the camera captures the projection and feeds it back in to change the projection, the loop it creates progressively darkens the image and negates some of the values calculated in our patch.
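
A back-of-the-envelope illustration of why this happens: each capture-and-reproject pass multiplies brightness by an effective gain below 1 (camera exposure times projector output), so brightness decays geometrically. The gain value here is made up.

```python
g = 0.9  # illustrative per-pass gain of the camera-projector loop
for n in [1, 5, 10, 20]:
    print(n, round(g**n, 3))  # brightness after n passes: 0.9, 0.59, 0.349, 0.122
```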

The quality of the video was also limited; when we scaled it up to 1024×800 from 320×240, the lag between frames became too great and the butterflies could not track and follow the person accurately.

Done by:
Cindy Chan
Esmond Heng
Pearlynn Yong
Vivian Ng

Music Sensor

Emulation Pair Work Project
Pearlynn & Esmond

Our reference video, “The Piano Stairs – TheFunTheory.com”

This fun, interactive staircase mimics a piano, and plays the corresponding note when people climb the stairs. Each note corresponds to a single step.

We took the idea of people walking to create music and mapped the major scale (Do, Re, Mi, Fa, …) onto 8 sections. When a participant steps into one of the areas, the note assigned to that area is played. More than one person can take part at the same time to create a melody, as shown in our second video.
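
A Python sketch of the zone-to-note mapping, with MIDI note numbers for concreteness; the key of C and the camera width below are assumptions for illustration, not values from our patch.

```python
ZONE_COUNT = 8
CAMERA_WIDTH = 320  # assumed capture width, divided into 8 sections
# C major: Do Re Mi Fa Sol La Ti Do -> C4..C5 in MIDI
MAJOR_SCALE = [60, 62, 64, 65, 67, 69, 71, 72]

def note_for_position(x: float) -> int:
    """Return the MIDI note for the zone containing horizontal position x."""
    zone = min(ZONE_COUNT - 1, int(x / CAMERA_WIDTH * ZONE_COUNT))
    return MAJOR_SCALE[zone]

print(note_for_position(10))   # 60 (Do)
print(note_for_position(300))  # 72 (high Do)
```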

We could use this emulation as an installation in ADM's lobby to measure human traffic flow. For example, we can determine:

1) A person's walking direction (whether ascending or descending the scale, in piano terms)

2) The speed of their walk (through how fast the notes are played; see the sketch below)

The sound played creates a musical ambience which could bring joy to students.
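
A rough sketch of how both measurements could be derived from the stream of note events; the event data below is made up for illustration.

```python
def direction(zones: list[int]) -> str:
    """'ascending' if later zones are higher-numbered (up the scale)."""
    return "ascending" if zones[-1] > zones[0] else "descending"

def notes_per_second(timestamps: list[float]) -> float:
    """Average trigger rate, a rough proxy for walking speed."""
    span = timestamps[-1] - timestamps[0]
    return (len(timestamps) - 1) / span if span > 0 else 0.0

events = [(0.0, 0), (0.6, 1), (1.1, 2), (1.7, 3)]  # (time in s, zone index)
times, zones = zip(*events)
print(direction(list(zones)))         # ascending
print(notes_per_second(list(times)))  # ~1.76 notes per second
```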

Thank you and enjoy!