Video download link: https://www.dropbox.com/s/gy3qr9q6i3u6btn/The%20butterfly%20effect.mov?dl=0
The Butterfly Effect
We chose to emulate a classic example of an interactive art installation based on motion tracking.
Once a person is detected on camera, projected images of butterflies hover over the person on screen.
When the person leaves, the butterflies disappear as well.
We call this the Butterfly Effect, after the scientific idea that a single occurrence, no matter how small, can change the course of the universe forever.
How it works:
Our patch has two main sections: the first is built around boids2d and the other around cv.jit.centroids.
The boids section lets us generate boids and set parameters that control how they follow a specific point. This point is calculated in the centroids section, which can take either a recorded movie or a live camera feed as input.
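Outside the patch, the follow behaviour can be sketched as a point steered toward a target each frame. This Python sketch is only illustrative of what boids2d does internally; the `attract`, `damping`, and `max_speed` values are assumptions, not parameters from our patch.

```python
def step_boid(pos, vel, target, attract=0.05, damping=0.9, max_speed=2.0):
    """Move one boid a single step toward a target point."""
    # Steer the velocity toward the target by a small fraction,
    # with damping so the boid settles instead of orbiting forever.
    vx = (vel[0] + (target[0] - pos[0]) * attract) * damping
    vy = (vel[1] + (target[1] - pos[1]) * attract) * damping
    # Clamp the speed so the boid cannot overshoot wildly.
    speed = (vx * vx + vy * vy) ** 0.5
    if speed > max_speed:
        vx, vy = vx / speed * max_speed, vy / speed * max_speed
    return (pos[0] + vx, pos[1] + vy), (vx, vy)

# Run a boid from the origin toward the tracked point (100, 50).
pos, vel = (0.0, 0.0), (0.0, 0.0)
for _ in range(200):
    pos, vel = step_boid(pos, vel, target=(100.0, 50.0))
```

After enough steps the boid settles near the target, which is the behaviour we rely on when the target is the tracked person.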
The centroids section also lets us control when the butterflies appear. In our case, we wanted the butterflies to appear only when a person is detected, so when the output of cv.jit.centroids is larger than average (meaning a person is present), a toggle is activated to generate the butterfly images.
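The gate amounts to a simple threshold test. A minimal sketch, assuming cv.jit.centroids reports a blob "mass" (count of active pixels) per frame; the threshold value here is an illustrative assumption to be tuned against the background.

```python
PRESENCE_THRESHOLD = 500  # illustrative; tune against the average background mass

def butterflies_visible(centroid_mass):
    """Return True when the detected blob is big enough to be a person."""
    return centroid_mass > PRESENCE_THRESHOLD

# Example per-frame masses: background noise, then a person walks through.
frames = [12, 30, 2200, 1800, 40]
visible = [butterflies_visible(m) for m in frames]
# visible → [False, False, True, True, False]
```

The True/False stream plays the role of the toggle in our patch: butterflies are drawn only while it stays True.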
Another large section of our patch calculates the size of the boids. This had to match the size of our camera/movie input as well as the size of our butterfly image. Each butterfly's coordinates were taken, and the x and y values were both used to set the size of each image.
The butterfly image itself had to go through some transformations. The first was rotation. Each individual butterfly has its own angle of rotation, calculated from its current position relative to the position of the person. Every update of each butterfly and of the person was fed into this calculation in step with the overall metronome timing. The angle depends on four different positioning scenarios, so we had to build a system in which the correct scenario is chosen every time.
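The four scenarios correspond to the four quadrants of the butterfly's position relative to the person. In textual code the case selection collapses into a single `atan2` call, whose sign handling distinguishes exactly those quadrants; this sketch is illustrative, not a transcription of our patch.

```python
import math

def butterfly_angle(bx, by, px, py):
    """Angle (degrees) from a butterfly at (bx, by) toward the person at (px, py)."""
    # atan2 resolves all four quadrant scenarios in one call.
    return math.degrees(math.atan2(py - by, px - bx))

# One example per quadrant of the person's relative position:
butterfly_angle(0, 0, 1, 1)    # up-right    (≈ 45°)
butterfly_angle(0, 0, -1, 1)   # up-left     (≈ 135°)
butterfly_angle(0, 0, -1, -1)  # down-left   (≈ -135°)
butterfly_angle(0, 0, 1, -1)   # down-right  (≈ -45°)
```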
We also wanted to give each butterfly a different colour. However, we only managed to make each butterfly change colour on every frame; the effect this created is nevertheless very similar to what we intended. It was done mainly with jit.hue, using the counter's carry count to cycle between four different hue angles. We had to limit the hue variation to reds and yellows because, once alpha-blended into the final image, the blues and greens ended up looking very transparent and nearly invisible.
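The cycling logic reduces to indexing a small table of hue angles by the carry count. A minimal sketch; the four angle values are illustrative stand-ins for the red-to-yellow range, not the exact numbers in our patch.

```python
HUE_ANGLES = [0, 20, 40, 60]  # reds through yellows, in degrees (illustrative)

def hue_for_carry(carry_count):
    """Pick a hue angle based on how many times the counter has wrapped."""
    return HUE_ANGLES[carry_count % len(HUE_ANGLES)]

# Six consecutive carries cycle through the table and wrap around:
[hue_for_carry(n) for n in range(6)]  # [0, 20, 40, 60, 0, 20]
```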
To produce the final image, we combined all the butterfly boids in a single window and used jit.alphablend to merge it with the live feed/movie input, showing the participant how the butterflies follow him around only while he is in the capture area.
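The blend itself is the standard per-pixel alpha compositing formula, shown here for a single channel value. This is the general operation, not code from our patch; jit.alphablend applies it across the whole matrix.

```python
def alpha_blend(butterfly, camera, alpha):
    """Blend one butterfly-layer value over one camera value (alpha in 0..1)."""
    # alpha = 1 shows only the butterfly layer, alpha = 0 only the camera.
    return alpha * butterfly + (1.0 - alpha) * camera

alpha_blend(255, 40, 1.0)   # fully opaque butterfly pixel → 255.0
alpha_blend(255, 40, 0.0)   # fully transparent, camera shows through → 40.0
alpha_blend(200, 100, 0.5)  # half-and-half → 150.0
```

This is also why low-alpha hues read as transparent: their contribution to the blend is small, so the camera feed dominates.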
We initially wanted to project nothing onto the wall; when the viewer walked across the camera's field of view, the butterflies would appear on the wall and follow him. This followed our initial idea of projecting fish onto the floor as the viewer walked around. In the end, however, by combining the moving butterflies with the real-time camera feed, we could produce the image of the butterflies following the viewer and display it for the viewer to see.
The camera's angle of vision was a big limitation because the area in which a person could walk was very small.
Also, when the camera captures the projection and feeds it back in to change the projection, the resulting loop progressively darkens the image and negates some of the values calculated in our patch.
The quality of the video was also limited: when we scaled it up to 1024×800 from 320×240, the lag between frames became too great and the butterflies could no longer track and follow the person accurately.