The Underwater Flight – Final Submission

Abstract: What if birds had to fly underwater to migrate from one place to another? By 2800, rising sea levels have left less land for birds to stop over on while migrating. As a result, they have evolved to fly underwater, following the currents of the sea to reach their destinations.



See Project – In your own head

By creating a dissonance between virtual reality and physical reality, users are challenged to trust their instincts as well as the virtual environment.

Done by Ashley, Gwendolyn, and Jessie.


First sketch – Space A and B created on the same Map

Initially, we tried teleporting from one place to another within the same map. Once the user reaches point B, their movement is scaled so that they travel 5× the distance they move in real life.


Second sketch – Level #1 and #2 are separate maps and the door is a Portal

Then, we tried teleporting between different maps instead. We also tested the space with the VR headset.

Audio Experience – Seeds


Can seeds talk? What happens when they all “talk” together? Would it be chaotic, relaxing, or disruptive? We explore the use of seeds to create an audio experience that induces ASMR (autonomous sensory meridian response), a relaxing, often sedative sensation that begins on the scalp and moves down the body. Also known as a “brain massage,” it is triggered by placid sights and sounds such as whispers and crackles.

An audio experience that (hopefully) soothes and relaxes your mind :) 

  1. Please listen with earphones 
  2. Please listen with your eyes closed :)



  • (CHIA SEED) Roll around from right ear to left ear and end at the right ear (x2) 
  • Roll down right ear
  • Roll down behind the head
  • Roll down the left ear
  • Roll down the front of the head


  • Transition to (BIRDSEED)
  • Roll up to the top


  • Roll around from right ear to left ear to the right again


  • Open container


  • Drop seeds from above
  • Drop seeds from left
  • Drop seeds from right


  • Circle around the head with the seed tray


  • BOOM in your face


References: Final Performance Documentation

Done by Daryl, Yenee, and Ashley


Location: Truss Room

This is a performance inspired by ‘Butoh’ dance, which is frequently regarded as surreal and androgynous and focuses on primal expressions of the human condition rather than physical beauty. The performance involves a conductor (one of us) who controls the rhythm of the backing track and the instruments playing in the space. Three other performers each control an instrument, and one controls the light projection; in total, there are five performers. We hope to encourage our performers to move with their feelings.


Technologies used

Devices used: 3 computers, 5 phones, 3 projectors, speaker


ZigSim: obtains sensor values (gravity, acceleration, gyro, 2D touch) from the phones
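ZigSim publishes each sensor reading as an OSC message over the network. As an illustrative sketch only (the address layout below, `/ZIGSIM/<device-id>/<sensor>`, and the device names are assumptions; check ZigSim's own settings screen for the exact format), routing incoming messages by address could look like:

```python
# Sketch: route a ZigSim-style OSC message by its address.
# Assumption: addresses look like /ZIGSIM/<device-id>/<sensor>
# (e.g. /ZIGSIM/phone1/gravity) carrying float arguments.

def route_message(address, args):
    """Split an assumed ZigSim OSC address into (device_id, sensor, values)."""
    parts = address.strip("/").split("/")
    if len(parts) != 3 or parts[0] != "ZIGSIM":
        return None  # not a ZigSim message; ignore it
    _, device_id, sensor = parts
    return device_id, sensor, list(args)

# A gravity message from a hypothetical phone named "phone1":
route_message("/ZIGSIM/phone1/gravity", (0.02, -0.97, 0.12))
# → ("phone1", "gravity", [0.02, -0.97, 0.12])
```

In the actual piece this routing happens inside TouchDesigner's OSC input, not in a script; the sketch just makes the device → sensor → values structure explicit.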


TouchDesigner: renders the light projection and acts as the “middle man” between ZigSim and Ableton.

One computer takes in the values from 3 phones (each controlling one instrument) and decides which note to play based on how high or low each phone is held (gravity values). Three performers each hold one phone.

These values are sent to another computer that controls the sounds and rhythm in Ableton. The rhythm is controlled by how high or low the phone is held (gravity values); this phone is held by the conductor (Daryl).
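The two gravity mappings described above (phone height chooses the note for each instrument; the conductor's phone height sets the tempo) can be sketched as pure functions. This is an illustration, not our actual TouchDesigner patch: the gravity range of roughly −1 (held low) to 1 (held high) and the C major scale are assumptions.

```python
# Sketch: map a phone's gravity reading to a MIDI note or a tempo.
# Assumption: the relevant gravity component varies in [-1, 1]
# as the performer raises or lowers the phone.

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI notes, middle C upward

def _normalise(gy):
    """Clamp a gravity value to [-1, 1] and rescale it to [0, 1]."""
    return (max(-1.0, min(1.0, gy)) + 1.0) / 2.0

def gravity_to_note(gy, scale=C_MAJOR):
    """Pick a note from the scale: low phone -> low note, high phone -> high note."""
    index = min(int(_normalise(gy) * len(scale)), len(scale) - 1)
    return scale[index]

def gravity_to_bpm(gy, lo=60.0, hi=140.0):
    """Map phone height to a tempo between lo and hi beats per minute."""
    return lo + _normalise(gy) * (hi - lo)
```

In practice the notes and tempo would be sent on to Ableton as MIDI; the mapping itself is the only part sketched here.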

The light particles will move according to the gravity, acceleration, and gyro values from the last phone. 
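In our piece the particle motion lives inside TouchDesigner; purely for clarity, the idea of particles drifting with the phone's gravity vector can be sketched as a per-frame update (the 2D Euler step, frame rate, and speed factor below are all assumptions, not the real patch):

```python
# Sketch: drift 2D particles along the phone's gravity vector each frame.
# Assumption: 60 fps and an arbitrary speed factor; in the real piece
# TouchDesigner drives the particles from gravity/acceleration/gyro.

def step_particles(positions, gravity, dt=1.0 / 60.0, speed=0.5):
    """Move each (x, y) particle along the gravity vector for one frame."""
    gx, gy = gravity
    return [(x + gx * speed * dt, y + gy * speed * dt)
            for x, y in positions]

particles = [(0.0, 0.0), (0.5, 0.5)]
# One frame with the phone held flat (gravity pointing straight down):
particles = step_particles(particles, gravity=(0.0, -1.0))
```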


Ableton: to play the backing track and instruments


Flow of performance

To ease our performers into performing, one dancer (Daryl) moves together with them and controls the rhythm of the performance. Instructions are also given throughout the performance by a speaker (Yenee). The conductor (Ashley) plays certain tracks (intro sound, breathing sound, solo tracks for each instrument) based on cues from the dancer and speaker.

The instructions can be found here: Flow Motion Script


Video documentation

Full performance:


References Update

Tech updates:

We managed to take values from ZigSim, process them in TouchDesigner, and translate them into Ableton to play the notes of different instruments.

Venue: Truss Room 

Participants: 5 participants (1 controls the light, 1 controls the beat/drum, and the other 3 control 3 different instruments).

Technology needed: 5 phones, 1 projector, speakers, possibly 2–3 computers

Moodboard for music: 

The expected flow of performance: 

  • A device (phone) will be attached to each participant depending on the instrument they are controlling. (E.g. the person controlling tempo needs less mobility, so the device will be attached to their forearm/head; a person controlling an instrument needs more mobility, so the device can be attached to their palm.)
  • The participant will enter the room and be asked to lie on the floor.
  • Instructions will be given through a voice track initiating the performance (to mentally prepare participants to enjoy the performance and move with their feelings).
  • A backing track will play throughout the whole performance.
  • Participants will move their bodies; different movements and positions will trigger their instrument to change note or the light projection to change colour.
  • The end of the performance will be indicated by the light projection turning black and the volume of the instruments, the beat, and the backing track gradually fading to silence.

Weekly Plan:

Week 11: Refinement (Values from Zigsim + Music Refinement)

Week 12: Testing out all components (not at actual location)

Week 13: Performance at the actual location