Semester Project Outline

Group: Clara, Sailin, Sin Yee, Yi Xian

Concept/Inspiration
Our project is inspired by the question “If a tree falls in a forest and no one is around to hear it, does it make a sound?” What makes a sound a sound, an image an image, or by extension, a presence a presence? Is a presence only a presence when it can be felt? If you don’t have a presence in the space, are you still present?

Project Description
Our interactive installation will take place in a dark space (either an enclosed room or a public area with heavier foot traffic). Each body in the space will be assigned a sound (e.g. footsteps, a heartbeat, breathing). When two bodies pass or approach each other, the pitch of their sounds will shift, similar to the Doppler effect.
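As a rough sketch of that pitch logic (the actual patch will likely be built in Max/Pd; the tracked positions, velocities, and speed of sound here are assumptions), the shift factor for one source/listener pair could be computed like this:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at room temperature

def doppler_factor(pos_src, pos_lis, vel_src, vel_lis, c=SPEED_OF_SOUND):
    """Pitch-shift factor for the sound of one body (source) as heard
    at another (listener). Positions and velocities are 2D arrays
    estimated from consecutive tracking frames. Returns >1 while the
    bodies approach each other and <1 while they separate."""
    offset = pos_lis - pos_src
    dist = np.linalg.norm(offset)
    if dist < 1e-6:
        return 1.0  # bodies coincide; no meaningful shift
    direction = offset / dist  # unit vector from source toward listener

    v_source = np.dot(vel_src, direction)     # source closing in: positive
    v_listener = -np.dot(vel_lis, direction)  # listener closing in: positive

    # Standard Doppler relation for a moving source and moving listener.
    return (c + v_listener) / (c - v_source)
```

At walking speed the physical shift is tiny (1.5 m/s gives a factor of about 343/341.5 ≈ 1.004), so we will probably exaggerate the effect, e.g. by substituting a much smaller virtual speed of sound for c.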

Users will be in a dark room and their movements will be displayed in a wall projection, with their positions on the floor corresponding to their positions in the projection. Because the dark space creates low visibility, users will detect each other's presence and make sense of the surrounding environment through the projection.
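One way to keep floor and projection positions aligned is a one-time homography calibration. This is only an illustrative sketch assuming an overhead camera and OpenCV; the four reference points are placeholders to be measured on site:

```python
import numpy as np
import cv2

# Where the corners of the walkable floor area appear in the camera
# image, and where they should land in the wall projection (pixels).
floor_in_camera = np.float32([[80, 60], [560, 55], [600, 420], [40, 430]])
floor_in_projection = np.float32([[0, 0], [1920, 0], [1920, 1080], [0, 1080]])

H = cv2.getPerspectiveTransform(floor_in_camera, floor_in_projection)

def to_projection(point):
    """Map one tracked centroid from camera pixels to projection pixels."""
    src = np.float32([[point]])  # shape (1, 1, 2)
    return tuple(cv2.perspectiveTransform(src, H)[0, 0])
```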

We will represent the visual aspect of presence in two ways:

  1. The background will be filled with particles. When a user is stationary, the particles will gather at their position; when they move, the particles will disperse, and their movement will not be recorded until they are stationary again in a new position.
  2. The background will be filled with noise/static. Standing still will produce a void in the noise, indicating the user's position; when the user moves, the background will revert to its usual state.

Instead of representing presence by tracing a person's motion around the space, we decided to visually represent only stationary positions.
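Both modes therefore hinge on the same test: has a body stopped moving? A minimal sketch of that test, assuming one blob centroid per body per frame (the radius and hold-time thresholds are guesses to be tuned against the real tracker):

```python
import numpy as np

class StationaryDetector:
    """Flags a tracked blob as stationary once its centroid stays
    within a small radius for a minimum number of frames."""

    def __init__(self, radius_px=15.0, hold_frames=30):
        self.radius_px = radius_px
        self.hold_frames = hold_frames
        self.anchor = None  # position the blob is settling around
        self.count = 0      # consecutive frames spent near the anchor

    def update(self, centroid):
        centroid = np.asarray(centroid, dtype=float)
        if self.anchor is None or np.linalg.norm(centroid - self.anchor) > self.radius_px:
            # Blob moved outside the radius: restart the hold timer here.
            self.anchor = centroid
            self.count = 0
        else:
            self.count += 1
        return self.count >= self.hold_frames
```

While update() returns True, mode 1 would pull the particles toward the anchor position and mode 2 would open the void around it; as soon as it returns False, both backgrounds revert.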

System

For the Doppler effect, we plan to adjust the pitch of each sound based on the relative movement of bodies. Depending on the size and scope of the location, we may use multiple speakers. Visually, we plan to use motion detection and blob tracking.
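An illustrative sketch of that tracking pipeline, assuming an overhead camera and OpenCV's background subtractor (the final patches may instead use a Jitter-based tracker; the camera index and thresholds are placeholders to tune on site):

```python
import cv2

cap = cv2.VideoCapture(0)
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Foreground mask: bodies moving against the mostly static dark room.
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                            cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5)))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    centroids = []
    for contour in contours:
        if cv2.contourArea(contour) < 1500:  # ignore small noise blobs
            continue
        m = cv2.moments(contour)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    # `centroids` would be forwarded (e.g. via OSC) to the sound and
    # visual patches each frame.
```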

Timeline

20 Oct:
  • Finalize project proposal
  • Set up basic patches for blob tracking and sound
27 Oct:
  • [Visuals] Display the static visuals based on the blob tracking output
  • [Sound] Assign the same sound to every participant and change its frequency when participants pass by each other
3 Nov:
  • [Visuals] Replace the static visuals with particles
  • [Sound] Assign an individual sound to each participant and experiment with volume and frequency
10 Nov:
  • Integrate the patches and test them in real time
  • [Visuals] Adjust the position of the visuals and explore 3D visuals if possible
  • [Sound] Try stereo effect and adjust the position of speakers
17 Nov:
  • Finish up the patches and documentation
