All posts by Yi Xian

Aspiring artist and researcher who seeks to create playful interactive experiences by altering everyday objects and appropriating collective experiences.

Stationery Radio

Documentation for Stationery Radio, built with Max 7, a Teabox interface and assorted sensors. An interactive stationery holder that allows easy control over your sound system while keeping desk supplies organised.

This player incorporates basic desk stationery and four different sensors into the interface. The eraser acts as a switch, starting and stopping the radio by covering and uncovering the light sensor. A ruler is used to change between tracks: pushing it into the ruler slot triggers the bend sensor. The slider determines the duration and end point of each track. Volume is controlled using the infrared distance sensor and the pencil holder slots; the four slots correspond to four increasing volume levels. The patch mainly uses buffer~ to control the audio. A very engaging first project.
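For reference, here is a rough sketch of the sensor-to-control mapping written in Python for readability. The actual project implements this logic as a Max 7 patch reading the Teabox, so the function, threshold values and state names below are only illustrative assumptions.

```python
# Illustrative sketch only: the real mapping lives in the Max 7 patch.
# Thresholds and sensor ranges are assumptions, not measured values.

LIGHT_THRESHOLD = 0.5   # eraser covering the light sensor drops the reading
BEND_THRESHOLD = 0.7    # ruler pushed into the slot bends the flex sensor

def update(state, light, bend, slider, ir_distance):
    """Map raw 0-1 sensor readings onto player actions."""
    # Eraser + light sensor: toggle playback when the sensor goes dark.
    if light < LIGHT_THRESHOLD and not state["was_dark"]:
        state["playing"] = not state["playing"]
    state["was_dark"] = light < LIGHT_THRESHOLD

    # Ruler + bend sensor: skip to the next track on each push.
    if bend > BEND_THRESHOLD and not state["was_bent"]:
        state["track"] = (state["track"] + 1) % state["num_tracks"]
    state["was_bent"] = bend > BEND_THRESHOLD

    # Slider: set the end point as a fraction of the track length.
    state["end_point"] = slider

    # IR distance + pencil slots: quantise the distance into 4 volume steps.
    state["volume"] = (min(int(ir_distance * 4), 3) + 1) / 4
    return state

# Example frame of sensor readings.
state = {"playing": False, "was_dark": False, "was_bent": False,
         "track": 0, "num_tracks": 4, "end_point": 1.0, "volume": 0.25}
state = update(state, light=0.2, bend=0.1, slider=0.8, ir_distance=0.6)
```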


Thoughts on interdisciplinarity

I typically like using traditional media to create art, but being in IM has allowed me to stretch my skills and my control of different media. Learning programming, Max and all these other new and fresh fields is no different from drawing or painting: they are all mediums for making art, and each one adds so much to my appreciation and application of the others. Applying logic to painting, and vice versa, makes art far more flexible; it feels like I get to dip my fingers into many different pots of honey.

Semester Project Outline

Group: Clara, Sailin, Sin Yee, Yi Xian

Concept/Inspiration
Our project is inspired by the question “If a tree falls in a forest and no one is around to hear it, does it make a sound?” What makes a sound a sound, an image an image, or by extension, a presence a presence? Is a presence only a presence when it can be felt? If you don’t have a presence in the space, are you still present?

Project Description
Our interactive installation will take place in a dark space (either an enclosed room or a public area with more foot traffic). Each user's body will be assigned a sound (e.g. footsteps, heartbeat, breathing). Two bodies passing or approaching each other will affect the frequency of their sounds, similar to the Doppler effect.
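As a rough illustration of the idea (not the final patch), the pitch of each sound could be scaled by how quickly two tracked bodies approach or separate, roughly following the classic Doppler formula. The function and numbers below are assumptions made purely for this sketch.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s

def doppler_ratio(pos_a, pos_b, prev_dist, dt):
    """Approximate a Doppler-style pitch ratio from how fast two bodies
    are approaching or separating (illustrative only)."""
    dist = math.dist(pos_a, pos_b)
    closing_speed = (prev_dist - dist) / dt  # positive when approaching
    # Classic Doppler formula for a moving source, clamped to stay stable.
    ratio = SPEED_OF_SOUND / max(SPEED_OF_SOUND - closing_speed, 1.0)
    return dist, ratio

# Example: two users 3 m apart a frame ago, 2.8 m apart 0.1 s later.
dist, ratio = doppler_ratio((0, 0), (2.8, 0), prev_dist=3.0, dt=0.1)
print(ratio)  # > 1.0, so the assigned sound is pitched slightly up
```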

Users will be in a dark room and their movements will be displayed on a wall projection. Their positions on the floor will correspond to their positions in the projection. Since the dark space creates low visibility, users will detect each other's presence and make sense of the surrounding environment from the visual projection.

We will represent the visual aspect of presence in 2 ways:

  1. The background will be filled with particles; when a user is stationary, the particles will gather at their position. When they move, the particles will disperse and the user's movement will not be recorded until they are once again stationary in a new position.
  2. The background will be filled with noise/static. Being stationary will produce a void in the background, indicating the user’s position. When the user moves, the background will revert to its usual state.

Instead of representing one's presence by tracking a person's motion around the space, we decided to visually represent only stationary positions.
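A quick sketch of the second idea, assuming the tracking patch gives us (x, y) pixel positions for each stationary user. The frame size and void radius are placeholder values, and the final visuals will most likely be generated in Max/Jitter rather than Python.

```python
import numpy as np

WIDTH, HEIGHT = 640, 480
VOID_RADIUS = 60  # pixels; placeholder value for this sketch

def render_frame(stationary_positions):
    """Fill the frame with static and cut a void around every stationary user."""
    frame = np.random.randint(0, 256, (HEIGHT, WIDTH), dtype=np.uint8)
    ys, xs = np.ogrid[:HEIGHT, :WIDTH]
    for x, y in stationary_positions:
        mask = (xs - x) ** 2 + (ys - y) ** 2 <= VOID_RADIUS ** 2
        frame[mask] = 0  # black void marks the user's presence
    return frame

# One user standing still near the centre of the projection.
frame = render_frame([(320, 240)])
```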

System

We are planning to adjust the pitch of each sound to create the Doppler effect. We are considering using multiple speakers, depending on the size and scope of the location. Visually, we are planning to make use of motion detection and blob tracking.
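Since the final patch will most likely be built in Max/Jitter, the sketch below only illustrates one possible blob tracking pipeline, using OpenCV's background subtraction in Python. The camera index and area threshold are guesses for the sketch.

```python
import cv2

# Background subtraction + contour detection as a stand-in for blob tracking.
subtractor = cv2.createBackgroundSubtractorMOG2()
capture = cv2.VideoCapture(0)   # assumed camera index
MIN_AREA = 500                  # ignore small flickers of noise

while True:
    ok, frame = capture.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blobs = []
    for contour in contours:
        if cv2.contourArea(contour) >= MIN_AREA:
            x, y, w, h = cv2.boundingRect(contour)
            blobs.append((x + w // 2, y + h // 2))  # blob centre for visuals/sound
    cv2.imshow("blobs", mask)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
capture.release()
cv2.destroyAllWindows()
```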

Timeline

20 Oct:
  • Finalize the project proposal
  • Set up basic patches for blob tracking and sound
27 Oct:
  • [Visuals] Display the static visuals based on the blob tracking output
  • [Sound] Assign the same sound to every participant and change its frequency when participants pass each other
3 Nov:
  • [Visuals] Replace the static visuals with particles
  • [Sound] Assign an individual sound to each participant and play with volume and frequency
10 Nov:
  • Integrate the patches and test them in real time
  • [Visuals] Adjust the position of the visuals and explore 3D visuals if possible
  • [Sound] Try a stereo effect and adjust the position of the speakers
17 Nov:
  • Finish up the patches and documentation