ume | documentation

Prototype documentation for our semester project, ume.

ume are paired interactive devices meant to subtly connect friends, lovers and family in different locations. They were built using Arduinos, LED bulbs, PIR sensors and ultrasonic distance sensors.

Each ume serves as an avatar of the other user: the ume in location A corresponds to the user in location B, and vice versa. It aims to subtly capture each user's presence and provide company and comfort without bombarding the user with too much information. In our era of social media and instant communication, ume aims to take a step back in how we experience one another's presence and filter out the buzz.

The ume emulates a flame. When the user is closer to the sensor, the flame burns stronger; when further away, the flame flickers more and becomes dimmer. The ume will turn into a warm flame and start rocking gently when it detects motion in the room (e.g. if the user paces around or types on their keyboard). Conversely, when no motion is detected (e.g. the user falls asleep or has left), the ume will turn into a cool blue flame and stay still. This hints to the partner what the user is doing at the moment.

Changes since the last update

  • We attempted to use multilooping and protothreads to execute the distance sensing and the rocking motion at the same time, as the latter was slowing down each loop. However, each thread still ended up waiting for the other, and the time saved was marginal. So instead we programmed the distance sensor to take a reading between smaller increments of the motor motion (see the first sketch after this list).
  • We placed the breadboard, motor and Arduino into each hamster ball and used long wires to simulate 2 different locations. We added 2 small plates at the sides of the PIR sensor to limit its range when detecting motion. In terms of aesthetics, we had to cover up the Arduinos, wires, etc. in the hamster balls as they were rather distracting, so we decided to wrap each ume in lace to make it more intimate and homely, since the ume is meant to be used in a private or home setting. The lace also helped to disperse the light and create a softer effect. However, it added friction at the base and reduced the rocking motion.
  • We also added a fade-in and fade-out effect between the 2 states (see the second sketch after this list). This allowed the lights to transition more naturally, and we think it helped describe a person's presence more organically.
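Here is a minimal sketch of the interleaving approach, assuming a servo for the rocking motion and an HC-SR04-style ultrasonic sensor; the pins, angles and timings are illustrative rather than our exact values:

```cpp
#include <Servo.h>

Servo rocker;              // motor driving the rocking motion (assumed servo)
const int trigPin = 9;     // ultrasonic trigger pin (illustrative wiring)
const int echoPin = 10;    // ultrasonic echo pin
long distanceCm = 0;

// One short ping; cheap enough to run between motor increments.
long readDistanceCm() {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH, 30000); // timeout so we never block for long
  return duration / 58;    // microseconds to centimetres for an HC-SR04
}

void setup() {
  rocker.attach(6);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // Rock in small increments and take a distance reading between steps,
  // instead of finishing the whole motion before sensing again.
  for (int angle = 60; angle <= 120; angle += 5) {
    rocker.write(angle);
    delay(15);             // let the servo move a little
    distanceCm = readDistanceCm();
  }
  for (int angle = 120; angle >= 60; angle -= 5) {
    rocker.write(angle);
    delay(15);
    distanceCm = readDistanceCm();
  }
}
```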
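And a rough sketch of the fade between the two states, assuming the warm and cool lights sit on separate PWM channels (pins and timing are placeholders):

```cpp
const int warmPin = 5;   // PWM channel for the warm light (placeholder pin)
const int coolPin = 3;   // PWM channel for the cool light

void setup() {
  pinMode(warmPin, OUTPUT);
  pinMode(coolPin, OUTPUT);
}

void loop() {
  // Crossfade cool -> warm over roughly a second, then back again,
  // instead of switching states abruptly.
  for (int i = 0; i <= 255; i++) {
    analogWrite(warmPin, i);
    analogWrite(coolPin, 255 - i);
    delay(4);
  }
  delay(1000);
  for (int i = 255; i >= 0; i--) {
    analogWrite(warmPin, i);
    analogWrite(coolPin, 255 - i);
    delay(4);
  }
  delay(1000);
}
```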

General feedback from users

  • Users often expected the sensors to affect the ume on their own side rather than their partner's. They were also unsure whether they could hold the umes or move them around.
  • Users also remarked that the umes were cute and liked the spherical lace body.

Possible improvements

  • The ume could reflect more gestures and states of the users beyond motion and proximity to enrich the telepresence experience. The challenge is striking a balance: keeping the communication organic without sending over too much explicit information and complicating the interaction.
  • Instead of wrapping the hamster balls in lace, for a 2.0 version we would want to either apply a decal or spray on a porous lace pattern, which would let light pass through while covering up the technical bits.
  • Another improvement would be to make the umes wireless so the interaction would be more tactile and physical. Users could hold onto an ume should they wish to, or leave it aside to rock gently.

Part 1 of documentation (interaction / concept):

Part 2 of documentation (technical):

A project by Tania and Yi Xian.

Semester Project Outline

Group: Clara, Sailin, Sin Yee, Yi Xian

Concept/Inspiration
Our project is inspired by the question “If a tree falls in a forest and no one is around to hear it, does it make a sound?” What makes a sound a sound, an image an image, or by extension, a presence a presence? Is a presence only a presence when it can be felt? If you don’t have a presence in the space, are you still present?

Project Description
Our interactive installation will take place in a dark space (either an enclosed room or a public area with more traffic). Each user's body will be assigned a sound (e.g. footsteps, heartbeat, breathing). Two bodies passing or approaching each other will shift the frequency of their sounds (similar to the Doppler effect).
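For reference, this is the classical Doppler relation we are borrowing the idea from; we intend only to approximate it by bending pitch, not to simulate it physically:

$$f' = f \cdot \frac{v + v_o}{v - v_s}$$

where $f$ is the emitted frequency, $v$ the speed of sound, $v_o$ the listener's speed toward the source and $v_s$ the source's speed toward the listener. Approaching bodies therefore hear each other's sounds rise in pitch, and receding bodies hear them fall.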

Users will be in a dark room and their movements will be displayed on a wall projection; their positions on the floor will correspond to their positions in the projection. As the dark space creates low visibility, users will detect each other's presence and understand the surrounding environment through the visual projection.

We will represent the visual aspect of presence in 2 ways:

  1. The background will be filled with particles, and when a user is stationary, the particles will gather at their position. When they move, the particles will disperse, and the user's movement will not be shown until they are once again stationary in a new position.
  2. The background will be filled with noise/static. Being stationary will produce a void in the background, indicating the user’s position. When the user moves, the background will revert to its usual state.

Instead of representing one's presence by tracking a person's motion around the space, we decided to visually represent only stationary positions.

System

We plan to adjust the pitch of each sound to create the Doppler effect. We are considering using multiple speakers, depending on the size and scope of the location. Visually, we plan to make use of motion detection and blob tracking.
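Since only stationary positions are represented visually, the main logic we need on top of the blob-tracking output is a dwell test. A minimal C++ sketch of the idea, with placeholder thresholds (the real version will live inside our patches):

```cpp
#include <cmath>

struct Blob { float x, y; };   // one tracked position per participant

// A blob counts as "stationary" once it has stayed within `radius`
// pixels of an anchor point for `holdFrames` consecutive frames.
struct StationaryDetector {
  float radius = 5.0f;         // pixel tolerance (placeholder)
  int holdFrames = 30;         // about 1 s at 30 fps (placeholder)
  Blob anchor{0.0f, 0.0f};
  int stillCount = 0;

  bool update(const Blob& b) {
    float dx = b.x - anchor.x;
    float dy = b.y - anchor.y;
    if (std::sqrt(dx * dx + dy * dy) < radius) {
      stillCount++;            // still near the anchor point
    } else {
      anchor = b;              // moved: restart the dwell timer here
      stillCount = 0;
    }
    return stillCount >= holdFrames;  // true -> gather particles / open a void
  }
};
```

Calling `update()` once per frame with each blob's tracked position tells the visuals when to gather the particles or open a void at that spot, and when to let the background revert.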

Timeline

20 Oct:

  • Finalize project proposal
  • Set up basic patches for blob tracking and sound

27 Oct:

  • [Visuals] Display the static visuals based on the blob tracking output
  • [Sound] Assign the same sound to every participant and change its frequency when participants pass by each other

3 Nov:

  • [Visuals] Replace the static visuals with particles
  • [Sound] Assign an individual sound to each participant and play with volume and frequency

10 Nov:

  • Integrate the patches and test them in real time
  • [Visuals] Adjust the position of the visuals and explore 3D visuals if possible
  • [Sound] Try a stereo effect and adjust the position of the speakers

17 Nov:

  • Finish up the patches and documentation