dotw1: Song Wig

Song Wig is an ingenious and playful piece of wearable tech by Japanese creative lab PARTY. This interactive device offers a new way of sharing music, and the user experience has a strong social element: users interact with the device as well as with one another.

One main user wears the device and shares music through physical proximity. It works similarly to wireless headphones, syncing with devices via Bluetooth. In terms of interactivity, it has limited user input and feedback, but the metaphor and the interaction it generates between people make up for this. It is simple, yet its affordances are easily identifiable, and it encourages physical interaction between users.

 

[System diagram]

 

Find out more at http://songwig.com

 

Stretch | semester project

Stretch is an interactive art installation which invites participants to interrupt and manipulate a stretch of time through hand gestures. It distorts sound and movement, mainly through granular synthesis, ultimately creating a frenzied yet stimulating interactive experience.

Participants wear gloves and play with an unfolding cube to control the screen projection and sound in the space. The wooden cube is the main point of interaction and guides the participants’ gestures. The setup includes a base with a pressure sensor to detect if the cube is lifted off the base and in use; this starts up the sound and visuals. Each glove has a gyroscope attached to capture the gestures made. Additional data from the right hand is input into MuBu, a machine learning system, in order to detect when participants shake the cube. The left hand also has a bending sensor to measure grip.
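As a rough illustration of the sensor routing described above, here is a minimal sketch in Python. All names, thresholds and signal shapes are hypothetical; in the actual piece this logic lives in the Max patch, and MuBu handles the shake detection.

```python
# Hypothetical sketch of the Stretch sensor routing (not the actual Max patch).

def route_sensors(base_pressure, right_gyro, left_gyro, left_bend,
                  lift_threshold=0.2):
    """Turn raw sensor readings into control signals for sound and visuals."""
    in_use = base_pressure < lift_threshold  # cube lifted off its base
    return {
        "in_use": in_use,             # gates the sound and screen projection
        "shake_stream": right_gyro,   # right-hand gyro data, fed to the shake classifier
        "gesture_stream": left_gyro,  # left-hand gyro data
        "grip": left_bend,            # bending sensor: how tightly the cube is held
    }
```

The pressure sensor acts as a simple on/off gate, while the continuous gyroscope and bend streams drive the sound and visuals once the cube is in use.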

Part I

The screen projection is composed of clips of several familiar head movements, such as a rotating stretch (typically used in exercise warm-ups) and nodding. The head movements correspond to, and act as feedback for, the gestures and sound. The visuals were inspired by Modell 5 by the art duo Granular Synthesis, as well as the warped portraits of Francis Bacon. I found the face and head to be a suitable subject, as we are immediately drawn to them and they catch our gaze.

In hindsight, however, I would have built a physical setup of objects such as pendulums and weights, and filmed clips of it instead of the face. Although the face has its advantages (recording footage of it is much easier, as it requires no extra build time), it possibly adds another layer of narrative, which may divert participants' attention away from the motions and sounds towards the identity or purpose of the person.

Part II explains the technical elements behind the work:

Screenshots of Max patch:


Tissue please I’m going to sneeze!

Prototype documentation for phidgets feedback exercise. Made using Max 7 and Phidgets.

Inspired by the playful and nonsensical nature of Chindogu (‘un-useless’ inventions), I decided to make my own ‘un-useless’ device with some added interactivity. ‘Tissue please’ is an interactive device and the perfect companion for anyone with the sniffles. It offers a tissue when you sneeze and wishes you good health to keep the germs away.


 

Pan n’ Tilt 2.0

Documentation for the second iteration of pan and tilt, now including lights to enhance the overall experience. Inspired by the movement and sound of water.

The four corners of the gyroscope correspond to the four lights and speakers. Tilting the gyroscope downwards in one corner intensifies the blue and brightness of that corner's light and increases the volume of the corresponding speaker. At the same time, it decreases the volume of the speaker in the opposite corner and drains the blue and brightness from its light, making it appear whiter and dimmer.

I improved the sound increment from the first iteration of pan & tilt by using the table object. The volume transition from a levelled gyroscope to a tilted one is now smoother and more gradual.
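The table-based smoothing can be sketched outside Max as a piecewise-linear breakpoint lookup, which is roughly what the table object provides. The breakpoint values below are illustrative, not the ones in the patch.

```python
# Piecewise-linear lookup, similar in spirit to Max's table object.
import bisect

def table_lookup(points, x):
    """Interpolate y for x over sorted (x, y) breakpoints, clamping at the ends."""
    xs = [p[0] for p in points]
    i = bisect.bisect_right(xs, x)
    if i == 0:
        return points[0][1]
    if i == len(points):
        return points[-1][1]
    (x0, y0), (x1, y1) = points[i - 1], points[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Eased volume curve: gentle near level, steeper once the tilt is large.
volume_curve = [(0.0, 0.0), (0.5, 0.2), (1.0, 1.0)]
```

Feeding the normalised tilt amount through such a curve keeps the volume change subtle around the level position instead of jumping linearly.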

Note: the second half of the video originally had no sound. Edit: sound fixed.

Playin’ around with surround sound

Documentation for the pan n' tilt prototype. This setup allows the participant to control the movement and dispersal of sound in the room using the gyroscope. Each of the four corners of the gyroscope corresponds to one of the four speakers in the room. The effect is immediate; for example, tilting the top-right corner of the gyroscope downwards increases the volume of the 'front-right' speaker (positions given as the participant faces the screen projection).

The accompanying graphics on the screen are a visual representation of the sound dispersal. The free-flowing polygon has four corners which correspond to the four speakers in each corner of the room. Tilting the gyroscope towards one corner alters the shape of the polygon, stretching it furthest in the direction of that speaker. Conversely, keeping the gyroscope levelled produces equal volume in all four speakers, and the polygon becomes a rectangle.
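The corner-to-speaker mapping can be approximated with a bilinear panning law. This is a sketch of the idea under assumed tilt ranges, not the logic of the actual patch.

```python
# Bilinear panning sketch: 2D tilt -> gains for four corner speakers.

def speaker_gains(tilt_x, tilt_y):
    """Map tilt (each axis in [-1, 1], 0 = level) to per-speaker gains.

    Level (0, 0) gives equal gain in all four corners; tilting fully
    towards one corner sends all the energy to that speaker.
    """
    corners = {
        "front_left": (-1, 1), "front_right": (1, 1),
        "back_left": (-1, -1), "back_right": (1, -1),
    }
    return {
        name: 0.25 * (1 + sx * tilt_x) * (1 + sy * tilt_y)
        for name, (sx, sy) in corners.items()
    }
```

A convenient property of this law is that the four gains always sum to 1, so the total level in the room stays constant as the gyroscope tilts.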

3D graphics and a more balanced sound increment (using the table object) could be applied for the next iteration of this prototype to enhance the experience.

Alternative Rainstick

Documentation for an electronic musical instrument prototype, created using Max 7 and Teabox sensors. The Alternative Rainstick is a dual-sound musical instrument which marries elements of a traditional rainstick and a monophonic keyboard. It produces two different types of sound, a base rain noise and an octave of notes; the pitch and character of each can be easily controlled.

This rainstick uses the umbrella’s affordances in combination with 4 different sensors, namely the gyroscope, bending sensor, pressure sensor and slider.

The gyroscope controls the pitch of the rain sounds and is paired together with a swinging motion. The degree the umbrella is tilted corresponds to pitch i.e. tilting the umbrella downwards will produce a low pitch. The volume of the rain is controlled by the bending sensor and how open the umbrella is. A fully opened umbrella would produce a dense rain sound at full volume. Similarly, closing the umbrella will create softer rain sounds.

The umbrella’s ‘open’ button, fitted with a pressure sensor, acts as the universal ‘key’. Note sounds are only produced while the button is pressed. It functions like a piano key: the note is sustained while the key is held and stops when it is released. Variation between forte and piano can be achieved simply by pressing the button harder or more lightly. The slider controls the pitch of the notes and has a one-octave range.
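The four mappings above can be summarised in a small sketch. All ranges and constants (frequency span, MIDI note numbers, press threshold) are assumptions for illustration, not values measured from the instrument.

```python
# Illustrative sketch of the Alternative Rainstick's four sensor mappings.

def rainstick_controls(tilt_deg, bend, pressure, slider):
    """tilt_deg: -90 (pointing down) .. 90 (up); bend, pressure, slider in [0, 1]."""
    # Gyroscope: tilting downwards lowers the rain pitch (assumed 200-800 Hz span).
    rain_pitch = 200 + (tilt_deg + 90) / 180 * 600
    # Bending sensor: a fully opened umbrella gives dense rain at full volume.
    rain_volume = bend
    # Pressure sensor on the 'open' button: the universal key, velocity-sensitive.
    note_on = pressure > 0.05
    velocity = pressure
    # Slider: one octave of notes (assumed to start at middle C, MIDI 60).
    midi_note = 60 + round(slider * 12)
    return {"rain_pitch": rain_pitch, "rain_volume": rain_volume,
            "note_on": note_on, "velocity": velocity, "midi_note": midi_note}
```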

Max patch:


 

Thoughts on Zimmerman’s four concepts

Narrative, Interactivity, Play, and Games: Four naughty concepts in need of discipline  by Eric Zimmerman

It was interesting to see someone lay out varying definitions of these four terms (some more familiar than others), discuss them in depth and highlight their overlaps and limitations, before eventually tearing apart the very definitions and explanations he had built. It felt like we know these words, yet not really.

What struck me most was the point regarding play. Zimmerman urges creators not to forcefully direct a play experience but rather to design a system and structure with the potential for play. We as creators cannot incite play but can instead create an environment which encourages and supports it.

Zimmerman also highlights how there is a narrative to be found in all media, games and creations. It teaches us a new way of viewing creative products, searching for the overall stories and ‘micro-narratives’ in each move, sequence and activity.

Project idea: Month

A month is a period of time that we share universally. What happens in a month? How much of the past months can we recall? When we recall a month that has passed, what comes to mind?

Perhaps it’s characterised by events: public holidays, birthdays, one-time occasions (marriages, celebrations, funerals, world events, injuries, natural disasters). I feel that a month really flies by; the days blend together and become indistinguishable.

This project will invite users to document the most significant thing that happens each day. Users would record a single piece of media in any form (images, music/sounds, words, article/headline, website link). The system will record the submission each day but will not show posts from other days of the month. Users will only be able to view these daily submissions at the end of the month, as an entire month and collated experience. The arrangement of the daily posts will also correspond to the layout of a calendar.
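The posting and reveal rules described above could be modelled like this. The class and method names are purely illustrative; no part of this is an existing implementation.

```python
# Minimal sketch of the Month posting rules: one entry per day,
# hidden from everyone until the month is over.
from datetime import date

class MonthJournal:
    def __init__(self):
        self.posts = {}  # date -> media reference (image, sound, link, ...)

    def submit(self, day, media, today):
        """Record today's significant thing; past or future days are rejected."""
        if day != today:
            raise ValueError("only today's entry can be recorded")
        self.posts.setdefault(day, media)  # first submission of the day stands

    def view_month(self, year, month, today):
        """Return the month's entries, or None while the month is still running."""
        if (today.year, today.month) <= (year, month):
            return None  # still hidden: the month is not over yet
        return {d: m for d, m in self.posts.items()
                if d.year == year and d.month == month}
```

The collated result maps naturally onto a calendar layout, since each entry is keyed by its date.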

Anonymity
The posts would be anonymous, with no identity, descriptions or captions. Unlike social media, perhaps not having an audience that knows us personally, and only viewing the happenings of a single day, would reduce the level of curation: it would reduce our urge to document our month according to cohesive themes, colours or forms, or to document only the ‘nice sexy moments’.

Scale
Ideally, the project will have users from all over the world. The database would store the ‘months’ of users around the world. People can access these ‘months’, and have a glimpse into the lives of others. These experiences may vary from the intimate and personal to a collective experience. For example, on 23 Jan 2016, posts from Singapore may include a sound clip of a birthday celebration, failing a school assignment or a picture of a large tree. On the other hand, many people in the US may post pictures of the blizzard which hit the east coast, showing snow covered cars and streets. The significant-happening-of-the-day can be something that shook you personally, or an event that shook the larger global community.

On a side note, experiences and happenings seem to dictate the tone of the month. Hmm… I wonder if other more subtle aspects can be recorded, like feelings and moods.
