Recess Week Updates

Progress Thus Far

  • Done research & reading on Cyberpunk genre
  • Moodboard not done (rescheduling it for later; after reviewing, it’s not important right now)
  • Sorted out and filtered journal papers that are relevant
  • Made eye-to-eye periscope, working on vibrawatch prototype
  • Read Sensory Pathways for the Plastic Mind by Aisen Chacin
    • Realised I’m doing something very similar to what she is doing, but with different intentions. She wanted to critique the idea of sensory substitution devices as assistive technology and to normalise the use of these devices. I want to go beyond normalising: I want to pretend that it is already the norm, and imagine what people would do with it.

Next focus

  • Finish vibrawatch prototype
  • Sketch ideas so I can present and visualise them
  • Play with creating said ideas and test them
  • Read when I have time
  • Focus on sensory experiments and exploration with current few ideas

I have changed my focus from research to hands-on work after talking to Bao and Zi Feng, who gave me a lot of good advice!

Cyberpunk

I went to research the cyberpunk genre a bit more deeply, reading Tokyo Cyberpunk: Posthumanism in Japanese Visual Culture, which, to be honest, was not very helpful: it was mostly dense textual analysis of different films and animes that tackle different cyberpunk themes. What I did find useful were the references to certain films and animes, which I can search online for video clips or photos to refer to. This is in terms of the aesthetics.

After reading the book, I realised there is more to the genre than just its aesthetics (I mean, of course, I just didn’t really think about it). I went to watch some videos on the topic and came up with some insights:

  • The cyberpunk genre revolves around themes such as mega-corporations, high tech / low life, surveillance, the Internet of Things, identity in a posthuman world (man / cyborg / machine: where is the line drawn?), and humanity and the human condition in a tech-driven world.
  • The aesthetics are not the main point of the genre, as I previously thought. It is all about the themes, which is why many films that are “cyberpunk” may not have the iconic aesthetics of neon holographic signs, mega-cities, flying cars, etc. Cyberpunk can exist in the era we are living in now and does not necessarily have to be set in the future.
  • It might be more realistic to look at the current world and set the “look” accordingly, rather than referring to the ’80s vision of the future, aka what most cyberpunk depictions look like.

So why is this information useful to me?

I think it’s good that I actually tried to discover what cyberpunk really is and at least got a brief idea of it. I can try referencing certain themes that apply to Singapore and see what works best. What will Singapore be like if it is set in a cyberpunk world? This does not only affect the aesthetics of my installation; it also affects the theme I’ll be covering, which can narrow my scope a bit more. In the end, I hope the installation will be realistic and aesthetically convincing as well.

Also, I got my hands sort of dirty

I had the idea of a periscope that lets you see your own eye. This is more of a perception experiment than anything really important.

As a thought experiment, I was wondering what we would see if one eye sees the other, and vice versa. Turns out, nothing special. You see your eyes as though in a mirror, except the image is not flipped. I can’t think of an application, but it is interesting to play with. If you close one eye, you can see your closed eye with the other eye.

I showed this to Zi Feng and Bao, and Zi Feng mentioned that if the mirrors were the scale of the face, you could see your true reflection with this device, with only a line separating the middle. I think that is an interesting application: a mirror that shows your actual appearance (how other people see you).

But yeah, it’s not anything super interesting. I just wanted to start working on something haha.

Anyway, I’m also working on the vibrawatch, but it’s still WIP.

What’s next?

I’m gonna start experimenting with making sensory prototypes. I’m also starting on a device that responds to weather immediately after I’m done with vibrawatch.

Metropollutan

Metropollutan is a wordplay on “metropolitan” and “pollutant”, signifying the idea of a citizen in a city surrounded by pollution.

In the distant future, global warming has desertified the Earth. Temperatures are always high and dust storms frequently hit urban cities, forcing their populations to adapt. The garment is an imaginative reflection of the clothing that people in this dystopian future wear, referencing elements of desert wear and of the clothing in the cyberpunk genre of science fiction.

The jumpsuit is the innermost layer of the garment and is made of a comfortable, breathable fabric that helps the wearer feel cool. The sleeves are made broader for extra cooling, and can also be buttoned off when a dust storm hits.

The poncho is the next layer, covering half the wearer’s body. It is inspired by the garments worn by desert dwellers. The poncho can be further extended to cover more of the wearer if needed.

The outermost layer is the armour, which protects the wearer and performs functions such as detecting environmental changes and responding with lights and motion. The armour pieces are inspired by beetles in the way they are designed and segmented: for example, the arm pieces look like beetle legs, while the shoulder pieces look like antennae. The armour also has an iridescent shine like a beetle’s, which coincides with cyberpunk aesthetics.

The garment thus allows its wearers, the Metropollutans, to adapt to this dusty yet urban environment.

The different parts and their functions:

The hood and the mask help the wearer protect their face against dust. When high pollution is detected, a motor moves and the mask comes up to cover the face.

The shoulder pieces are called the “Illuminators”; they light up in bright green when there is high pollution, so as to increase visibility.

The chest and neck piece houses the electronics and holds every other armour piece together.

The arm pieces help the wearer visualise environmental information in the form of light. The left arm piece responds to pollution changes: in low pollution it glows blue and green, while in high pollution it glows pink and purple. The right arm piece responds to temperature changes: at average temperatures it glows blue and green, while at higher temperatures it glows orange and red. These changes notify the wearer of changes in the environment, while also preparing them for changes in the garment pieces’ movements.
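As a side note, this colour logic is simple enough to sketch in code. Here is a minimal, hardware-free sketch in plain C++ (not the actual garment firmware); the thresholds and RGB values are my own placeholder assumptions, not the garment’s real calibration:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical sketch of the arm-piece colour logic.
struct Rgb { uint8_t r, g, b; };

const int POLLUTION_HIGH = 100; // assumed AQI threshold
const int TEMP_HIGH = 32;       // assumed degrees Celsius

// Left arm: blue/green when clean, pink/purple when polluted.
Rgb leftArmColour(int aqi) {
    if (aqi >= POLLUTION_HIGH) return Rgb{255, 0, 180}; // pink/purple
    return Rgb{0, 200, 160};                            // blue/green
}

// Right arm: blue/green at average temperature, orange/red when hot.
Rgb rightArmColour(int celsius) {
    if (celsius >= TEMP_HIGH) return Rgb{255, 80, 0};   // orange/red
    return Rgb{0, 200, 160};                            // blue/green
}
```

On the real garment, these RGB values would presumably be pushed to the LED strips by the NodeMCUs.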

Finally, the belt allows the wearer to hold more things as it acts like a utility belt.

How it works

The garment uses Internet of Things (IoT) services to collect, send, and receive data that alters the state of the garment. Data is collected from Air Visuals, a weather data API, and sent to Adafruit.io, which stores it. The stored data is then sent to microcontrollers (NodeMCU Amica) inside different parts of the garment, controlling the various LED strips and servo motors.

As the IoT data does not reflect the imagined world’s environment very well, the garment is also controlled by a switch on Adafruit.io which changes its state. Two states currently work on my garment: high pollution and low pollution. In high pollution, the mask moves up to cover the wearer’s mouth, the Illuminators switch on, and the arm piece colour turns from blue/green to red/orange; vice versa for low pollution.
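To illustrate, the switch-driven behaviour boils down to a tiny state mapping. This is a simplified sketch in plain C++ rather than the actual NodeMCU firmware; the servo angles and the feed values ("HIGH" / "LOW") are assumptions for illustration:

```cpp
#include <cassert>
#include <string>

// Minimal sketch of the two garment states driven by the Adafruit.io switch.
struct GarmentState {
    int maskServoAngle;     // 0 = mask down, 90 = mask covering the mouth (assumed angles)
    bool illuminatorOn;     // shoulder "Illuminators"
    std::string armColour;  // simplified arm-piece palette
};

// Map the value published on the switch feed to the garment's outputs.
GarmentState applyPollutionState(const std::string& feedValue) {
    if (feedValue == "HIGH") {
        return {90, true, "red/orange"};   // mask up, lights on
    }
    return {0, false, "blue/green"};       // mask down, lights off
}
```

In the real garment each NodeMCU would apply its own slice of this state (one drives the mask servo, others the LED strips).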

Photos of final state:

As I didn’t have the tools (or the drive 🙁 ) to continue working on the garment, this is the final state of my work.

Current state: With a piece of armour

Additionally, I’ve worked on the mask and the electronics at home.

Mockups, Sketches:

To make up for the lack of a finalised project, I used sketches and mockups to imagine how the armour pieces would look.

Video of Electronics / Interaction:

 

Role and Future Aspiration in Fashion Show

I will remain in the role I was assigned, doing the stage. I wish to do projection for the show; I discussed it with Shah before, and we think it’s a good idea. I think we can project the quirks of the different garments: for example, mine could be a desert world, while Fizah’s could be snakes.

The stage design will also include logistics like smoke machine placement, decorations, etc. This should tie in with the theme, but that’s still undecided.

Week 11 Wearable Tech Updates

Last week, I drew this to collect my thoughts, and it helped me figure out the logistics of my tech:

Last week’s electronics setup:

LED + wifi
Servo + wifi

Last week’s progress:

I managed to complete the jumpsuit with help from Galina for the zippers and connecting the 2 pieces together!

Galina also taught me how to sew button holes which was very fun!!

What’s left for the jumpsuit is to hand-sew the buttons, create buttonholes for the sleeves, add front pockets, and add some kind of belt loop. But that’s not my priority now.

This week, I’m working on the cape and armour pieces.

This Week’s Progress: (incomplete, as there are more days this week to work on this!)

I was playing with the materials to create the look of desert beetles, but it failed, so I will just stick to the PVC without material alterations.

putting pins in before heating it to create round bumps, but it ended up folding
only heating the PVC creates this effect

Cutting and putting things together: I was experimenting with the materials, and it turns out the PVC is more flimsy than I previously thought. It’s also much thinner than I thought, so it doesn’t look as good single-layered. Sewing 2 pieces together makes it look better, which means I now need 2x the amount of PVC ($$$$$$$!!!!!!). At least I have 1m left of the pink piece, which I’m not gonna use, as pink contrasts too much with my garment. I’m switching to green instead.

What was previously the neck piece was tested on the shoulder to see how it looks as a shoulder piece.

I was having a hard time figuring out which cloth to use. Using both cloths together with the armour makes the whole look very weird; too much is going on. Using only 1 cloth with the armour works, but I needed to decide which to remove. In the end, I removed the metallic mesh cloth.

The final decision (except the pink parts will now be green)

I also started sewing the cape piece to make it more frilly / give it pleats. So yeah. (no pics tho)

So in conclusion, this week was more of a start to the next phase. The entire design is more or less confirmed, and I’m happy with it. I’m gonna continue the cape till I’m happy with it, then I’ll start on the armour, which is gonna be a pain 🙁 and I also need to go buy the PVC. Hopefully I’ll finish this by the end of week 12, so I can finish the electronics and have time to finish up the small details. And also work with Shah on the fashion runway!!! AHHHH!!!

Oh, and Mayle was working on a PVC vest, and I took a pic of the pattern so I can tweak it (thanks Mayle!!!)

it’s actually just a pants pattern, but if you drape it over the shoulder it becomes some kind of vest that looks like my chest armour!

My sketch of how I think it will look:

The perfect vision of my garment

Ok, off topic, but I think what helped me a lot was to imagine my concept as an Overwatch character. Anyway, here’s an updated concept:

Concept: a nomad from a future dystopian city covered in dust from sandstorms caused by desertification and global warming. Cyberpunk elements (iridescent armour + tech) create a sense of urban-ness (because he’s still from a city), and the sandy garment blends in with the desert city. A bit of inspiration from beetles too (in terms of their colour and appearance).

ALSO… I’m thinking: what if the controller on my belt could select which city to look at? Because different cities have different temperatures and pollution levels, the values would cause the garment to react differently. Hmmm…

Moodboard (Updated!)

Update: After the biomimicry research critique, I kinda wanna change some of the elements in my design to make them more sustainable, rather than just copying a certain retro-futuristic style (even though I love it).

Also: I REALISED I NEVER EDITED THE KEYWORDS PART SO I’M GONNA WORK ON THAT.

Pitch Proposal : THIRD-I

https://oss.adm.ntu.edu.sg/a150133/category/17s1-dm3005-tut-g01/

https://unnecessaryinventions.com

 

Updates:

Design:

Tests:

The Interaction

  • Tracking Mode
    • IR sensors detect differences in distance
    • The motor turns to track the object, prioritising whatever is closest in the field of view
    • The motor angle and IR distance info are sent wirelessly to Adafruit.io
    • This info is received by the cowl, which vibrates the corresponding vibration motor to indicate the sensor’s location
    • * A pitch is heard on either side of the ear to indicate proximity
  • * Sweep Mode
    • IR sensors scan the surroundings like a radar
    • The motor turns 180 degrees to and fro
    • The motor angle and IR distance info are sent wirelessly to Adafruit.io
    • This info is received by the cowl, which vibrates the corresponding vibration motor to indicate the sensor’s location
    • * A pitch is heard on either side of the ear to indicate proximity
  • Eye Movement Mode
    • The motor turns in the direction of the eye, stopping when the eye is looking forward
    • The IR sensor detects proximity
    • The motor angle and IR distance info are sent wirelessly to Adafruit.io
    • This info is received by the cowl, which vibrates the corresponding vibration motor to indicate the sensor’s location
    • * A pitch is heard on either side of the ear to indicate proximity

Communication flow (1 cycle)

  • The cowl’s Arduino sends the mode info to its ESP8266
  • The cowl’s ESP8266 publishes the mode info for the eye’s ESP8266
  • The eye’s ESP8266 subscribes to the mode info
  • The eye’s ESP8266 sends the info to its slave Lilypad, which acts accordingly
  • The Lilypad sends the motor angle and IR distance info back to the eye’s ESP8266
  • The eye’s ESP8266 publishes the motor and IR distance info wirelessly
  • The cowl’s ESP8266 reads the info and sends it to its slave Arduino
  • The Arduino reacts, driving its vibration motor and pitch accordingly
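The cycle above can be mocked end-to-end without any hardware. In this toy sketch (plain C++), the wireless hop through Adafruit.io is replaced by direct function calls, and the four-motor cowl layout is an assumption of mine:

```cpp
#include <cassert>

// One simulated cycle: the eye reports its motor angle and IR distance,
// and the cowl maps the angle onto one of four vibration motors.
struct EyeReading {
    int motorAngle;    // degrees, 0-180
    int irDistanceCm;  // distance to the nearest object
};

// Eye side: in the real garment the Lilypad would drive the motor
// according to `mode`; here we just return a dummy reading.
EyeReading eyeCycle(int /*mode*/) {
    return {135, 40};
}

// Cowl side: split the 180-degree range into four quadrants, one per
// vibration motor (numbered 0-3).
int vibrationMotorFor(int motorAngle) {
    int quadrant = motorAngle / 45;
    return quadrant > 3 ? 3 : quadrant;  // clamp 180 degrees into motor 3
}
```

A real implementation would replace the direct call with MQTT publish/subscribe between the two ESP8266s, but the per-cycle logic stays the same.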

What’s next?

  1. Receive ezbuy stuff so I can continue
  2. Settle the ESP8266 communication
  3. Settle the master-slave communication
  4. Get the necessary materials for a new, bigger eye
  5. BUILD THEM! And simplify the difficult parts