Material Cookbook || Chapter 1: Direct and Indirect Pattern Transfer

Materials Needed:

  • Fabric transfer ink
  • Polyester fabric (the silky kind)
  • Paper
  • Fabric crayons
  • Wax paper
  • Iron or heat press
  • Flat items (feathers, string, lace, etc.)

Instructions: Dry transfer – Direct

 

Step 1: Draw a pattern or image on paper using the fabric crayons.

Step 2: Lay the paper on top of your chosen material, image side down.

Step 3: Place a sheet of wax paper on top to protect your cloth from the iron’s direct heat; this prevents the cloth from burning or warping.

Step 4: Iron the stack. The colour will start to seep into the cloth once it is heated properly. The image also takes a while to transfer, so be patient and check it regularly without shifting the paper.

Reflection:

Heating with an iron is uneven; I removed the heat too early from some parts of the fabric, which resulted in fainter imprints.

Polyester burns and shrinks under high heat. Also, the silkier the fabric, the more vibrant the colours.

Yellow on paper is very bright whilst purple on paper is almost black, yet both colours lighten when transferred to the cloth.

 

Instructions: Wet transfer – Indirect

Step 1: Paint your paper with transfer ink and leave it to dry.

Step 2: Arrange your flat objects on top of your chosen cloth.

Step 3: Lay the dried, ink-coated paper on top of your chosen material, coloured side down.

Step 4: Place a sheet of wax paper on top to protect your cloth from the iron’s direct heat; this prevents the cloth from burning or warping.

Step 5: Iron the stack. The colour will start to seep into the cloth once it is heated properly. The image also takes a while to transfer, so be patient and check it regularly without shifting the paper.

Reflection:

Flat objects like string move around a lot when you check the transfer, so be careful when lifting the sheets.

The transfer ink also takes longer than the crayons to heat-transfer, so be patient.

Instructions: Wet transfer – Direct

Step 1: Paint your flat objects with transfer ink.

Step 2: Arrange your flat objects on top of your chosen cloth, painted side down.

Step 3: Place a sheet of wax paper on top to protect your cloth from the iron’s direct heat; this prevents the cloth from burning or warping.

Step 4: Iron the stack. The colour will start to seep into the cloth once it is heated properly. The image also takes a while to transfer, so be patient and check it regularly without shifting the paper.

Reflection:

In-class example done by Galina. The positive imprint is an example of direct wet printing.

I did not have the opportunity to try this myself, as my string was too flimsy to make a second imprint. As the class example shows, however, the ink an object picks up during the first wet-transfer print is enough to leave a detailed pattern on the cloth.

Wearable Tech Research: Biomimicry

What is Biomimicry:

Biomimicry is the practice of copying the functionality of nature to solve our man-made problems.

Examples:

Stefanie Nieuwenhuys

LAYERING SCRAPS LIKE SCALES: After spying diamond-shaped wood chips on a workshop floor at London’s Kingston University—the leftovers of some architecture student, no doubt—Stefanie Nieuwenhuys was reminded of a secondhand snakeskin bag she once purchased. Scooping them up, the fashion student set to work, layering the wooden scraps onto fabric like reptilian scales. Nieuwenhuys’s “aha” moment resulted in her master’s project: a collection of corsets, floor-length evening dresses, trousers, and neckpieces that marries modern laser-cutting techniques with a couturier’s delicate yet exacting touch. Eschewing virgin resources, Nieuwenhuys worked with bio-waste firm InCrops Enterprise Hub in Norwich to obtain discarded pieces of plywood, which she honed into efficient forms that left behind little waste. Glued onto unbleached organic cotton, the brown-and-ecru “scales” become a “simulacra of nature, without discarding nature’s inherent harmonies”.


(pictures taken from: https://blogs.3ds.com/fashionlab/stefanie-nieuwenhuyse-recycle-le-bois-comme-des-ecailles-serie-biomimetisme/)


The artist makes her outfits out of scrap material, emphasising the idea of reuse: she laser-cuts the pieces to look like scales and layers them to imitate a snake’s skin.

 

Diana Eng

COMPACT STRUCTURES THAT UNFURL LIKE LEAVES: Diana Eng based her “Miura Ori” scarf on an origami “leaf-fold” pattern invented by Koryo Miura, a Japanese space scientist who was in turn inspired by the unfurling mechanism of hornbeam and beech leaves. The scarf folds into a compact package yet “deploys” to create a voluminous wrap for your neck. Hornbeam and beech leaves are distinguished by their corrugated folds, which remain collapsed until they emerge from their buds.


The origami pattern was derived from observing nature, particularly folds that omit right angles, like forehead wrinkles or the veins of a dragonfly’s wing. Because of that, the pattern is collapsible.

Montserrat Ciges

Her project was created to imitate animals that are able to voluntarily self-transform.

 

References:

Simone Leonelli on the Blurred Boundaries Between Art & Fashion

The influence of 3D printing on fashion design

3D Printed Fashion: Novelty or reality?

How to become an Aussie eco-fashion designer

Fashion Biomimicry Through The Lenses Of Biotechnology, Nanotech And Artificial Intelligence

3D Haute Couture by Iris Van Herpen

http://www.osmosis-industries.com/digital/2015/4/21/nature-inspired-fashion-design-through-the-theory-of-biomimicry

Stefanie Nieuwenhuyse Reuses Scrap Wood as Scales – Biomimicry Series

Tessellation and Miura Folds

http://www.fairytalefashion.org/

https://class.textile-academy.org/2019/Montserrat/project.html

 

Interactive Device Project 2: Skitzie the Cat

Skitzie the Cat is just your average black cat that likes to hang out on your shoulder. They are curious and like to people-watch while you do your stuff. But Skitzie is very shy, and hence pretends to be a scarf when anyone comes too close.

(insert hooman wearable sketch)

About Skitzie the Cat

Skitzie is a guardian for those who are not too aware of their surroundings. In a sense, Skitzie’s ‘hasty retreat’ into being a scarf is a warning that there are people approaching.

For this project, I imagined Skitzie being able to move their head and ears to see the world. Skitzie is also envisioned to ‘blink’ through LEDs and hum through a speaker. I wanted there to be sound or light as an indicator, so the person wearing Skitzie knows very clearly when Skitzie is a cat and when they are pretending to be a scarf. The warning has to be distinct enough to catch people’s notice.

Skitzie’s hardware

Skitzie is a combination of servo motors and a Sharp infrared proximity sensor.

  1. Testing the Servo Motor:

Website Reference: https://www.instructables.com/id/Arduino-Servo-Motors/
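
The test code itself isn’t shown here, so below is a minimal sweep test in the spirit of the Instructables reference; the signal pin (9) is an assumption:

    #include <Servo.h>

    Servo headServo;          // servo for Skitzie's head (assumption: signal on pin 9)

    void setup() {
      headServo.attach(9);
    }

    void loop() {
      // Sweep from 0 to 180 degrees and back in 1-degree steps
      for (int pos = 0; pos <= 180; pos++) {
        headServo.write(pos);
        delay(15);            // give the servo time to reach each position
      }
      for (int pos = 180; pos >= 0; pos--) {
        headServo.write(pos);
        delay(15);
      }
    }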

  2. Testing the Proximity Sensor:


Website Reference: https://create.arduino.cc/projecthub/jenniferchen/distance-measuring-sensor-900520
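
The gist of that reference: the Sharp sensor outputs an analogue voltage that falls as distance grows, so you read it with analogRead() and convert. A rough sketch, assuming a GP2Y0A21-type sensor on A0 (the conversion constants vary by model, so calibrate against a ruler):

    const int sensorPin = A0;   // Sharp IR sensor output (assumption: A0)

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      int raw = analogRead(sensorPin);               // 0..1023
      float volts = raw * (5.0 / 1023.0);
      if (volts < 0.1) { delay(100); return; }       // out of range; skip this reading
      float distanceCm = 27.86 * pow(volts, -1.15);  // common GP2Y0A21 power-law fit
      Serial.println(distanceCm);
      delay(100);
    }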

I combined the circuits and then made a head.

Testing out the eyes circuit: it works, turning off and on depending on how close something is.
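
The eye behaviour boils down to a threshold check on the sensor reading. A sketch of the idea, with the LED pin and threshold value as illustrative assumptions:

    const int sensorPin = A0;     // Sharp IR sensor (assumption: A0)
    const int eyesPin   = 13;     // LED "eyes" (assumption: pin 13)
    const int threshold = 400;    // raw reading above this = someone is close (tune it)

    void setup() {
      pinMode(eyesPin, OUTPUT);
    }

    void loop() {
      int raw = analogRead(sensorPin);   // higher reading = closer object
      if (raw > threshold) {
        digitalWrite(eyesPin, LOW);      // someone is near: Skitzie hides, eyes off
      } else {
        digitalWrite(eyesPin, HIGH);     // coast is clear: eyes on
      }
      delay(50);
    }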

For some odd reason, though, when I added the ears, the eyes stopped working.

Then it got fixed (connections are problematic, check everythingggg).

This is the body, that I made around the head servo motor.

All Assembled.

Reflections:

Honestly, the aesthetic of Skitzie didn’t come out right, which I am a little disappointed by. The head also keeps falling off if left for too long, so I need to fix that in future. Hopefully we will see the return of a better Skitzie.

References:

  • https://www.google.com/search?q=code+servo+motor+with+sensor&oq=code+servo+motor+with+sensor&aqs=chrome..69i57j0.11575j1j4&sourceid=chrome&ie=UTF-8#kpvalbx=_VbKEXdWDPNjc9QPCsoCIBA25

Final Project: Trio

One musician makes a Solo.

Two musicians make a Duet.

Three musicians make a Trio.

So let’s make a Trio.

Three is a weird number; we only have two hands, after all. So to occupy three sensors and control the sound they make, you would need at least one other person (of course the preferable number is three, but I do so wish to see two people flailing around).

Why did I create Trio? I guess it was along the lines of finding a project that seems fun to interact with, yet leaves you struggling to make it work for you. At the end of the day, the device is really a commentary on how, sometimes in life, you try your best to make something work and the result does not bear fruit; or, even better, it bears fruit but there is no real takeaway from the situation, now is there?

In summary, let me waste your time.

The Circuit:

Life update:

Initially, in the previous post, I had built my circuit entirely out of piezo buzzers and ultrasonic sensors, made to beep at a set tempo. Which, to be honest, is not really what I wanted; I want the tempo to be set by moving the object in front of the sensor’s face. But after editing the code I realised a major issue: the ultrasonic sensor was being influenced by the buzzer itself.

(insert confused noises) It was steady before? But that was when the ultrasonic sensor only triggered one tone with an interval in between, meaning it was not reading the environment constantly, and so it did not get interfered with before.

So at this point it was time to consider changing my components.


Image taken from: https://create.arduino.cc/projecthub/jenniferchen/distance-measuring-sensor-900520

So I tried out the Sharp infrared/proximity sensor (it goes by ‘distance sensor’ at Continental).

Putting the entire circuit together, a single piezo buzzer, the distance sensor and an Arduino: it works.
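
The sketch isn’t reproduced here, but the core behaviour, a beep tempo set by how close your hand is to the sensor, reduces to mapping the distance reading onto the pause between tones. A hedged sketch, with pins and ranges as assumptions:

    const int sensorPin = A0;   // Sharp distance sensor output (assumption: A0)
    const int buzzerPin = 8;    // piezo buzzer (assumption: pin 8)

    void setup() {
      pinMode(buzzerPin, OUTPUT);
    }

    void loop() {
      int raw = analogRead(sensorPin);            // closer object = higher reading
      // Map the reading onto a beep interval: nearer = faster tempo
      int interval = map(raw, 0, 1023, 600, 60);  // ms between beeps (tune to taste)
      tone(buzzerPin, 440, 40);                   // 40 ms beep at 440 Hz
      delay(interval);
    }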

However, duplicating the circuit becomes a bit more troublesome. As it turns out, you cannot attach three piezo buzzers to a single Arduino and expect them to ‘sound’ at the same time, since the standard tone() function can only generate one tone at a time per board. (Troubleshooting this the night before and realising you have to duplicate the circuit, not the code, is really bad for your health, by the way.)

I did not account for this tripling of the space needed to store my three Arduinos, three distance sensors, three piezo buzzers and three power packs. It is a super tight fit, mind you. By right, I can squeeze everything into the box. But by left, squeezing everything in the night before the showcase and then snapping something is not an ideal situation.

Piecing them together:

In the last post I showed that I laser-cut, filed and spray-painted my pieces, accordingly:

Since then, I had to redo the black pieces to fit the Sharp infrared sensor instead of the ultrasonic sensor, hence I cut a rectangular opening instead.

I have also cut some wedges to support the structure.

And I stacked all my pieces up and glued them together with a glue gun.

Finally I stuck my components inside the surface.

Reflections:

In this case I have achieved what I wanted to in this project: the three sensors play according to the distance between them and the object. However, in future I would like to be a bit more ambitious/annoying and extend the circuit further (time to make a symphony for one, then).

Of course, there are more things I should have accounted for, like the size of the circuit, or the potential for it to grow.

Device of the Week 2: IoT


Kuri The Mobile Home Security Robot by Mayfield Robotics:

Description

Kuri is an adorable home companion that acts like a ‘living’ robot. At first I assumed Kuri was going to function like a Google Home device on wheels, but Kuri is slightly more than that, as they make certain ‘expressions’ that make them feel more alive. Kuri has the ability to smile at you, follow you around and ‘speak’ to you. The adorable robot has an inbuilt function to track your motion and look up at you, and to respond to their name with beeps and chirps. As a home security device, Kuri also has tiny cameras located in their ‘eyes’ to capture clips of whatever goes on at home. And acting like a home device, Kuri has the ability to answer certain questions that you ask, like ‘is it going to rain today?’, shaking their head with an adorable beep.

Kuri is also described as a good nanny that entertains the kids, but so far, other than following them around and animating expressions, I am not too sure how kids will find Kuri entertaining.

Functions

Microphone: Voice Recognition to answer questions or comply with requests

Speakers: To ‘speak’ in chirps and beeps, and to play music and podcasts from the internet.

HD camera: For security footage and live streaming.

Asynchronous motors: To allow Kuri to move around the house. Kuri also has sensors that let them map the house and avoid bumping into objects.

Capacitive touch sensor: For Kuri to recognise and react to human touch.

Pros

  • A mobile security system that patrols your house
  • Companionship
  • Responds to all commands intelligently
  • Adapts to your environment easily, recognises people’s voices, and differentiates people from pets/other Kuri bots
  • When in need of recharging, Kuri automatically returns to their charging station for a power nap

Cons

  • Kuri cannot climb stairs, sorry landed property folks
  • Kuri also has no way of helping in the case of an emergency (except informing you through messaging)

Analysis

Kuri really is just a mobile smart home device, since it already has most of the functions of an ordinary smart home device with the added ability to move about. Looking at its functions, other than surveillance, the ability to move about is a bit redundant. And since Kuri is expensive while offering little beyond the usual Google Home device, which is priced at less than $200, it is really not an attractive product, which is the probable reason for the company’s closure last year.

Considering the company was focusing on the ‘animated’ part of Kuri to make them feel more alive, that is probably where most of their innovation went. (I wonder, if we eliminated the animations, whether it would be cheaper, easier and more viable to add a Google Home device to a Roomba. Obviously not as cute, but functionality-wise, would it sell better?)

References:

  • https://www.jameco.com/Jameco/workshop/ProductNews/life-with-kuri-a-real-live-robot.html

Artwork Review: Inter-Mission

The Lapse project is a collection of five works, all revolving around the theme of lapses in time, in memory and in reality. While I was not able to be physically in the presence of these artworks, I was able to gather an impression of them by reading up on them.

The projects:

  • VR Lapse
  • Particle Lapse
  • 24 hour Lapse
  • Panorama Lapse
  • Journal Lapse

Of the five, the last two, Panorama Lapse and Journal Lapse, are not interactive, hence I will not be discussing them.

VR Lapse is a virtual reality simulation, bringing the audience to Singapore’s oldest colonial building, only to find that it has been digitally erased.

Does Out of Sight, Out of Mind in Singapore lead to Nevermind?

In their interview with popspoken.com, Inter-Mission shared the artists’ concern with how significant art-related artefacts in Singapore are slowly being washed away by the ever-changing landscape.

With that message in mind, I wonder if the project works on someone with no context of the place at all. It is true that these are cultural landmarks; however, I am left drawing a blank when someone tells me ‘Art House’. They were trying to trigger this idea of misplacement, the ‘I am pretty sure there is something missing here’ sort of thought, but if there was no recollection of the place in the first place, can this idea still be drawn out? Does that hinder the experience of the work?

Also, since we are discussing the interactivity of a work, I feel that the interactivity level is quite low. Being placed in an empty, unchanging landscape, with nothing to influence, is passive, like watching a movie or a slide show.

The second project adds to the atmosphere of the first. Particle Lapse is more interactive in the sense that it takes the movement of the viewer traversing the virtual world and creates feedback sound from it, giving the audience an extra dimension of sound that is meant to confuse them. In this case there is a contributive element that the audience plays in the artwork.

Finally there is 24 Hour Lapse, an installation where visitors from the past 24 hours are projected alongside the present visitors on a CRT monitor. It is interesting how they play with the idea of people from two different times sharing the same space, even if it is only a screen. In terms of interactivity, however, it is again quite passive, as the present visitors cannot influence the already pre-recorded video.

Overall, the Lapse project is not a very interactive one. It works more as a stage, which is the artists’ mind, where the audience remains the audience, not participants on the stage. As such, we only view the artists’ feelings about and experience of the idea of a lapse in memory, which is not always universal and hence a bit hard to relate to.

Final Life Update: Hear/Here (Colours of the Wind)

Video:

Short Essay:

This project revolves around the idea of the gaps between noise/sound. We created a portable device that samples the overall surrounding sound and, in response, lights an LED in a corresponding colour. The colour is based on a calculation where ‘red’ is volume, ‘green’ is pitch (regardless of octave) and ‘blue’ is pitch (exact octave). Red and blue were scaled to fit a range of 0 to 255; for green, however, five ranges were created, skewed so that the range covering humanly possible pitches is larger than the range for pitches humans cannot produce. The code uses an array to store the data for each pixel until all nine pixels have been used up, after which the oldest pixel is overwritten. (A simplified sketch of the pipeline follows the code references below.)

References for the code:

  • Origin of basic-ass code (which is no longer here): https://www.teachmemicro.com/arduino-microphone/
  • Origin of getAmplitude code: https://learn.adafruit.com/adafruit-microphone-amplifier-breakout/measuring-sound-levels
  •  Origin of getFrequensea code: https://www.norwegiancreations.com/2017/08/what-is-fft-and-how-can-you-implement-it-on-an-arduino/
  • Origin of NeoPixel code: https://learn.adafruit.com/adafruit-neopixel-uberguide/arduino-library-use
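
For a sense of how the pieces fit together, here is a simplified sketch of the pipeline assembled from the references above: peak-to-peak amplitude for volume (the Adafruit method), arduinoFFT’s MajorPeak() for pitch (the Norwegian Creations method), and a ring of NeoPixels. The pin numbers, sampling rate and the green banding are my assumptions, not the project’s exact code:

    #include <Adafruit_NeoPixel.h>
    #include <arduinoFFT.h>

    #define MIC_PIN    A0          // amplified microphone output (assumption)
    #define LED_PIN    6           // NeoPixel data-in (assumption)
    #define NUM_PIXELS 9           // the nine-pixel history

    const uint16_t SAMPLES = 64;         // FFT size, must be a power of two
    const double SAMPLING_FREQ = 4000;   // Hz; the timing below is approximate

    Adafruit_NeoPixel strip(NUM_PIXELS, LED_PIN, NEO_GRB + NEO_KHZ800);
    arduinoFFT FFT = arduinoFFT();

    double vReal[SAMPLES], vImag[SAMPLES];
    int pixelIndex = 0;

    void setup() {
      strip.begin();
      strip.show();
    }

    void loop() {
      // 1. Sample a window of audio, tracking min/max for peak-to-peak volume
      int signalMax = 0, signalMin = 1023;
      for (uint16_t i = 0; i < SAMPLES; i++) {
        int raw = analogRead(MIC_PIN);
        signalMax = max(signalMax, raw);
        signalMin = min(signalMin, raw);
        vReal[i] = raw;
        vImag[i] = 0;
        delayMicroseconds(250);          // ~4 kHz, ignoring analogRead overhead
      }

      // 2. Red = volume (peak-to-peak), scaled to 0..255
      int red = map(signalMax - signalMin, 0, 1023, 0, 255);

      // 3. Dominant pitch via FFT, as in the Norwegian Creations tutorial
      FFT.Windowing(vReal, SAMPLES, FFT_WIN_TYP_HAMMING, FFT_FORWARD);
      FFT.Compute(vReal, vImag, SAMPLES, FFT_FORWARD);
      double peakHz = FFT.MajorPeak(vReal, SAMPLES, SAMPLING_FREQ);

      // 4. Green = pitch folded into a repeating band (a crude stand-in for the
      //    five skewed ranges); blue = pitch scaled across the whole spectrum
      int green = map((int)peakHz % 500, 0, 499, 0, 255);
      int blue  = map((int)peakHz, 0, (int)(SAMPLING_FREQ / 2), 0, 255);

      // 5. Write into the ring of nine pixels, overwriting the oldest entry
      strip.setPixelColor(pixelIndex, strip.Color(red, green, blue));
      strip.show();
      pixelIndex = (pixelIndex + 1) % NUM_PIXELS;
    }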

 

Our work takes reference from works like ‘Pulse Index’ by Rafael Lozano-Hemmer. It is similar in the sense that it records the viewers’ input, in their case thumbprints, in our case sound, and records it on a visual plane to show the changes over time.

Rafael Lozano-Hemmer, "Pulse Index", 2010. "Time Lapse", Site Santa Fe, New Mexico, 2012. Photo by: Kate Russel.

Characteristics of Interface:

Classification of interface:

Our project falls under ‘User is one of Many’ and ‘User is valued’. It values the unity of the environmental sound: your sound is captured within this collective and you can’t discern what is yours and what is the environment’s, hence the ‘user is one of many’ part. However, ‘user is valued’ is also present, in that the user is the anomaly that creates the most change when they interact with the device directly.

Characteristics of interface:

Our project falls under ‘Monitored and reflected experience’ as well as ‘Intuitive selection/results relationship’. For the former, the device collects the environmental sound and shows a colour representation, so every interaction is copied and shown directly based on the sounds that you make. The latter holds because, on seeing the light change with sound, viewers automatically try to interact with it to see how far it will change, which leads them to seek out the gaps between the sounds they make as they watch the coloured representation of each instance of sound.

Structure of Interface:

Based on the flow chart, our project complies with everything except the last item, ‘Linear Selection’. The first idea, open structure, is seen in the way we made our device portable. The second, ‘Feedback provided’, takes the form of LED lights lit in accordance with the sound of the environment and of the people within it interacting with the device. The third, ‘Constant elements providing continuity’, holds since the setup is designed to reflect the sound every (how many seconds). Finally, selections are recorded in nine LED pixels, showing the last 8 seconds of environmental sound.

(Liz finally answered the question yay)

Who did what:

The coding for this project was done by En Cui, and the physical fabrication of the device was put together by me (Elizabeth) (but you know, in the end Liz kind of screwed up a lot of the soldering and stuff and needed En Cui and Lei’s help to put it together. Thank youuu).

Process:

From the initial stage of manually making LEDs light up by pressing buttons whenever someone made a sound, we progressed to a circuit where the LED lights up in a certain colour according to the environmental sound.

After that we used this circuit as a reference and moved from a single RGB LED to an LED strip. That way we could create a setup where the colour of a certain period of time is recorded and can be compared to the previous period.

yay the LED lights up.

Measuring the length of wire for the glove.

This is where problems started surfacing with the soldering, so there was a redo (soldering-wise and circuit-wise, sob).

Testing out the Circuit.

Yay it’s done.

After Review:

Everyone reacted to the work as we hoped they would, despite there being only two participants. People crowded around and tried to put in their own input by making noises around the two. We did, though, get comments that the feedback is not fast enough to show the exact inflection of one’s voice while speaking, and hence is not very obvious. We forgot to mention this during the review, but the delay is also constrained by technical limitations. If we reduce the delay, we need more LEDs to represent the same amount of time, and the Arduino’s memory overloads at 13 LEDs. Additionally, even at delay(0), the Arduino still cannot function fast enough to get the desired result:

Because of the delay, the theme of our work might not be very obvious for viewers to pick up on. The eventual solution may thus be to use something with more processing power.

There were also comments about how participants were working very hard to satisfy the device. Some said it seemed like a prop for band or choir performances, or a tool for training to hit an exact pitch.

Summary Reflection:

EC needs to actually recognise when something is not possible, rather than maybe possible.

Liz should not be so innovative. Liz is just not good with technology.

We should have thought out the final form better.

Extended Concluding thoughts (if you want to read about our woes):

En Cui’s Reflection:

Concept-wise, the challenge was that the core concept and form were not well-aligned. While we talked through several issues, there’s still the challenge of the interstice being unclear. But I think, in the end, the clarity of the message depends on how you interact with the wearable. For example, the distinction is much clearer if you experience the wearable in multiple contexts rather than just one.

Regarding the code and circuit, it was actually mostly okay. While things didn’t always work, the only solution needed was to observe the problem, deduce what could be possible reasons for its occurrence, then test out my hypotheses one by one. Examples include mathematical errors and faulty wiring. I also did soldering part 2 for the microphone, and honestly the solution was just learning to recognise patterns of problems and solutions based on past mistakes, such as the solder not sticking to the iron (wiping more), or getting fingers burnt (plasters).

I also realise after a full day of reflection that I’m just incompetent at doing group work efficiently. Leaving me in charge is a generally bad idea.

Elizabeth’s Reflection:

For the most part I felt very challenged by the project, especially since it was the first time we were using and putting together components to make a circuit. For the physical fabrication portion, it was the first time I used a soldering iron, and my circuit looked very ugly after that; I don’t really think I improved in that aspect very much even after multiple attempts 🙁 When using the hot glue gun to insulate the exposed solder I think I made the circuit worse, because there was already a build-up of solder.

Also, I apparently did not solder the circuit the right way around. You can only solder your wires to one side of the LED strip, because the LEDs are fickle and like to have their electrical charge flowing in one direction. And do not solder and hot-glue your circuit until you are 100% sure it works; it saves you a lot of heart pain and time (thank you Lei and En Cui for dealing with my screw-ups D;).

I also made a few mistakes by accidentally piercing the LED strip’s digital pins, thinking I could sew it down that way. Thinking about it now, I should have known better than to pierce any part of the components.

Speaking of computers, I feel very attacked by my own computer. I think it has issues running the code we shared over Google Docs; it gave me a heart attack that I might have short-circuited the only RGB LED in the starter pack, and the circuit still refused to light even after I confirmed that I did not. I think there is something wrong with my computer DX. I will either leave the computer testing to En Cui or find a school computer for this (and pick the right computer, because not all of them have the Arduino IDE).

If we had a bit more time and I had a bit more skill in soldering, we would have liked to add more LED lights to reflect the change in sound.

 

Final Project, life update

From last week’s flow chart, En Cui and I worked on creating a mock-up circuit which follows a segment of the flow chart.

Mock Up model 1

We started with a basic microphone set up.

From here we tested whether we could get an input reading of the surrounding sounds through the serial monitor, and see the changes when we spoke into the microphone.
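
The serial test is essentially just printing analogRead() values; a minimal sketch, assuming the microphone module’s analogue output is on A0:

    const int micPin = A0;   // microphone module analogue out (assumption: A0)

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      // Print the raw reading: the Serial Monitor/Plotter shows the ambient
      // level, plus spikes when someone speaks into the microphone
      Serial.println(analogRead(micPin));
      delay(10);
    }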

Problems:

  • The Serial Monitor showed that the input was either ‘high’ at 1022/1023 or ‘low’ at 0.

Conclusion at this segment:

  • We thought our microphone was iffy

Nonetheless we continued; as the microphone was still able to detect sound, we decided it would be good enough for now and we would solve this issue later.

Mock Up model 2

Subsequently, we added onto the first model to include the LED output.

From here the code was expanded to control an RGB LED and to read the frequency and volume of the surrounding environment. Initially, the mapping was fairly random: taking the frequency as a three-digit number, the digit in the hundreds place set the share of red, the tens digit the share of blue, and the ones digit the share of green, which together made up the colour the LED would create (see the sketch below).
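
In code terms, that first mapping was something like the following reconstruction (the pin numbers and names are mine, not the original code):

    const int redPin = 9, greenPin = 10, bluePin = 11;  // RGB LED on PWM pins (assumption)

    // Early mapping: each decimal digit of the frequency becomes a colour share,
    // e.g. 742 Hz -> red from the 7, blue from the 4, green from the 2
    void showFrequency(int freq) {
      analogWrite(redPin,   ((freq / 100) % 10) * 255 / 9);  // hundreds digit
      analogWrite(bluePin,  ((freq / 10)  % 10) * 255 / 9);  // tens digit
      analogWrite(greenPin, ( freq        % 10) * 255 / 9);  // ones digit
    }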

Watch Video at:

Problems:

  • The colour of the light bulb was coming out a bit too randomly

So from there we attempted to group ranges of frequencies and match each to a colour. We then made it such that the volume is matched to the brightness of the LED.
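
A sketch of that revised mapping; the specific frequency bands and pins here are illustrative assumptions, not the values we settled on:

    const int redPin = 9, greenPin = 10, bluePin = 11;   // assumption: PWM pins

    // Revised mapping: frequency bands pick the colour, volume sets brightness
    void showSound(int freqHz, int volume) {             // volume: raw 0..1023
      int r = 0, g = 0, b = 0;
      if      (freqHz < 200)  { r = 255; }               // low rumble -> red
      else if (freqHz < 500)  { r = 255; g = 128; }      // low-mid    -> orange
      else if (freqHz < 1000) { g = 255; }               // mid        -> green
      else if (freqHz < 2000) { g = 128; b = 255; }      // high-mid   -> teal
      else                    { b = 255; }               // high       -> blue
      int brightness = map(volume, 0, 1023, 0, 255);     // louder = brighter
      analogWrite(redPin,   (int)((long)r * brightness / 255));
      analogWrite(greenPin, (int)((long)g * brightness / 255));
      analogWrite(bluePin,  (int)((long)b * brightness / 255));
    }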

Telok Ayer: Publishing a Zine

Finally with all the illustrations in place, I decided to compile them and add some of my thoughts onto the pages.

Page one was given a title. Looking back at it now, maybe I should not have put my name there; it kind of ruins the effect of the cover page. I wanted the title of this zine to reflect how hard I find it to describe Telok Ayer in a unique way.

The second page reflects the food that you get there; I wanted to present it in a sort of menu-like setting, hence the borders.

Page 3, I felt, was quite self-explanatory: random statues are placed around like tourist attractions, and everyone does just that in their presence.

Pages 4-5, I left them be.

Page 6: I remember looking at the shophouses during one of my trips there and wondering what the definition of preservation is. Whilst I believe it is good to repurpose these old shophouses, I think the area of Telok Ayer also begins to lose its uniqueness as it houses more modern shops now.

I let page 7 be as well.

Page 8: this was a statue as well. I remember looking at it and saying this was probably something iconic to Telok Ayer in the past; now it can only be used as an odd tourist attraction, hence the sort of cynical reply at the end.

I did not have much time to print, as I spent a lot of time redesigning the illustrations in Illustrator, but I ended up printing on art card, which I felt was sturdy and reflected the solid colours and lines of my graphics well.

   

Though maybe I could have explored the printing more.