Weekly-ish Updates

Things done:

  • Finished reading Speculative Everything, should have done it earlier lol
    • Better understanding of what a speculative work should be like, and how it is approached
    • Some parts I don’t agree with, but I will still respect the ideas.
    • I will be applying some of this new knowledge to my thought process
  • Sketched the form of the Chronophone

  • Tried wire prototyping with the forms
    • Able to capture the form well in 3D; I'll continue this prototyping method for future model testing

  • Tried more user testing with the sound
    • Wore it for about 2 hours per day
    • Able to vaguely remember 3 frequency ranges: 0~10 mins, 30~40 mins, 50~60 mins
    • I think this (backed by research too) is enough to say that it works. Mark says I don't really have to do all the tests since they're not crucial when I show it to others, which I think is quite true.
  • Exported a 24-hour sound file, proving that it's not too taxing on my computer to export more such files.

Next steps:

Priority:

  • 3D model (having some difficulties here 🙁 )
  • Heeding valuable advice from prof and Mark and modifying my project so it’s better

Not so important for now:

  • Think further about the sound — it doesn't have to be tones; it can be more imaginative and customisable
  • I still want to think about the form, as I don't think the current one fits very well
  • Now I kinda don't like that it references cyberpunk aesthetics, and I want to try to form my own aesthetic (after reading Speculative Everything)

The focus for now will still be the 3D model and getting the form out, so at least I have something to work with; then I'll work on the other parts. It doesn't matter if I don't use it in the end, I think.

011 / 012 – FYP Progress

Chronophone – Speculative Chronoceptive Headset

The device provides its user with time information through a constant stream of sonic feedback, using the sense of hearing as a substitute for a sense of time. The information transmitted gets processed in the hearing part of the brain; through prolonged exposure, one will be able to make sense of the time/sonic information, similar to the feeling of still being in the water after a long day of swimming. The device is worn on the head, with the sound transmitted through bone conduction to instil a better chronoception. This helps us have an internalised sense of time, instead of relying on watches, clocks, and alarms.

The concept behind this device is my take on a future line of wearable products that train us to tell time instinctively. The device is also an ultimate timing device that can train our rhythm, timing of intervals, etc.

Why time?

Time is a very big part of our lives, controlling our day and our activities. We wear wristwatches and install clocks in our homes and offices so we know what time it is. I ask: "How can sensory substitution take this further and improve the way we tell time?"

Circuitry for my device, made using Fritzing

Properties to play around with:

  • intervals
  • length of tones
  • number of reps
  • pitch tone
  • clicks
  • vibration?

Various combinations of feedback that I think will be good:

  • Constant interval; constant tone length; varying pitch tone; varying reps
  • Constant interval; varying tone length; varying pitch tone; varying reps
  • Constant interval; varying tone length; constant pitch tone; varying reps (like morse code)

Also, I think seconds should be the ones varying in reps & tone length (and maybe pitch?); minutes should vary in pitch and have intervals; hours should vary in pitch but be more of a constant background hum.

Experiments:

Constant intervals between seconds, minutes, hours, days (S, M, H, D); constant tone length; varying tone

  • very distracting and conveys information slowly
  • information gets very washed out after prolonged exposure: the minute and hour tones aren't very obvious

… … …

Week 11/12 (which week is it now??)

During a consult with prof last week, I proposed experimenting with TouchDesigner instead of a buzzer. Doing so allows me to stack the tones, which makes it easier to get a feel of what using the device will be like.

Just to recap, I think the ideal sounds will be:

  • Droning low hum for hours — signifies the length of an hour
  • Mid-low tone for minutes — something neutral and not too distracting
  • Clicks for seconds — much like the clock ticks we are used to, and not super disruptive.

With this, I set off to use a Clock CHOP and Audio Oscillator CHOP to create the tones. (BTW, I'll use an audio spectrum to determine the specific frequencies if I can edit this before showing it to y'all, so if you see this message it means I haven't done that.) (For now I think the unit is in beats per second? I think?) A rough sketch of the mapping logic is included after the list below.

  • For seconds, I used a pulse at a frequency of 1 bps, resulting in a single click every second. This is not ideal as I can't customise the properties of the "click" sound much, but for now we will use this.

  • For minutes, I used a Sine wave (was considering Gaussian too) with a frequency of 100~300, mapped across 0 to 59 minutes (within 1 hour). The effect is a nice tinnitus-like sound, which I think isn't too bad since it sounds quite natural. I try to keep the tone as clean as possible so it doesn't get too distracting, so Sine and Gaussian work best (but I chose Sine since it's cleaner).
  • I think the only issue I have with this is that every time it transitions from 59 min to 0 min, the sound changes drastically. I think this can be good and bad — good because people can immediately tell that it's a new hour; bad because it's too drastic.
  • Maybe I can use a lag (or was it some other function? I forgot) to transition the sound smoothly.

  • For hours, it's more complicated. I was trying to map the sounds from 0 to 11 hours, then from 13 to 23 hours. But that would mean the frequency of the droning sound increases from 12am till 12pm, then decreases after 1pm, which seems unintuitive.
  • So I used a bunch of Math and Logic CHOPs to work around it and set a more suitable curve: the frequency increases from an almost inaudible drone at 4am till 3pm. This is better because I think the droning should be most minimal when sleeping and most obvious when awake.
  • From 4pm till 3am, the droning frequency decreases, just as day goes into night.
  • The droning sound is a Triangle wave with its frequency set between 20 and 100 bps, giving a nice machine-like whirring sound which, to me, feels good to hear as it's similar to machines whirring in the background. It's kinda ambient, so I feel it's appropriate.
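As promised above, here is a rough C++ sketch of the same mapping logic, written outside TouchDesigner purely as a reference for myself. It assumes the 100~300 and 20~100 ranges are treated as frequencies in Hz, that the hour curve peaks at 3pm and bottoms out at 4am, and it just prints values instead of making sound (TouchDesigner handles the actual audio).

// Rough sketch of the Chronophone mapping, not the actual TouchDesigner network.
// Assumptions: minute range 100~300 Hz, hour range 20~100 Hz, hour drone peaks
// at 3pm and is lowest at 4am, seconds are just a flat 1 Hz click.
#include <cstdio>

// Minutes 0-59 map linearly onto a 100-300 Hz sine tone.
double minuteFreq(int minute) {
    return 100.0 + (300.0 - 100.0) * minute / 59.0;
}

// Hours map onto a 20-100 Hz triangle-wave drone: rising from 4am to a 3pm peak,
// then falling from 4pm back down to 3am.
double hourFreq(int hour) {
    int shifted = (hour + 20) % 24;               // 4am -> 0, 3pm -> 11, 3am -> 23
    if (shifted <= 11)
        return 20.0 + 80.0 * shifted / 11.0;      // rising half of the day
    return 100.0 - 80.0 * (shifted - 12) / 11.0;  // falling half of the day
}

int main() {
    for (int hour = 0; hour < 24; ++hour)
        printf("%02d:00  hour drone %5.1f Hz, minute tone starts at %5.1f Hz\n",
               hour, hourFreq(hour), minuteFreq(0));
    return 0;
}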
How the bone conduction headphone is worn
Shah’s bone conduction headphones
Buttons on the underside

Thoughts on this

I think what I can do next is to get an earpiece and wear it for a while, actively learning the frequencies and associating them with the actual time. I want to see what happens.

Any feedback on the sound? I want to finalise it a bit before testing it; anyway, it will take some time for the headphones to arrive, so I can refine the sound in the meantime.

Also, to take this further, I can add more functions (I will talk more about that later).

Application

Problem

So I think one big question is: what if we are in a place with loud noises? Will this device be useless? I borrowed Shah's bone conduction headphones for a while and found that they're not super effective in noise either. I think this is something I'll need to experiment with and work on.

Thoughts

Maybe I could market my device as an implant rather than a wearable, which would make more sense. But I think within the speculative setting, my device seems to fit in as it is.

Also, I can safely say that for this device I don't need to use an Arduino at all anymore. If TouchDesigner or any desktop program works, why not just use it? Since it's speculative, I don't need to explain how it works beyond the Bluetooth pairing. It could be a manufactured internal timing chip linked to GPS timing, with its internal smart system, blah blah blah, future technology.

The most I see myself building is a set of buttons that can be used for timers and other functions on the device, connected to a WiFi module and batteries, talking to Adafruit.io, which connects to the main program (TouchDesigner?), which is in turn Bluetooth-paired with the headset. If this works within the exhibition, I'm happy.
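As a placeholder for the button-to-Adafruit.io leg of that chain, here's a minimal sketch of what it could look like, assuming an ESP8266/ESP32-style board with the Adafruit IO Arduino library. The feed name, credentials, and button pin are all made up for illustration, not final choices.

#include "AdafruitIO_WiFi.h"

// All of these are placeholders, not real credentials.
#define IO_USERNAME "my_username"
#define IO_KEY      "my_aio_key"
#define WIFI_SSID   "my_wifi"
#define WIFI_PASS   "my_pass"

AdafruitIO_WiFi io(IO_USERNAME, IO_KEY, WIFI_SSID, WIFI_PASS);
AdafruitIO_Feed *timerFeed = io.feed("chronophone-timer"); // hypothetical feed name

const int buttonPin = 2;   // assumed wiring: button between pin 2 and GND
int lastState = HIGH;

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);
  io.connect();                               // connect to Adafruit IO over WiFi
  while (io.status() < AIO_CONNECTED) delay(500);
}

void loop() {
  io.run();                                   // keep the connection alive
  int state = digitalRead(buttonPin);
  if (state == LOW && lastState == HIGH) {    // button just pressed
    timerFeed->save(1);                       // publish a "start timer" event for TouchDesigner to pick up
    delay(250);                               // crude debounce
  }
  lastState = state;
}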

Building

I’ll be ordering some cheapo bone conducting headphones I found online and test them out. If they work, I will reskin them with a sick cyberpunk casing instead of building a whole new thing. Since it is bluetooth, I can pair this with a laptop and run this within the exhibition and users can wear it and walk around the room.

Sounds simple… I hope…

Further Thoughts

Anyway, I’m considering developing this device as much as I can as I see the potential. From here, I can add more functions like timers, or different sound modes, and I can add more sounds which I found interesting within just Audio Oscillator CHOP alone and I think there are more options to play with.

However I don’t want to overpromise also, so this is still just a thought.

Some other research

For another module, I had to write a short paper on anything related to design, so I wrote about my research topic and started from the root of my idea, inspired by Neil Harbisson, Moon Ribas, and the Cyborg Arts / Cyborg Foundation. After revisiting the research, I got clearer about my device, as their motto resonated with me and my concept.

I also discovered that Neil had already created something like what I'm working on now: a headpiece that tells time.

Image taken from https://thoughtworksarts.io/projects/time-sense/

https://thoughtworksarts.io/projects/time-sense/

The only difference is the interface: his uses heat around the head, which can be a little crude since heat isn't super precise, in my opinion. Mine uses tone pitch, which is more complex since I have to map appropriate pitches to the hours, minutes, and seconds. But seeing this makes me feel more validated; since someone has tried doing this before, it could probably work. And looking at how few wires I'll probably use… (actually none?!) makes me happier 😀

Moon Ribas also did something called Kaleidoscope Vision, where she wore goggles that allowed her to see only in colour blobs. She also did a thing called SpeedBorg, where she created an earring that detects surrounding speed and returns vibration feedback; she was then able to learn how fast people around her walk. Her learning to use the goggles and reframe her perspective, plus her being able to learn the speed of people around her and use it as art, gave me hope that my device can work if I test it by wearing it myself.

Image taken from https://www.lupiga.com/vijesti/katalonci-cudnim-zvukovima-odredili-boju-zagreba
Image taken from http://www.weltenschummler.com/tech-science/rp13-how-to-become-a-cyborg/

In short, I’m trying to create a device that turns people into cyborgs.

Just a small update to strengthen my concept.

Plans for next 2 weeks

  1. Order the parts I need
  2. (refine?) the sound
  3. Once done up, start wearing the device while syncing it with my laptop and actively learning the times
  4. Update Notion page (man I always get anxious when I don’t plan but it’s the least productive thing I ever do and I don’t really follow my plans anyway…)
  5. Start working on next product: Weather Sensograph… or the money sensor
  6. Presentation slides for second interim
    1. Refresher short intro on concept
    2. Progress documentation(?)
    3. Moodboard and sketches(?)
    4. Current prototype
    5. Timeline ahead

Small mental note

Anyway I’m happy with my current understanding of my project. Although still a bit dissatisfied that my points are still all over the place, and it’s hard for me to pinpoint what exactly is the crux of my concept and what backs it up, I’m no longer struggling with the thinking part of the project… At least for now. I hope I’ll be able to make sense of everything when I write my report šŸ™

009 / 010 – FYP Progress

What I have for now:

Ok, the past 2 weeks I was busy with other things, so I really apologise for the lack of progress 🙁 I'm finding it hard to cope with work-study and my side hustle, and I know it's affecting my FYP. I'm making an effort to reduce these external commitments and focus on FYP.

I have not followed through with making more prototypes, sketching, thinking about ideas, or even researching.

Anchorwatch

My only real progress was making one Anchorwatch prototype, which I showed on Wednesday. This was done last week.

Thanks Shah for allowing me to use his filament and 3D printer!

oops the part broke
the case for the Arduino Nano, which doubles as a cap for the part that houses the batteries
strap to improvise the lack of fitting

After building this, I think it won't work, because the result feels too direct to me: it doesn't feel impactful or convincing enough. As mentioned early in the semester, I'm doing this just for experimentation. Wearing it for a while didn't make me feel confident about it; it's wobbly and its haptic feedback is weak. Even if it may work, it's still a "copy" of Auger-Loizeau's Subliminal Watch. This definitely needs more work to develop further, but I'm already thinking of a better idea.

Instead of continuing to develop Anchorwatch, I want to apply the same concept of internalising chronoception through the "Sound Watch" mentioned in my previous post. This is, to me, a clearer and more multi-dimensional device than the Anchorwatch.

I think this has more potential. Why:

  • I’m using bone conduction, which directly links the device to our body, giving a much more obvious feedback.
  • The device feeds more information (hour, minutes, seconds), and potentially has the same effect as Anchorwatch where the user can learn intervals, except the user gets more out of this with more information.

For now this is enough to convince me that this will be a better idea. I will use what prof mentioned about finding relevant research to back this up.

Some things I have to think about during my experimentation:

  • What do hours, minutes, and seconds sound like? How do I make the sound feel natural to us, with almost zero disruption?
    • Seconds should sound short, clicky. 60 different tones for 60 seconds.
    • Minutes should be more balanced, a subtle tone in the middle range. 60 tones that gradually change as the minutes pass.
    • Hours should sound long, humming, and always in the background.
  • What tech should I use? How do I build it? I need a proper blueprint and I need to look for the right components.

I should also consider the appearance. For quick reference, here’s the look of the device I’m going for:

Image taken from https://www.claytonbartonartist.com/?_escaped_fragment_=sci-fi-soldier-helmets.jpg%2Fzoom%2Fcx2j%2Fimage1qky
Image taken from https://www.designboom.com/technology/hiroto-ikeuchi-cyberpunk-wearable-technology-04-02-2018/?utm_source=designboom+daily&utm_medium=email&utm_campaign=surreal,+yet+functional
Image taken from https://www.deviantart.com/edsfox/art/White-fox-sketch-560197326
Image taken from https://twitter.com/archillect/status/695210207601893377/photo/1
Image taken from https://www.artstation.com/artwork/qNBgL

Other things…

I don’t have anything else that is super concrete yet. I’ll spend more time this week to work on it. :’)

3-Week Plan Ahead

  • Week 11
    • First thing I’ll do this coming weekend is to reorganise all the accumulated research and consultations to make sense of everything. So when Monday comes I can immediately start refreshed. I’ll also update my FYP page (timeline, keywords, concepts, etc) for my own good.
    • Start getting hands dirty again with Sound Watch prototype, including trying to get the appearance to look like a proper product. I’ll be using buzzers first instead of bone conduction to get the feel of the sound first.
    • Research on the science that apply to Sound Watch and use them when describing my project next week.
    • Consult prof when I’m done (probably Friday)
  • Week 12
    • Start working on Weather Sensograph (or Third-I or Money Sense)
    • Going through the same process: do the appropriate research and not get overwhelmed by information
    • Start working on presentation slides for interim presentation
    • Consult prof when I’m done.
  • Week 13 (Friday 13th presentation)
    • Start working on the 3rd prototype (doesn't have to be finished)
    • Finish all slides
    • Rethink the concept and progress to ensure things are sound and clear
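For the buzzer stage mentioned under Week 11, something like the minimal Arduino sketch below is probably all I need to get a feel of the rhythm. It assumes a passive piezo buzzer on pin 8, and the 1000 Hz / 220 Hz values are placeholders rather than the final frequencies.

// Minimal feel-of-the-sound sketch, assuming a passive buzzer on pin 8.
// A short click marks every second and a longer mid tone marks each new minute.
const int buzzerPin = 8;

void setup() {
}

void loop() {
  for (int sec = 0; sec < 60; sec++) {
    if (sec == 0) {
      tone(buzzerPin, 220, 400);   // minute marker
    } else {
      tone(buzzerPin, 1000, 15);   // second "click"
    }
    delay(1000);                   // roughly one second per step
  }
}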

Material Research, Updates to concept & Moodboard

I’ve read up on materials to wear on hot and dry weather, which is like the desert. Cotton, Linen and Rayon works well for my concept. Polyester is also a possible choice (conceptually) as it is breathable, although not recommended for “summer weather”.

Looking into nomad dress, I was also curious why they wear layers and layers of fabric. It turns out they do this to prevent water loss. They use light fabrics like cotton, in lighter colours, so as to reflect the sun's rays. The layers help keep moisture close to the body, slowing down the rate of evaporation. Also, covering as much of the body as possible helps a lot in keeping cool.


For concept…

I need to type to think, so let me just write freely about my ideas.

New narrative to justify my new style:

In the future, with desertification, cities and citizens adapt. Citizens now dress to protect themselves from the elements while in the city. This makes it necessary for them to have lighting on themselves to express themselves and also to be seen. Reflective and holographic clothing makes people stand out more. Sensors that help them sense their environment are important, as they have to take shelter if it gets too hot or too dusty. There is a need to collect water from the dry air, so some have adaptations that collect moisture from the air as they move. ( https://www.sciencemag.org/news/2019/11/could-desert-beetle-help-humans-harvest-water-thin-air )

I need to start sketching now cos I keep changing my idea. But at least I’m happy that I’m moving away from the clunky sci-fi ripoff I’ve been doing. I’m so glad that we did the biomimicry exercise!

So here’s my moodboard, again. Updated. Finalised. I’m not gonna update anymore or else I’ll be super stuck.

Jumpsuit — The idea is to have a full-body cover that can protect one from the elements. It has to be light and airy. I’ll wire it up with conductive thread to connect the different pieces together. I’m also making it greyscale so I can focus on the material rather than the colours.

The reason I want to create separate pieces is that I've always thought about the feasibility of wearable electronics: if I can't wash my clothes, then what's the point of wearing them? Also, I want to keep this jumpsuit after the mod so I can use it hehehehehe so yep.

The respirator will be an organic, armour-like piece made with a plastic-like material (probably). The reason for plastic is so that it can be kept clean, as it's attached to a mask.

The envirosensor wasn’t much looked into previously as I thought it was going to be just an armguard thing. But I’m very inspired by the Warframe character armour on the left and thought about beetles (and also Golden Experience from JoJo :P). I kept looking and though that it will go well with the airbag looking thing, which I plan on using for the interface.

The illuminator will now be more organic and flowy. Kinda origami-like also. I want to make it blend well with the garment, not stand out too much. It should look like it’s part of the garment.

The armour is a last-minute add-on, which I think I won't have time to do, but I'll still put it here. It's basically a geometric, plated outer covering for the jumpsuit. I was thinking the tech for this would respond to temperature: if it's too warm, it opens up. But it's too much for me to handle, so I'll probably not touch it.

I was previously confused by this. I wanted my garment to have holographic-like properties, but I didn't know how to apply them. After looking at beetles, I realised their wings look like this material, so I think I will use it as a cape. I will also try to apply the biomimicry x tech idea where you add bumps to the surface of a hydrophobic material and coat the tips with a hydrophilic material, so it can collect dew from fog. This works for my desert concept, as people need water.

OKAY. So for now, enough with research. I have enough materials to justify my project, and I need to spend more time moving forward. Tomorrow and the next few days, I’ll go out to look at materials and prototype. I’ll also sketch more throughout the days.

AH UPDATE: ACTUALLY MAY NEED TO THINK ABOUT HAT AND EYEWEAR BUT SEE HOW……… :’)

Sources:

Beetles:

https://www.theguardian.com/sustainable-business/2016/apr/10/environmental-engineering-biomimicry-tapping-into-nature-business-water-soil-drought

https://www.sciencemag.org/news/2019/11/could-desert-beetle-help-humans-harvest-water-thin-air

https://www.sciencedaily.com/releases/2016/02/160224150731.htm

https://www.desertmuseum.org/books/nhsd_beetles.php

https://permaculturenews.org/2017/02/28/harvesting-water-inspired-darkling-beetle/

Collecting fog:

https://edition.cnn.com/2016/11/18/africa/fog-catchers-morocco/index.html

https://www.theverge.com/2018/6/8/17441496/fog-harvesting-water-scarcity-environment-crisis

https://en.wikipedia.org/wiki/Fog_collection

Desertwear:

Desert Travel and Survival

Clothings:

https://www.mariefranceasia.com/health/healthy-practices/wellbeing-tips/6-cooling-fabrics-hot-days-248941.html#item=1

Breathable Fabrics to Wear in Hot Weather

 

 

Noice, Metaverse!

Noise Metaverse was a pretty good experience for me, and it was really interesting to see interactive artists tackling different types of projects that are both cool and well presented in terms of aesthetics and subject matter. The artworks mostly succeeded in allowing their audience to become immersed, even if just for a moment.

What defines immersiveness? My take is that it's how a work allows the participant to be consciously and subconsciously engaged with it. An immersive work should be intuitive, where one does not need to overthink the process. It has to be direct: what we see should be what we get, so it is more familiar to our own physical reality. I would say there is also emotional immersion, where we spend energy investing our deeper feelings into an artwork. In this post I will talk about a physically immersive work and an emotionally immersive work.

  1. Dreamscape by Zenith Chan

Dreamscape is a virtual reality world that allows the audience to look around, walk around to explore, and listen to the ambience. It is really immersive as it engages two of a person's main senses: sight and hearing. By isolating these two senses in the virtual reality world, one feels like one is within that world, immersed and 'plugged in'. The visual cues also pull the participant in: by animating some parts of the world while keeping it mostly static, one is able to perceive the tiny movements, which heightens our awareness. When I was wearing the VR headset and looking around, I kept asking, "is that thing gonna move?" I feel that the longer we use the VR headset, the more connected we feel to the virtual space, allowing us to immerse ourselves further. Honestly, I would have stayed in this virtual world for as long as I could to explore its different corners. Unfortunately, there was a time constraint as I had to move on, and the wires also restricted my exploration of the world.

After learning that the artist had hand-painted the entire world using the VR painting tool, I was deeply impressed by the artist's perseverance and skill in painting something at such a large scale and in such detail.

To me, virtual reality is potentially the most immersive medium one can explore (currently, who knows what will happen in the future?). It is not just the amount of 3D space one can explore in a virtual world, but also how one can interact with the environment using certain controllers.

2. Time Space (If Things Could Change) by Akai Chew

A clock with a mirror as its base, connected to a shredder, illuminated by a single spotlight. This installation draws viewers' attention despite standing motionless amongst all the visual stimuli around the exhibition. Visitors are encouraged to pick up the paper, write something they wish to forget, and shred it with the shredder. The outcome: the clock winds back as the paper gets shredded, which symbolises not just the removal of the unwanted memory, but also the unwinding of time. This means the complete removal of the event, metaphorically and emotionally, which can relieve some participants' pain despite the fact that the event is not actually erased at all.

The shredded paper becomes part of the artwork, lying in a cluster on the floor, mixed with other unwanted memories. I would imagine that the shredded paper will be thrown out eventually, symbolising the cleansing of those events, but at the same time also rendering the shredded unwanted memories intangible. Does this mean that the cleansing of the memory is only temporary? Despite this, the clock, to me, symbolises the collective time rewound by all the visitors, which means that the rewind on the clock is permanent. The relationship between the intangible (the rewind of the clock) and the tangible (the shredded paper) is what's unsettling for me.

In terms of immersion, I find myself not visually or physically immersed but rather emotionally immersed. The idea of putting in a memory I want to forget draws me back to an unhappy time. The thought process of putting the memory down on paper made me think and rethink, over and over again, the series of events and the decision of whether to 'shred' it or not. By putting the event or name down, I put in a lot of my past and invest myself in the artwork, which turns out to be rewarding when I watch the paper shred and the clock rewind. At that moment, I understood the artwork, and I felt relieved, for after that moment I knew that, even if only for a few hours, I would have that memory gone, having put in the effort to remove it from my brain.

Honorable mentions:

  1. Aistronaut by Jake Tan Jun Kang
  2. Void Deck Whispers by Tay Yu Xian

Programming Final – Strang

Our Research:

So we wanted to work on something like a clock. After Yue Ling and I looked at some online examples and ideas, we decided to settle on an infinity mirror with LED lights that change as the user interacts with the clock.

As a clock measures time, we wanted to play with the idea of a clock not measuring what it's supposed to measure. By changing the speed and colour of the clock through interaction, it creates an inaccurate idea of time and, as such, becomes an interesting observation of our relationship with interaction and time.

Our Proposal:

Our Project

Yue Ling and I went to Sim Lim and Art Friend to purchase the items, and after getting some help from Charissa, Jia Yi, Nok Wan, Mark, and Vienna, we managed to get the items we needed!

Assembly

Fixing up the main structure of the clock

Applying of black one-way mirror film

After assembly, we put in the LED

The Prototype Code:

Yue Ling found an example online:

Guide for WS2812B Addressable RGB LED Strip with Arduino

With this example, we were able to use a timer to change the LED states. As we were unable to find an elegant way of making the colours change smoothly, we had to create different sets of colours and assign different timings to each set:

#include <FastLED.h>

#define LED_PIN 13
#define NUM_LEDS 51
#define BRIGHTNESS 200
#define LED_TYPE WS2811
#define COLOR_ORDER GRB
CRGB leds[NUM_LEDS];

#define UPDATES_PER_SECOND 100

CRGBPalette16 currentPalette;
TBlendType currentBlending;

extern CRGBPalette16 myRedWhiteBluePalette;
extern const TProgmemPalette16 myRedWhiteBluePalette_p PROGMEM;

int x = 1000;
int xspeed = 100;
int p1, p2, p3;
int o1, o2, o3;

void setup() {
delay( 3000 ); // power-up safety delay
FastLED.addLeds<LED_TYPE, LED_PIN, COLOR_ORDER>(leds, NUM_LEDS).setCorrection( TypicalLEDStrip );
FastLED.setBrightness( BRIGHTNESS );

// currentPalette = RainbowColors_p;
currentBlending = LINEARBLEND;
}

void loop()
{

ChangePalettePeriodically();

static uint8_t startIndex = 0;
startIndex = startIndex + 1;

FillLEDsFromPaletteColors( startIndex);

FastLED.show();
FastLED.delay(x / UPDATES_PER_SECOND);
}

void FillLEDsFromPaletteColors( uint8_t colorIndex)
{
uint8_t brightness = 255;

for ( int i = 0; i < NUM_LEDS; i++) {
leds[i] = ColorFromPalette( currentPalette, colorIndex, brightness, currentBlending);
colorIndex += 3;
}
}

void ChangePalettePeriodically()
{
uint8_t secondHand = (millis() / 1000) % 21;
static uint8_t lastSecond = 99;

if ( lastSecond != secondHand) {
lastSecond = secondHand;
if ( secondHand == 0) {
SetupPurpleAndGreenPalette();
currentBlending = LINEARBLEND;
}
if ( secondHand == 1) {
SetupPurpleAndGreenPalette();
currentBlending = LINEARBLEND;
}
if ( secondHand == 2) {
SetupPurpleAndGreenPalette2();
currentBlending = LINEARBLEND;
}
if ( secondHand == 3) {
SetupPurpleAndGreenPalette3();
currentBlending = LINEARBLEND;
}
if ( secondHand == 4) {
SetupPurpleAndGreenPalette4();
currentBlending = LINEARBLEND;
}
if ( secondHand == 5) {
SetupPurpleAndGreenPalette5();
currentBlending = LINEARBLEND;
}
if ( secondHand == 6) {
SetupPurpleAndGreenPalette6();
currentBlending = LINEARBLEND;
}
if ( secondHand == 7) {
SetupPurpleAndGreenPalette7();
currentBlending = LINEARBLEND;
}
if ( secondHand == 8) {
SetupPurpleAndGreenPalette8();
currentBlending = LINEARBLEND;
}
if ( secondHand == 9) {
SetupPurpleAndGreenPalette9();
currentBlending = LINEARBLEND;
}
if ( secondHand == 10) {
SetupPurpleAndGreenPalette10();
currentBlending = LINEARBLEND;
}
if ( secondHand == 11) {
SetupPurpleAndGreenPalette11();
currentBlending = LINEARBLEND;
}
if ( secondHand == 12) {
SetupPurpleAndGreenPalette10();
currentBlending = LINEARBLEND;
}
if ( secondHand == 13) {
SetupPurpleAndGreenPalette9();
currentBlending = LINEARBLEND;
}
if ( secondHand == 14) {
SetupPurpleAndGreenPalette8();
currentBlending = LINEARBLEND;
}
if ( secondHand == 15) {
SetupPurpleAndGreenPalette7();
currentBlending = LINEARBLEND;
}
if ( secondHand == 16) {
SetupPurpleAndGreenPalette6();
currentBlending = LINEARBLEND;
}
if ( secondHand == 17) {
SetupPurpleAndGreenPalette5();
currentBlending = LINEARBLEND;
}
if ( secondHand == 18) {
SetupPurpleAndGreenPalette4();
currentBlending = LINEARBLEND;
}
if ( secondHand == 19) {
SetupPurpleAndGreenPalette3();
currentBlending = LINEARBLEND;
}
if ( secondHand == 20) {
SetupPurpleAndGreenPalette2();
currentBlending = LINEARBLEND;
}
if ( secondHand == 21) {
SetupPurpleAndGreenPalette();
currentBlending = LINEARBLEND;
}
}
}

void SetupPurpleAndGreenPalette()
{
CRGB pink = CRGB( 250, 0, 100);
CRGB pink2 = CRGB( 225, 0, 100);
CRGB pink3 = CRGB( 200, 0, 100);
CRGB pink4 = CRGB( 175, 0, 100);
CRGB purple1 = CRGB( 150, 0, 100);
CRGB purple2 = CRGB( 125, 0, 100);
CRGB purple3 = CRGB( 75, 0, 100);
CRGB purple4 = CRGB( 75, 0, 125);
CRGB purple5 = CRGB( 75, 0, 150);
CRGB purple6 = CRGB( 75, 25, 150);
CRGB purple7 = CRGB( 75, 50, 150);
CRGB orange = CRGB( 250, 100, 50);
CRGB orange2 = CRGB(200, 100, 50);
CRGB orange3 = CRGB(150, 100, 50);
CRGB orange4 = CRGB(100, 100, 50);
CRGB orange5 = CRGB(50, 100, 50);
CRGB orange6 = CRGB(0, 100, 50);
CRGB orange7 = CRGB(0, 100, 100);
CRGB orange8 = CRGB(0, 100, 150);
CRGB orange9 = CRGB(0, 100, 175);
CRGB orange10 = CRGB(0, 100, 200);
CRGB orange11 = CRGB(0, 100, 250);
CRGB yellow = CRGB(255, 255, 0);
CRGB green = CRGB(100, 255, 0);
CRGB turquoise = CRGB(0, 255, 100);
CRGB grue = CRGB( 0, 100, 255);
CRGB purple = CRGB( 255, 100, 255);
CRGB black = CRGB::Black;
currentPalette = CRGBPalette16(
pink, pink, orange, orange,
orange, orange, pink, pink,
pink, pink, orange, orange,
orange, orange, pink, pink );
}

void SetupPurpleAndGreenPalette2()
{

CRGB pink = CRGB( 250, 0, 100);
CRGB pink2 = CRGB( 225, 0, 100);
CRGB pink3 = CRGB( 200, 0, 100);
CRGB pink4 = CRGB( 175, 0, 100);
CRGB purple1 = CRGB( 150, 0, 100);
CRGB purple2 = CRGB( 125, 0, 100);
CRGB purple3 = CRGB( 75, 0, 100);
CRGB purple4 = CRGB( 75, 0, 125);
CRGB purple5 = CRGB( 75, 0, 150);
CRGB purple6 = CRGB( 75, 25, 150);
CRGB purple7 = CRGB( 75, 50, 150);
CRGB orange = CRGB( 250, 100, 50);
CRGB orange2 = CRGB(200, 100, 50);
CRGB orange3 = CRGB(150, 100, 50);
CRGB orange4 = CRGB(100, 100, 50);
CRGB orange5 = CRGB(50, 100, 50);
CRGB orange6 = CRGB(0, 100, 50);
CRGB orange7 = CRGB(0, 100, 100);
CRGB orange8 = CRGB(0, 100, 150);
CRGB orange9 = CRGB(0, 100, 175);
CRGB orange10 = CRGB(0, 100, 200);
CRGB orange11 = CRGB(0, 100, 250);
CRGB yellow = CRGB(255, 255, 0);
CRGB green = CRGB(100, 255, 0);
CRGB turquoise = CRGB(0, 255, 100);
CRGB grue = CRGB( 0, 100, 255);
CRGB purple = CRGB( 255, 100, 255);
CRGB black = CRGB::Black;
currentPalette = CRGBPalette16(
pink2, pink2, orange2, orange2,
orange2, orange2, pink2, pink2,
pink2, pink2, orange2, orange2,
orange2, orange2, pink2, pink2 );
}

void SetupPurpleAndGreenPalette3()
{
CRGB pink = CRGB( 250, 0, 100);
CRGB pink2 = CRGB( 225, 0, 100);
CRGB pink3 = CRGB( 200, 0, 100);
CRGB pink4 = CRGB( 175, 0, 100);
CRGB purple1 = CRGB( 150, 0, 100);
CRGB purple2 = CRGB( 125, 0, 100);
CRGB purple3 = CRGB( 75, 0, 100);
CRGB purple4 = CRGB( 75, 0, 125);
CRGB purple5 = CRGB( 75, 0, 150);
CRGB purple6 = CRGB( 75, 25, 150);
CRGB purple7 = CRGB( 75, 50, 150);
CRGB orange = CRGB( 250, 100, 50);
CRGB orange2 = CRGB(200, 100, 50);
CRGB orange3 = CRGB(150, 100, 50);
CRGB orange4 = CRGB(100, 100, 50);
CRGB orange5 = CRGB(50, 100, 50);
CRGB orange6 = CRGB(0, 100, 50);
CRGB orange7 = CRGB(0, 100, 100);
CRGB orange8 = CRGB(0, 100, 150);
CRGB orange9 = CRGB(0, 100, 175);
CRGB orange10 = CRGB(0, 100, 200);
CRGB orange11 = CRGB(0, 100, 250);
CRGB yellow = CRGB(255, 255, 0);
CRGB green = CRGB(100, 255, 0);
CRGB turquoise = CRGB(0, 255, 100);
CRGB grue = CRGB( 0, 100, 255);
CRGB purple = CRGB( 255, 100, 255);
CRGB black = CRGB::Black;
currentPalette = CRGBPalette16(
pink3, pink3, orange3, orange3,
orange3, orange3, pink3, pink3,
pink3, pink3, orange3, orange3,
orange3, orange3, pink3, pink3 );
}
void SetupPurpleAndGreenPalette4()
{
CRGB pink = CRGB( 250, 0, 100);
CRGB pink2 = CRGB( 225, 0, 100);
CRGB pink3 = CRGB( 200, 0, 100);
CRGB pink4 = CRGB( 175, 0, 100);
CRGB purple1 = CRGB( 150, 0, 100);
CRGB purple2 = CRGB( 125, 0, 100);
CRGB purple3 = CRGB( 75, 0, 100);
CRGB purple4 = CRGB( 75, 0, 125);
CRGB purple5 = CRGB( 75, 0, 150);
CRGB purple6 = CRGB( 75, 25, 150);
CRGB purple7 = CRGB( 75, 50, 150);
CRGB orange = CRGB( 250, 100, 50);
CRGB orange2 = CRGB(200, 100, 50);
CRGB orange3 = CRGB(150, 100, 50);
CRGB orange4 = CRGB(100, 100, 50);
CRGB orange5 = CRGB(50, 100, 50);
CRGB orange6 = CRGB(0, 100, 50);
CRGB orange7 = CRGB(0, 100, 100);
CRGB orange8 = CRGB(0, 100, 150);
CRGB orange9 = CRGB(0, 100, 175);
CRGB orange10 = CRGB(0, 100, 200);
CRGB orange11 = CRGB(0, 100, 250);
CRGB yellow = CRGB(255, 255, 0);
CRGB green = CRGB(100, 255, 0);
CRGB turquoise = CRGB(0, 255, 100);
CRGB grue = CRGB( 0, 100, 255);
CRGB purple = CRGB( 255, 100, 255);
CRGB black = CRGB::Black;
currentPalette = CRGBPalette16(
pink4, pink4, orange4, orange4,
orange4, orange4, pink4, pink4,
pink4, pink4, orange4, orange4,
orange4, orange4, pink4, pink4 );
}

void SetupPurpleAndGreenPalette5()
{
CRGB pink = CRGB( 250, 0, 100);
CRGB pink2 = CRGB( 225, 0, 100);
CRGB pink3 = CRGB( 200, 0, 100);
CRGB pink4 = CRGB( 175, 0, 100);
CRGB purple1 = CRGB( 150, 0, 100);
CRGB purple2 = CRGB( 125, 0, 100);
CRGB purple3 = CRGB( 75, 0, 100);
CRGB purple4 = CRGB( 75, 0, 125);
CRGB purple5 = CRGB( 75, 0, 150);
CRGB purple6 = CRGB( 75, 25, 150);
CRGB purple7 = CRGB( 75, 50, 150);
CRGB orange = CRGB( 250, 100, 50);
CRGB orange2 = CRGB(200, 100, 50);
CRGB orange3 = CRGB(150, 100, 50);
CRGB orange4 = CRGB(100, 100, 50);
CRGB orange5 = CRGB(50, 100, 50);
CRGB orange6 = CRGB(0, 100, 50);
CRGB orange7 = CRGB(0, 100, 100);
CRGB orange8 = CRGB(0, 100, 150);
CRGB orange9 = CRGB(0, 100, 175);
CRGB orange10 = CRGB(0, 100, 200);
CRGB orange11 = CRGB(0, 100, 250);
CRGB yellow = CRGB(255, 255, 0);
CRGB green = CRGB(100, 255, 0);
CRGB turquoise = CRGB(0, 255, 100);
CRGB grue = CRGB( 0, 100, 255);
CRGB purple = CRGB( 255, 100, 255);
CRGB black = CRGB::Black;
currentPalette = CRGBPalette16(
purple1, purple1, orange5, orange5,
orange5, orange5, purple1, purple1,
purple1, purple1, orange5, orange5,
orange5, orange5, purple1, purple1 );

}

void SetupPurpleAndGreenPalette6()
{
CRGB pink = CRGB( 250, 0, 100);
CRGB pink2 = CRGB( 225, 0, 100);
CRGB pink3 = CRGB( 200, 0, 100);
CRGB pink4 = CRGB( 175, 0, 100);
CRGB purple1 = CRGB( 150, 0, 100);
CRGB purple2 = CRGB( 125, 0, 100);
CRGB purple3 = CRGB( 75, 0, 100);
CRGB purple4 = CRGB( 75, 0, 125);
CRGB purple5 = CRGB( 75, 0, 150);
CRGB purple6 = CRGB( 75, 25, 150);
CRGB purple7 = CRGB( 75, 50, 150);
CRGB orange = CRGB( 250, 100, 50);
CRGB orange2 = CRGB(200, 100, 50);
CRGB orange3 = CRGB(150, 100, 50);
CRGB orange4 = CRGB(100, 100, 50);
CRGB orange5 = CRGB(50, 100, 50);
CRGB orange6 = CRGB(0, 100, 50);
CRGB orange7 = CRGB(0, 100, 100);
CRGB orange8 = CRGB(0, 100, 150);
CRGB orange9 = CRGB(0, 100, 175);
CRGB orange10 = CRGB(0, 100, 200);
CRGB orange11 = CRGB(0, 100, 250);
CRGB yellow = CRGB(255, 255, 0);
CRGB green = CRGB(100, 255, 0);
CRGB turquoise = CRGB(0, 255, 100);
CRGB grue = CRGB( 0, 100, 255);
CRGB purple = CRGB( 255, 100, 255);
CRGB black = CRGB::Black;
currentPalette = CRGBPalette16(
purple2, purple2, orange6, orange6,
orange6, orange6, purple2, purple2,
purple2, purple2, orange6, orange6,
orange6, orange6, purple2, purple2 );

}

void SetupPurpleAndGreenPalette7()
{
CRGB pink = CRGB( 250, 0, 100);
CRGB pink2 = CRGB( 225, 0, 100);
CRGB pink3 = CRGB( 200, 0, 100);
CRGB pink4 = CRGB( 175, 0, 100);
CRGB purple1 = CRGB( 150, 0, 100);
CRGB purple2 = CRGB( 125, 0, 100);
CRGB purple3 = CRGB( 75, 0, 100);
CRGB purple4 = CRGB( 75, 0, 125);
CRGB purple5 = CRGB( 75, 0, 150);
CRGB purple6 = CRGB( 75, 25, 150);
CRGB purple7 = CRGB( 75, 50, 150);
CRGB orange = CRGB( 250, 100, 50);
CRGB orange2 = CRGB(200, 100, 50);
CRGB orange3 = CRGB(150, 100, 50);
CRGB orange4 = CRGB(100, 100, 50);
CRGB orange5 = CRGB(50, 100, 50);
CRGB orange6 = CRGB(0, 100, 50);
CRGB orange7 = CRGB(0, 100, 100);
CRGB orange8 = CRGB(0, 100, 150);
CRGB orange9 = CRGB(0, 100, 175);
CRGB orange10 = CRGB(0, 100, 200);
CRGB orange11 = CRGB(0, 100, 250);
CRGB yellow = CRGB(255, 255, 0);
CRGB green = CRGB(100, 255, 0);
CRGB turquoise = CRGB(0, 255, 100);
CRGB grue = CRGB( 0, 100, 255);
CRGB purple = CRGB( 255, 100, 255);
CRGB black = CRGB::Black;
currentPalette = CRGBPalette16(
purple3, purple3, orange7, orange7,
orange7, orange7, purple3, purple3,
purple3, purple3, orange7, orange7,
orange7, orange7, purple3, purple3 );

}

void SetupPurpleAndGreenPalette8()
{
CRGB pink = CRGB( 250, 0, 100);
CRGB pink2 = CRGB( 225, 0, 100);
CRGB pink3 = CRGB( 200, 0, 100);
CRGB pink4 = CRGB( 175, 0, 100);
CRGB purple1 = CRGB( 150, 0, 100);
CRGB purple2 = CRGB( 125, 0, 100);
CRGB purple3 = CRGB( 75, 0, 100);
CRGB purple4 = CRGB( 75, 0, 125);
CRGB purple5 = CRGB( 75, 0, 150);
CRGB purple6 = CRGB( 75, 25, 150);
CRGB purple7 = CRGB( 75, 50, 150);
CRGB orange = CRGB( 250, 100, 50);
CRGB orange2 = CRGB(200, 100, 50);
CRGB orange3 = CRGB(150, 100, 50);
CRGB orange4 = CRGB(100, 100, 50);
CRGB orange5 = CRGB(50, 100, 50);
CRGB orange6 = CRGB(0, 100, 50);
CRGB orange7 = CRGB(0, 100, 100);
CRGB orange8 = CRGB(0, 100, 150);
CRGB orange9 = CRGB(0, 100, 175);
CRGB orange10 = CRGB(0, 100, 200);
CRGB orange11 = CRGB(0, 100, 250);
CRGB yellow = CRGB(255, 255, 0);
CRGB green = CRGB(100, 255, 0);
CRGB turquoise = CRGB(0, 255, 100);
CRGB grue = CRGB( 0, 100, 255);
CRGB purple = CRGB( 255, 100, 255);
CRGB black = CRGB::Black;
currentPalette = CRGBPalette16(
purple4, purple4, orange8, orange8,
orange8, orange8, purple4, purple4,
purple4, purple4, orange8, orange8,
orange8, orange8, purple4, purple4 );

}

void SetupPurpleAndGreenPalette9()
{
CRGB pink = CRGB( 250, 0, 100);
CRGB pink2 = CRGB( 225, 0, 100);
CRGB pink3 = CRGB( 200, 0, 100);
CRGB pink4 = CRGB( 175, 0, 100);
CRGB purple1 = CRGB( 150, 0, 100);
CRGB purple2 = CRGB( 125, 0, 100);
CRGB purple3 = CRGB( 75, 0, 100);
CRGB purple4 = CRGB( 75, 0, 125);
CRGB purple5 = CRGB( 75, 0, 150);
CRGB purple6 = CRGB( 75, 25, 150);
CRGB purple7 = CRGB( 75, 50, 150);
CRGB orange = CRGB( 250, 100, 50);
CRGB orange2 = CRGB(200, 100, 50);
CRGB orange3 = CRGB(150, 100, 50);
CRGB orange4 = CRGB(100, 100, 50);
CRGB orange5 = CRGB(50, 100, 50);
CRGB orange6 = CRGB(0, 100, 50);
CRGB orange7 = CRGB(0, 100, 100);
CRGB orange8 = CRGB(0, 100, 150);
CRGB orange9 = CRGB(0, 100, 175);
CRGB orange10 = CRGB(0, 100, 200);
CRGB orange11 = CRGB(0, 100, 250);
CRGB yellow = CRGB(255, 255, 0);
CRGB green = CRGB(100, 255, 0);
CRGB turquoise = CRGB(0, 255, 100);
CRGB grue = CRGB( 0, 100, 255);
CRGB purple = CRGB( 255, 100, 255);
CRGB black = CRGB::Black;
currentPalette = CRGBPalette16(
purple5, purple5, orange9, orange9,
orange9, orange9, purple5, purple5,
purple5, purple5, orange9, orange9,
orange9, orange9, purple5, purple5);
}

void SetupPurpleAndGreenPalette10()
{
CRGB pink = CRGB( 250, 0, 100);
CRGB pink2 = CRGB( 225, 0, 100);
CRGB pink3 = CRGB( 200, 0, 100);
CRGB pink4 = CRGB( 175, 0, 100);
CRGB purple1 = CRGB( 150, 0, 100);
CRGB purple2 = CRGB( 125, 0, 100);
CRGB purple3 = CRGB( 75, 0, 100);
CRGB purple4 = CRGB( 75, 0, 125);
CRGB purple5 = CRGB( 75, 0, 150);
CRGB purple6 = CRGB( 75, 25, 150);
CRGB purple7 = CRGB( 75, 50, 150);
CRGB orange = CRGB( 250, 100, 50);
CRGB orange2 = CRGB(200, 100, 50);
CRGB orange3 = CRGB(150, 100, 50);
CRGB orange4 = CRGB(100, 100, 50);
CRGB orange5 = CRGB(50, 100, 50);
CRGB orange6 = CRGB(0, 100, 50);
CRGB orange7 = CRGB(0, 100, 100);
CRGB orange8 = CRGB(0, 100, 150);
CRGB orange9 = CRGB(0, 100, 175);
CRGB orange10 = CRGB(0, 100, 200);
CRGB orange11 = CRGB(0, 100, 250);
CRGB yellow = CRGB(255, 255, 0);
CRGB green = CRGB(100, 255, 0);
CRGB turquoise = CRGB(0, 255, 100);
CRGB grue = CRGB( 0, 100, 255);
CRGB purple = CRGB( 255, 100, 255);
CRGB black = CRGB::Black;
currentPalette = CRGBPalette16(
purple6, purple6, orange10, orange10,
orange10, orange10, purple6, purple6,
purple6, purple6, orange10, orange10,
orange10, orange10, purple6, purple6);
}

void SetupPurpleAndGreenPalette11()
{
CRGB pink = CRGB( 250, 0, 100);
CRGB pink2 = CRGB( 225, 0, 100);
CRGB pink3 = CRGB( 200, 0, 100);
CRGB pink4 = CRGB( 175, 0, 100);
CRGB purple1 = CRGB( 150, 0, 100);
CRGB purple2 = CRGB( 125, 0, 100);
CRGB purple3 = CRGB( 75, 0, 100);
CRGB purple4 = CRGB( 75, 0, 125);
CRGB purple5 = CRGB( 75, 0, 150);
CRGB purple6 = CRGB( 75, 25, 150);
CRGB purple7 = CRGB( 75, 50, 150);
CRGB orange = CRGB( 250, 100, 50);
CRGB orange2 = CRGB(200, 100, 50);
CRGB orange3 = CRGB(150, 100, 50);
CRGB orange4 = CRGB(100, 100, 50);
CRGB orange5 = CRGB(50, 100, 50);
CRGB orange6 = CRGB(0, 100, 50);
CRGB orange7 = CRGB(0, 100, 100);
CRGB orange8 = CRGB(0, 100, 150);
CRGB orange9 = CRGB(0, 100, 175);
CRGB orange10 = CRGB(0, 100, 200);
CRGB orange11 = CRGB(0, 100, 250);
CRGB yellow = CRGB(255, 255, 0);
CRGB green = CRGB(100, 255, 0);
CRGB turquoise = CRGB(0, 255, 100);
CRGB grue = CRGB( 0, 100, 255);
CRGB purple = CRGB( 255, 100, 255);
CRGB black = CRGB::Black;
currentPalette = CRGBPalette16(
purple7, purple7, orange11, orange11,
orange11, orange11, purple7, purple7,
purple7, purple7, orange11, orange11,
orange11, orange11, purple7, purple7);
}

As you can see, we created the combinations of different colours manually, which is not practical. We hope to improve on this.
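One improvement we might try (untested, just a sketch) is to let FastLED do the crossfading for us with nblendPaletteTowardPalette(), which nudges the current palette towards a target palette a little on every call, instead of writing one function per colour step. The colour values and the 42-second cycle below are placeholders, and the function would slot into the existing sketch in place of ChangePalettePeriodically(), reusing its currentPalette and FastLED include.

// Sketch of a smoother alternative to the eleven palette functions above.
// Reuses currentPalette and #include <FastLED.h> from the existing sketch.
CRGBPalette16 targetPalette;

void ChangePaletteSmoothly()
{
  uint8_t secondHand = (millis() / 1000) % 42;

  // First half of the cycle drifts towards pink/orange, second half towards purple/blue.
  if (secondHand < 21)
    targetPalette = CRGBPalette16(CRGB(250, 0, 100), CRGB(250, 100, 50));
  else
    targetPalette = CRGBPalette16(CRGB(75, 50, 150), CRGB(0, 100, 250));

  // 8 = max change per colour channel per call, so the blend happens gradually
  // as long as this is called once per loop() pass.
  nblendPaletteTowardPalette(currentPalette, targetPalette, 8);
}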

Prototype outcome:

So I tried to fix everything up and here is the result:

Ultrasound

So Brendan worked on the ultrasonic sensors after looking through some code online:

https://create.arduino.cc/projecthub/unexpectedmaker/ultrasoniceyes-b9fd38?ref=tag&ref_id=ultrasonic&offset=2

https://www.hackster.io/gowrisomanath/the-positronic-brain-my-techno-heart-9084be

He wrote code that detects how far away an object is, in cm:

const int trigPin = 9;
const int echoPin = 10;
const int ledPin = 13;

// defines variables
long duration;//travel time
int distance;//travel distance
int safetyDistance;

void setup() {
pinMode(trigPin, OUTPUT); // sets the trigPin as an Output
pinMode(echoPin, INPUT); // sets the echoPin as an Input
pinMode(ledPin, OUTPUT);
Serial.begin(9600); // Starts the serial communication for serial monitor
}
void loop() {
// Check trigPin
digitalWrite(trigPin, LOW);
delayMicroseconds(2);
// generate ultrasound wave at HIGH state
// Sets the trigPin to HIGH state for 10 microseconds
digitalWrite(trigPin, HIGH);
delayMicroseconds(10);
digitalWrite(trigPin, LOW);

// reads the echoPin, read the sound wave travel time
duration = pulseIn(echoPin, HIGH);
// Calculating the distance
distance= duration*0.034/2;
//blinker
safetyDistance = distance;
if (safetyDistance <= 20) {
digitalWrite(ledPin, HIGH);
}
else{
digitalWrite(ledPin, LOW);
}
Serial.print("Distance: ");
Serial.println(distance);

}

With this code, we merged the lights code with the ultrasonic sensor code and simplified it to be slightly more elegant:

const int trigPin = 9;
const int echoPin = 10;
const int ledPin = 13;
const int flexPin = A0; //pin A0 to read analog input

// defines variables
long duration;//travel time
int distance;//travel distance
int safetyDistance;
int value; //save analog value

#include <FastLED.h>

#define LED_PIN 13
#define NUM_LEDS 51
#define BRIGHTNESS 200
#define LED_TYPE WS2811
#define COLOR_ORDER GRB
CRGB leds[NUM_LEDS];

#define UPDATES_PER_SECOND 100

CRGBPalette16 currentPalette;
TBlendType currentBlending;

extern CRGBPalette16 myRedWhiteBluePalette;
extern const TProgmemPalette16 myRedWhiteBluePalette_p PROGMEM;

int x;
int a, b, c;
int d, e, f;
int col1, col2;

void setup() {
pinMode(trigPin, OUTPUT); // sets the trigPin as an Output
pinMode(echoPin, INPUT); // sets the echoPin as an Input
pinMode(ledPin, OUTPUT);
Serial.begin(9600); // Starts the serial communication for serial monitor

FastLED.addLeds<LED_TYPE, LED_PIN, COLOR_ORDER>(leds, NUM_LEDS).setCorrection( TypicalLEDStrip );
FastLED.setBrightness( BRIGHTNESS );

// currentPalette = RainbowColors_p;
currentBlending = LINEARBLEND;

}

void loop() {

value = analogRead(flexPin); // Read and save analog value from the flex sensor
Serial.println(value); // Print value
value = map(value, 700, 900, 0, 255); // Map the flex sensor's ~700-900 range to 0-255

// Check trigPin
digitalWrite(trigPin, LOW);
delayMicroseconds(2);
// generate ultrasound wave at HIGH state
// Sets the trigPin to HIGH state for 10 microseconds
digitalWrite(trigPin, HIGH);
delayMicroseconds(10);
digitalWrite(trigPin, LOW);

// reads the echoPin, read the sound wave travel time
duration = pulseIn(echoPin, HIGH);
// Calculating the distance
distance= duration*0.034/2;
//blinker
safetyDistance = distance;

Serial.print("Distance: ");
Serial.println(distance);

ChangePalettePeriodically();
static uint8_t startIndex = 0;
startIndex = startIndex + 1;

FillLEDsFromPaletteColors( startIndex);

FastLED.show();
FastLED.delay(x / UPDATES_PER_SECOND);
}

void FillLEDsFromPaletteColors( uint8_t colorIndex)
{
uint8_t brightness = 255;

for ( int i = 0; i < NUM_LEDS; i++) {
leds[i] = ColorFromPalette( currentPalette, colorIndex, brightness, currentBlending);
colorIndex += 3;
}
}

void ChangePalettePeriodically()
{
// uint8_t secondHand = (millis() / 1000) % 21;
// static uint8_t lastSecond = 99;

// if ( lastSecond != secondHand) {
// lastSecond = secondHand;
SetupPurpleAndGreenPalette();
currentBlending = LINEARBLEND;
if (safetyDistance >= 30) {
x=500;
a = 250;
b = 0;
c = 100;
d = 250;
e = 100;
f = 50;
}
if (safetyDistance <=29) {
x=800;
a = 225;
b = 0;
c = 100;
d = 200;
e = 100;
f = 50;
}
if (safetyDistance <=26) {
x=1200;
a = 200;
b = 0;
c = 100;
d = 150;
e = 100;
f = 50;
}
if (safetyDistance <=23) {
x=1500;
a = 175;
b = 0;
c = 100;
d = 100;
e = 100;
f = 50;
}
if (safetyDistance <=20) {
x=1800;
a = 150;
b = 0;
c = 100;
d = 50;
e = 100;
f = 50;
}
if (safetyDistance <=17) {
x=2200;
a = 125;
b = 0;
c = 100;
d = 0;
e = 100;
f = 50;
}
if (safetyDistance <=14) {
x=2500;
a = 75;
b = 0;
c = 100;
d = 0;
e = 100;
f = 50;
}
if (safetyDistance <=11) {
x=2800;
a = 75;
b = 0;
c = 125;
d = 0;
e = 100;
f = 100;
}
if (safetyDistance <=8) {
x=3200;
a = 75;
b = 0;
c = 150;
d = 0;
e = 100;
f = 150;
}
if (safetyDistance <=5) {
x=3500;
a = 75;
b = 25;
c = 150;
d = 0;
e = 100;
f = 200;
}
if (safetyDistance <=2) {
x=3800;
a = 75;
b = 50;
c = 150;
d = 0;
e = 100;
f = 250;
}
}

void SetupPurpleAndGreenPalette()
{
CRGB col1 = CRGB( a, b, c);
CRGB col2 = CRGB( d, e, f);

CRGB pink3 = CRGB( 200, 0, 100);
CRGB pink4 = CRGB( 175, 0, 100);
CRGB purple1 = CRGB( 150, 0, 100);
CRGB purple2 = CRGB( 125, 0, 100);
CRGB purple3 = CRGB( 75, 0, 100);
CRGB purple4 = CRGB( 75, 0, 125);
CRGB purple5 = CRGB( 75, 0, 150);
CRGB purple6 = CRGB( 75, 25, 150);
CRGB purple7 = CRGB( 75, 50, 150);
CRGB orange = CRGB( 250, 100, 50);
CRGB orange2 = CRGB(200, 100, 50);
CRGB orange3 = CRGB(150, 100, 50);
CRGB orange4 = CRGB(100, 100, 50);
CRGB orange5 = CRGB(50, 100, 50);
CRGB orange6 = CRGB(0, 100, 50);
CRGB orange7 = CRGB(0, 100, 100);
CRGB orange8 = CRGB(0, 100, 150);
CRGB orange9 = CRGB(0, 100, 175);
CRGB orange10 = CRGB(0, 100, 200);
CRGB orange11 = CRGB(0, 100, 250);
CRGB yellow = CRGB(255, 255, 0);
CRGB green = CRGB(100, 255, 0);
CRGB turquoise = CRGB(0, 255, 100);
CRGB grue = CRGB( 0, 100, 255);
CRGB purple = CRGB( 255, 100, 255);
CRGB black = CRGB::Black;
currentPalette = CRGBPalette16(
col1, col1, col2, col2,
col2, col2, col1, col1,
col1, col1, col2, col2,
col1, col1, col2, col2);
}

Motors + Flex

Brendan went back to fix up the motors for the second part of our project, which is to make the screen move. We also went to the film store to borrow some flex sensors.

It was really difficult to connect everything as we had so many wires that were too loose. Eventually, we managed to put everything together and taped the wires down.

Yue Ling found code for the motors and we merged it with flex sensor code so that the motor moves in direct proportion to the flex sensor's bend.

/* Sweep
by BARRAGAN <http://barraganstudio.com>
This example code is in the public domain.

modified 8 Nov 2013
by Scott Fitzgerald
http://www.arduino.cc/en/Tutorial/Sweep
*/

#include <Servo.h>

const int flexPin = A0; //pin A0 to read analog input
int value; //save analog value

Servo myservo; // create servo object to control a servo
// twelve servo objects can be created on most boards

int pos = 0; // variable to store the servo position

void setup() {
Serial.begin(9600); //Begin serial communication
myservo.attach(9); // attaches the servo on pin 9 to the servo object
}

void loop() {

value = analogRead(flexPin); // Read and save analog value from the flex sensor
Serial.println(value); // Print value
value = map(value, 400, 500, 0, 180); // Map the flex sensor's ~400-500 range to 0-180 degrees for the servo

//if(value < 399){

myservo.write(value);
// pos = 0;
// pos--;
// delay(1);
}

//if(value> 400){bb
// value = map(value, 700, 900, 0, 255);//Map value 0-1023 to 0-255 (PWM)

//pos+=3;
// myservo.write(pos); // tell servo to go to position in variable 'pos'
// delay(1); // waits 15ms for the servo to reach the position
// }
//}

So with both parts complete, we fixed up the box. The wiring was a major headache for us as we did not have the correct kit or enough wires. We managed to fix everything after much trouble. Yue Ling also borrowed female-to-male wires from Vienna, which was a great help! Without those, we would not have been able to connect our flex sensors and ultrasonic sensors.

Unfortunately we don’t have an image of our final wirings.

Final Model

Before placing hand inside (lights)
Before placing hand inside (motor)
After placing hand inside (lights)
After placing hand inside (motor)

Video demonstration:

 

The concept is that by putting your hand through the hole, you bend space and time. Bending time happens through slowing down the rotation of the 'clock' as well as turning the colours from pink/orange to purple/blue, which began from the idea of 'blue shifting', like how light would look if time slowed down.

By pushing the fabric at the end of the installation, the motors move, emulating fingers, as if the user's hands have traversed through a wormhole into another location.

The overall feeling we aimed for is a sense of amazement from the shifting colours and from being able to 'teleport' one's hands through space.

Presentation Slides:

https://docs.google.com/presentation/d/1QqwL3OJHkvuKO2NJf1szHjSbgNvObOalTWKNzutuSV4/edit?usp=sharing


Problems Encountered

  1. The LED light code could be better, as we don't know how to make it transition smoothly between colours. We could possibly make the value of the distance detected by the ultrasonic sensor directly control the colour range, but we don't know how to do that, so that is an area for improvement (a rough sketch of this idea follows this list).
  2. The circuit is a very big mess for us, and we didn't have the right kits provided, so we had to keep sourcing from our friends. We could have bought a long roll of wire so we would have all the wire we needed, instead of using tiny short wires which break the circuit easily.
  3. We bought some wrong items along the way too, like the acrylic piece in front of the clock: we accidentally bought matte acrylic. We also had some difficulty finding the correct LEDs, and it was thanks to Vienna that we knew where to get them.
  4. The ultrasonic sensor is buggy and we found no solution for it. We speculate that bad wiring caused it to return weird readings once in a while.
  5. The 'fingers' stuck on the motors weren't adhering well, and if we had more time, we would have secured them better.

What I learnt:

I learnt to control the motor using the flex sensor values, as well as how to work around a problem (the LED one) by manually writing individual code for individual colours. I've also learnt to use 4 different components and make them work together.

Task Delegation

Brendan:

  • Ultrasonic Sensors code + circuit
  • Motors setup
  • Fabric board assembly

Bryan:

  • LED code + circuit
  • motor circuit
  • general assembly and troubleshooting

Yue Ling:

  • Item sourcing
  • LED code + testing
  • Motors code + circuit

Guit4r M4st3r

This is my game, Guit4r M4st3r. Here is the opening page!

Upon pressing the spacebar, you'll move to the instructions page, which guides you through the gameplay.

Pressing the spacebar on the instructions page will lead you to the gameplay, which starts with a countdown:

 

 

And then the game begins with some lit metal music

 

The player uses QWERTY and ASDFGH to control the fingers on the strings: the top row moves the respective finger up the string, and the bottom row moves it down. After some getting used to, the player will be able to master the controls.

The player has to match the "fingers" (yellow circles) with the faded yellow circles and press the spacebar at the cyan part of the bottom bar to score points.

 

 

Otherwise, you get an "Uh-oh" and lose a health!

The overall gameplay is somewhat like AuditionSEA, but with more buttons to press and only one speed (p.s. the speed increases over time from 1.0 to 1.6 so as to add difficulty).

Once you lose, you get a game-over page where you can see your final score, and if you press the spacebar, you restart the game.

The whole idea of the game is inspired by difficult arcade games like Flappy Bird. It is meant to be simple, though hard to play, and frustrating, but addictive. Like Flappy Bird, once the player masters the technique, the game becomes so easy that getting by is no longer the issue, and beating the high score becomes the real chore.

The aesthetics of the game are quite close to Flappy Bird's, and even the font is from that game. I did that to draw out the similarities between the two arcade games.


 

I encountered some problems during the process of making the game. Most of them are solved; one of them was a major one involving restarting the game.

Basically, the code would not reset the array that makes the program say "READY, SET, GO!" when I restart the game after dying. After a long time of figuring it out, I finally understood why: I just needed to move a bunch of code to a different place, and it worked.

Other minor problems were just me jumbling up my gamemode values, which took me a long time to fix even though it was a simple problem.

One major problem that still exists and that I cannot fix: if I restart the game after a game over, the game instantly triggers an “Uh-oh” and you lose a health, no matter what you do.


So I have reapplied what I learnt this term in this game. One major takeaway is something I discovered on my own (and I’m proud of it!): we can control the game based on variables that are not visible on the screen, things like ‘gamemodes’, ‘counters’, and ‘timers’, which I used extensively to make everything work with as little mess as possible. One value can control everything.
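
To illustrate what I mean, here’s a stripped-down Processing sketch of the pattern, where a single gameMode value decides which screen gets drawn and what the spacebar does. The screens and scoring below are placeholder stand-ins, not the real game.

  int gameMode = 0;  // 0 = title, 1 = instructions, 2 = playing, 3 = game over
  int score = 0;
  int timer = 0;     // generic frame counter, reused by whichever mode needs it

  void setup() {
    size(800, 400);
    textAlign(CENTER, CENTER);
    textSize(24);
  }

  void draw() {
    background(0);
    if (gameMode == 0) {
      text("GUIT4R M4ST3R - press space", width/2, height/2);
    } else if (gameMode == 1) {
      text("Instructions - press space to start", width/2, height/2);
    } else if (gameMode == 2) {
      timer++;
      if (timer % 60 == 0) score++;   // stand-in for the real scoring logic
      text("Playing... score: " + score, width/2, height/2);
    } else {
      text("GAME OVER - score: " + score + " - press space to restart", width/2, height/2);
    }
  }

  void keyPressed() {
    if (key != ' ') return;
    if (gameMode == 3) { score = 0; timer = 0; gameMode = 0; } // restart cleanly
    else gameMode++;   // the single gameMode value drives every screen change
  }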

Beginning of History of Design

The beginning of the history of design is when cavemen started inventing tools. The first great invention was the Olduvai handaxe, made about 1.2 million years ago. I learnt this from poly, totally not a nerd. šŸ™‚

The Olduvai handaxe is the first great invention made by Man. Image taken from https://britishmuseum.withgoogle.com/object/olduvai-handaxe

Forrest Gump – Process, Research, and Final

Quotes

Image taken from http://www.nerdophiles.com/2014/03/06/discussing-dreams-and-reality-with-the-wind-rises/

Airplanes are beautiful, cursed dreams, waiting for the sky to swallow them up. – The Wind Rises

She was beautiful, just like the wind. – The Wind Rises

Every engineer has his 10 years in the sun – The Wind Rises

Image taken from https://shmlive.wordpress.com/tag/2001a-space-odyssey/

I am afraid I can’t do that Dave. – 2001: A Space Odyssey

I’m afraid. I’m afraid, Dave. Dave, my mind is going. I can feel it. I can feel it. My mind is going. There is no question about it. I can feel it. I can feel it. I can feel it. I’m a… fraid. – 2001: A Space Odyssey

Image taken from http://whatculture.com/film/what-does-the-ending-of-the-prestige-really-mean

Which knot did you tie, Borden? – The Prestige

That’s why every magic trick has a third act, the hardest part, the part we call “The Prestige” – The Prestige

Image taken from https://inktoimage.wordpress.com/2015/04/22/the-perks-of-being-a-wallflower-editing/

So, I guess we are who we are for a lot of reasons. And maybe we’ll never know most of them. But even if we don’t have the power to choose where we come from, we can still choose where we go from there. We can still do things. And we can try to feel okay about them. – Perks Of Being A Wallflower

Right now we are alive and in this moment. I swear we are infinite. – Perks Of Being A Wallflower

Image taken from http://testingtheglobe.com/trumanshow.html

We accept the reality of the world with which we’re presented. It’s as simple as that. – Truman Show

Image taken from http://www.foxmovies.com/movies/the-martian

In your face, Neil Armstrong! – The Martian

I’m going to have to science the shit out of this. – The Martian

Image taken from https://wall.alphacoders.com/tags.php?tid=57303

Let’s write our names on each other’s palms – Kimi no na wa

Please make me a handsome Tokyo boy in my next life! – Kimi no na wa

Image taken from http://www.rogerebert.com/reviews/arrival-2016

Despite knowing the journey… and where it leads… I embrace it… and I welcome every moment of it. – Arrival

“Language is the foundation of civilization. It is the glue that holds a people together. It is the first weapon drawn in a conflict.” – Arrival

Abbott is death process. – Arrival

Coffee with some aliens… – Arrival

Image taken from https://cinemashock.org/2012/06/16/visual-storytelling-in-shutter-island/

I love this thing because you gave it to me. But the truth is… it is one fuckin’ ugly tie! – Shutter Island

Which is worse, to live as a monster or to die as a good man? – Shutter Island


SELECTED QUOTES

Those aren’t mountains, they’re waves. – Interstellar, 2014

https://www.youtube.com/watch?v=4Hf_XkgE1d0 01:39

You see things. You keep quiet about them. And you understand. You’re a wallflower. – Perks Of Being A Wallflower, 2012

Mars will come to fear my botany powers. – The Martian, 2015

https://www.youtube.com/watch?v=5KsVojEaoms 00:34

Good morning, and in case I don’t see ya, good afternoon, good evening, and good night! – Truman Show

https://www.youtube.com/watch?v=-_zYn-HHcyA 03:57

Abbott is death process – Arrival, 2016

https://www.youtube.com/watch?v=-o0MwXqS0Q4&t=558s 08:59


Alright, with these quotes, let’s break them down!

1. Mountains, waves, doubt, fear, reaction.

  • Mountains: Rock, giant, wall,
  • Waves: Ripples, fishes
  • Fear: Constricted eye

2. Sight, quiet, understand, wallflower.

  • Sight: Eye, bird, light, camera (without button),
  • Quiet: Lips closed or covered, water,

3. Fear, botany, powers, mars, plant.

  • Fear: retreating action, constricted eyes,
  • Botany: Plant parts / whole plant, soil, spade
  • Powers: Electricity, muscles,
  • Mars: Ares, chocolate bar, rock, boy, curiosity

4. Morning, speculate, afternoon, evening, night.

  • Morning: Breakfast, sunrise, 8am, people going one direction, (transport)
  • Afternoon: Lunch, sun high, 1pm, people at work (work)
  • Evening: Dinner, sunset, 6pm, people going the other direction (chilling)
  • Night: Bed, moon, 10pm, people at home (home)

5. Abbott, alien, heptapod, death, process.

  • Abbott (alien): alien crab, octopus,
  • death: corpse, skull,
  • process: duration, 4th dimension,

Compositions and detailed explanations

First composition: Mars will come to fear my botany powers. – The Martian

I got the idea from the quote ‘Men are from Mars, women are from Venus’. Upon further research, I realised that the male symbol is the same as the Mars symbol:

Image taken from http://downloadicons.net/male-symbol-icons

So, I used a man to represent Mars.

Image taken from https://openclipart.org/detail/220390/terrified-man

 

Next, botany is the study of plants, so plant designs will suffice. I used an alien gun to represent the main character in the movie, a botanist stranded on Mars. I like the idea of using a man to represent Mars, and an ‘alien’ to represent the human, who really is an alien to Mars. By having the alien gun shoot plant parts at ‘Mars’, I want to create the idea of an invasion: man invading Mars with botany.

I further enhanced the idea of an invasion by spreading electricity in a circular manner, making it look like the powers are spreading across the whole of Mars. I arranged the bolts so that they seem to come from the flowers rather than the other way round: the flowers are big, with small bolts overlapping them, so the bolts look like they burst out of the flowers.

Then, I added an eye covering the entire face of the man representing Mars, and constricted its pupil to represent fear itself.

Although the meaning is present, the composition looks a bit plain, so I added the Mars symbol itself to reinforce the man’s identity as Mars.

And there we go. The first composition: full of meaning, but a little weak compositionally.

Here are thumbnails of my work process / variations

Composition 2: Those aren’t mountains, they’re waves. – Interstellar, 2014

Initially, I wanted to use giants or fences to represent mountains, but I ended up using cows, as cows’ bodies have interesting shapes that look like mountains. Cows are also grounded and strong, so they are a good representation of mountains.

Image taken from http://animalia-life.club/other/milk-cow-drawing.html

 

Image taken from http://clipart-library.com/picket-fence-cliparts.html

To represent waves, I used fish and sea creatures, because they’re not just part of the ocean; their movements are also wave-like, swaying left and right.

Image taken from https://thegraphicsfairy.com/vintage-clip-art-image-wonderful-octopus-or-cuttle-fish/
Image taken from https://www.dreamstime.com/stock-illustration-tarpon-illustration-vector-monochrome-image70158276

At first, I wanted to use an octopus to represent the waves, hiding behind a cow, but it looked very odd. I decided to add a fisherman instead, because I just like the idea of fishing.

Image taken from https://www.educima.com/dibujo-para-colorear-pescador-i28085.html

I think it adds to the whimsicality of the whole composition, and it is also very symbolic. In the movie, the characters are like fishermen, trying to find the source of a GPS signal on a planet covered in water. Their spaceship is like the boat, and they are the fishermen, fishing for the GPS. They looked afar and saw what seemed to be mountains, which, upon closer inspection, were huge tidal waves.

In the composition, I used the sea of cows (a strong foreground) to represent the water world, and the fisherman is fishing among the cows (if you look from afar, the cows do look like mountains). But to his surprise, a fish comes up and sea creatures crawl onto the boat, causing havoc, like the waves in the movie. The fisherman, conveniently wearing a space helmet, has eyes that express fear, like in the movie.

The composition is designed to bring the quote into another context while maintaining its meaning. For instance, mountains are usually in the background, but I’m using them as the foreground here. This also creates a visual hierarchy: fish, fisherman, cow.

I applied the rule of thirds to the fish, making sure that the focus stays on the fish even though other elements are fighting for attention. I also arranged the elements to lead the eye from the fish to the fisherman, through the curves and movement of the tentacles.

So there we go, the most precise composition I did, the one I am most proud of.

Other variations:

Composition 3: You see things. You keep quiet about them. And you understand. You’re a wallflower. – Perks Of Being A Wallflower, 2012

The initial idea was to literally have an eye, a mouth, a zip on the mouth, all connected to a brain. The idea was too literal, despite how nice I imagined it could look, so I scrapped it completely and went with something with more depth.

I liked the idea of having a prism. A prism captures light, refracts it internally, and then spreads the light out into a rainbow. In a way, it can be a metaphor for the whole seeing, understanding, and talking thing.

Image taken from https://www.youtube.com/watch?v=qQu7XVVDulw

I’m inspired by this fanart of Pink Floyd’s album ‘The Dark Side of the Moon’, which showcases a prism spreading light. The prism basically ‘sees’ the light by absorbing it, processes it (understanding) by refracting it, and then spreads it out into a rainbow; like when someone talks about something they saw, it tends to come out a bit different from the original.

For my composition, I’m using different elements to craft the idea of seeing, keeping quiet, and processing. Oh, and the last part, wallflower.

Image taken from https://www.dreamstime.com/royalty-free-stock-image-open-zipper-image25016246
Image taken from https://www.pinterest.com/pin/296674694184501299/?lp=true
Image taken from https://avopix.com/premium-photo/451601695-shutterstock-seamless-pattern-with-black-sexy-lips
Image taken from http://monardo.info/drawing-of-human-brain.aspx

These are some of the images I intended to use for my composition. After some iteration, I decided to drop the zippers and the brain, and add in some dingbats of eyes, locks, and keys, which I feel enhances the overall idea.

The eye enhances the idea of sight. The lock is used to lock away the scattered beam of light, and the keys sit behind the prism, suggesting that the keyholder has power over the scattered light but chooses not to use it. Finally, a flower in the background represents the wallflower.

The overall composition is framed around the prism, which represents the main character in this quote; the rest are supporting elements. I want the lock and keys to create some distortion of reality, to make the overall composition more interesting and to break away from any literal reading.

Other iterations:

Composition 4: Abbott is death process – Arrival, 2016

This is a tricky one, but I am not daunted! I really like this quote for its sheer simplicity, and its meaning is so strong and whole that it’s one of the most impactful quotes in the movie. It leaves the viewer wondering what ‘death process’ means, and where Abbott went in the movie. Did he really die? Why can’t Costello just say that Abbott is dead? Is it because of the gap in understanding between the alien language and ours? Or is it simply their way of saying ‘death’? (Turns out, it’s the last one.)

In the movie, the aliens don’t see time as we humans do. They see it as a whole, and they know the timeline of themselves and everything they interact with. To them, time is like space: you can always revisit yourself at another point in time. In short, they live in the fourth dimension, where time is just another spatial dimension.

Okay, I digress. So these aliens look like octopuses because they are squid-like creatures with 7 limbs (hence the name heptapods). Initially, I wanted to build a heptapod out of insect limbs, and then remove them along a path, like in my sketch.

Image taken from https://thegraphicsfairy.com/vintage-clip-art-image-wonderful-octopus-or-cuttle-fish/
Image taken from https://www.colourbox.com/vector/rhynchites-beetle-isolated-on-white-vintage-engraving-vector-4869827
Image taken from http://clipart-library.com/starfish-drawing-cliparts.html
Image taken from https://www.pinterest.co.uk/tashunko/graph-co/

Here are some examples of images I used. I was also inspired by the image of evolution:

Image taken from https://steemit.com/science/@herpetologyguy/the-biggest-misunderstandings-about-evolution

However, I tried to break away from a linear design because I want to amplify the idea of the creature’s perception of time. I want something that happens all at once.

This was my first lazy design. It wasn’t that great even though the background is pretty cool. Now, I’m thinking of creating something more surreal.

Image taken from https://free.clipartof.com/details/744-Free-Clipart-Of-A-Hen-And-Rooster
Image taken from https://commons.wikimedia.org/wiki/File:Chick_(PSF).svg
Image taken from https://www.dreamstime.com/royalty-free-stock-photo-empty-eggshell-image13135085

I’m using the chicken as the subject. Why? I am inspired by how the chicken at every stage, from the unborn egg to the adult, can be killed for food. I am also inspired by the question ‘which came first, the chicken or the egg?’.

All these properties of the chicken describe the character in the quote, Abbott. Abbott doesn’t see time the way we do, therefore he is the unhatched egg, the chick, and the chicken all at once. I want to show here that every version of Abbott is dying, thus him being in ‘death process’.

I wanted all of Abbott to be in the frame, and all of Abbott to be dying; that’s the idea. Although it isn’t exactly the same as the meaning in the quote (which is that at that moment in ‘time’, Abbott is dying), I feel that this composition holds the meaning of the quote out of context, in its own interpretation.

Anyhow, I used leading lines and the rule of thirds to lead the eye to the chicken.

p.s. now that I come to think of it, it might look funnier to have the chick lying down instead of standing up. I guess I’ll consult first and edit it in the refinement.


Refinements

Composition 1 improved: I changed the background to more consistent lightning bolts to create more density and depth, and changed the flowers from dingbats to an image, which works better as it makes the composition more ‘whole’. I also made use of the negative space within the lightning bolts to create the effect of an energy ball coming from the gun. Overall it gives a more powerful, less bland feeling. Satisfied.

This is my final one for print. I only resized some elements slightly and added a black background to make the fish stand out more, while toning down the other elements that were fighting for attention.

Unfortunately, I printed the wrong version for silkscreen, so for the test print I will just use the one I printed, and I will redo this during recess week (cons of having too many versions šŸ™ ).

Silkscreen Trial and Process

It was hard for me to decide which composition to use. I grew attached to the right one, but I like the overall composition of the left one!!!

In the end, after asking for different opinions, I chose the one on the right, as it explains more about the movie and has decent composition.

After drying the silkscreen board, we coated both sides with emulsion and dried it again, before sticking our transparency onto the front of the silkscreen.

Emulsion

Next, we sent the silkscreen into the UV light machine thing, which uses UV to harden the parts that aren’t covered by the black ink on our transparency.

After that, we sent the silkscreen to be washed. Here’s a video of the process:

Shag face aside: Yay! My silkscreen is ready.

It was at this point that I realised I had printed the wrong design, which means I have to redo it during recess week.

Once the edges are taped off, coins are placed under all 4 corners of the screen to create a gap between the screen and the medium that is to be printed on.

The next few shots are taken by Claire, and Tiffany helped with holding the board for me. The MVPs!!!

Aligning

Spreading ink like butter

Here we gooooo

The outcome

A failed attempt

The takeaway from this is that the first print will almost always be bad, so the silkscreen needs to be ‘seasoned’ first so that the ink spreads more easily on the second attempt. It’s best to nail it on the second or third attempt, because the ink will start to seep through the screen if there’s too much ink (too many attempts).

The squeegee should glide across the board, not be pressed down hard. It should be one motion, swift and accurate.

I’m looking forward to printing this on my tote bag, and also on a spare t-shirt. Actually, I might use both designs and see which works best. I may also want to try my own designs!!! A bit crazy exciting šŸ™‚

Silkscreen Final

Before I get to the final prints, I wanna show some of the other prints made on newsprint, which I find successful and worth showing hahaha

I decided to try printing my other design on my tshirt as a trial, since I feel that it is visually more appealing than my final one, even though my final one is more meaningful.

It turned out so well! I’m really proud of this. Unfortunately, this isn’t the print that I’ve chosen for my final. Because I love this design so much, I’ve made it into my own t-shirt šŸ™‚

But then, new instructions came out saying there isn’t actually a rule that the design must be on a tote bag, so yay! I’m gonna wear this during submission and bring my tote bag as well!

IT DIDN’T TURN OUT WELL D:

The tote bag was too uneven, and the design seems to be too complicated, so the results ended up wonky. It was hard to gauge how much ink I needed, and my design on the emulsion wasn’t aligned well on the screen; basically, everything went wrong.

Here’s the video of the fail:

I had to go get another tote bag. I’M REALLY GRATEFUL TO JOY (not the teacher) FOR PROVIDING ME WITH HER TOTE BAG TO PRINT T . T

I went to try again, this time pressing harder, using more ink, and aligning better.

AND YES. IT TURNED OUT WELL

There we go.

My A4 printed pieces:

That’s this assignment, all done and well. Yay!


Before I end off, I’d like to share some extra stuff we did too!

Making of the basement crew tshirts (2 different designs)

Outcomes

Timelapse of printing:


Final thoughts

What I learnt from this: I had done silkscreen before in poly, using stencils. This time, I learnt another way to do it, which is much simpler and more effective, and it opened my eyes to processes I had never seen before. I also learnt to be more flexible with my Adobe programs: now that I’ve learnt Photoshop, I can mix Photoshop and Illustrator skills together to create more things.

Working on silkscreen is so fun, especially when you get to keep what you make! The satisfaction from a successful print is like… woooooooooooooowz.

I also enjoyed the process of creating the visuals from the quotes. It is fun to conceptualise and create something unique that holds meaning too.