22 Oct Update

These weeks have just been about focusing on building the materials and assets that I need.

Updates according to timestamp:
00:03 – 00:07 Plastic simulation 1 using noise
Using instancing was ODD. I was expecting the instances to move according to the noise they were fed, but instead it was just noisy particles coming out of the original model.

00:08 – 00:12 Plastic simulation 2 using particles
Instead, inserting a particle system, populating the model with it, and then adjusting the wind/turbulence played out better. Though if there is a way to redraw this so that it rebuilds itself in rectangles, that would be better.

00:12 – 00:17 Bottle cap underwater simulation
00:17 – 00:22 Glass shard underwater simulation
This rendering works better for solids with distinct features, such as the ridges around the rim.

00:26 – 00:41 Moving the pt cloud 1
Connected to the Kinect, and the point cloud seems to just be rebuilding itself at the ‘new’ spot.

00:42 – 00:57 Moving the pt cloud 2
Using an attractor makes it move more smoothly, but now it’s just shooting everywhere.

00:58 – 01:03 Populating a system with multiple assets

Creative Industry Report – Studio Roosegaarde


IDS Script:

Hi, I am Yi Dan, and in the most ideal situation, I would love to work in an art/design studio, more specifically with Studio Roosegaarde. They are based in Rotterdam, South Holland, and have a pop-up studio in Shanghai, China. They term their space a “social design lab”, where they bring designers and engineers together. It was founded by Daan Roosegaarde, who grew up on the coast of the Netherlands and hence has first-hand experience of rising sea levels, which is why he is passionate about connecting people and technology to innovate the fight against climate change. The projects the studio has produced reflect the same sentiments.

Two examples are Grow and Urban Sun.

Grow

Grow brings to attention the huge space on Earth that is literally feeding us. While there is huge negative sentiment surrounding agricultural practice, Studio Roosegaarde approaches this situation by focusing on “how to improve the situation at hand”. Backed by research, what Studio Roosegaarde did was come up with “light recipes”. Using certain wavelengths of light, mainly those falling in the red, blue and ultraviolet parts of the spectrum, these recipes allow the crops to actually grow more efficiently, reducing the use of pesticides by up to 50%. They were also careful to direct the light horizontally so it can only be seen from nearby, to avoid light pollution.

Urban Sun

Coronavirus massively reduced the interaction we have with one another, and in an attempt to rectify the situation, Studio Roosegaarde leverages a specific UV light at a wavelength of 222 nm, which reduces the presence of coronavirus by up to 99.9%. In contrast to plastic barriers and ‘keep away’ stickers, Urban Sun proposes a better way of meeting and interacting.

Just like the works presented, many of Studio Roosegaarde’s pieces surround the notion of incorporating aesthetic poetry with practical solutions. They remind me of one of my favourite books, “Unweaving the Rainbow”, as they are able to ground their work in scientific research, and this resonates with me a lot. In their practice they almost seem to be making prototypes of our future landscape, and even present a hopeful vision ahead for our earth.

It’s interesting, with them it’s like, the worse the weather, the better the art piece! 

 

8 Oct Update

Interview https://ntu-sg.zoom.us/rec/share/Y_QJz9PSy5zlO9tckY1FL1PPm741IfUMzkG79I4AS5745bZnkDrmxDdS25y9qLRq.qIjfbe_NZyqyynjy?startTime=1632402207000

Interesting pointers:
Bounce through different periods of extinction
For resurgence, they colonise the deeper waters first (low sunshine), then proceed upwards to shallower waters
Grows very slowly, solitary
Symbiont shuffling in different environments
Algae take up a small sac (small space) in a coral (big space)

A line that I was very excited about: structure-wise, a millimetre-thin active layer just building crazy structure; everything underneath is dead.

A shining line of polyps that travels upwards, growing vertically over a long period of time. In the moment, it looks like nothing is happening, but actually tiny polyps are growing slowly to reveal “the bigger picture”, which is their skeletons. The skeletons + the environment of the space will correspond to the different eras of time.

Experimenting with projection

I enjoy the holographic-y look, and the medium relates back well to the topic.

Thinking along the lines of natural movement/organic growth/wild occurrences, I was reminded of fish and other ocean creatures flocking. Collective animal (insects, birds, etc.) behaviour arises from simple rules followed by individuals, resulting in a large swirling pattern by the mass. Algorithmically, the 3 rules are separation, alignment and cohesion (see “We're Terrible at Distinguishing Real And Fake Schools of Fish”, Smithsonian Magazine). Main purpose: to protect themselves from the threat of predators
(could possibly be to stay warm also)
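To pin the three rules down before committing to a software, here is a minimal boids sketch in Python/NumPy. It is only a sketch: the weights, radius and field size are hand-picked placeholders, not tuned values.

```python
# Minimal boids sketch: separation, alignment, cohesion.
# Weights, radius and field size are placeholder values.
import numpy as np

N = 50
pos = np.random.rand(N, 2) * 100.0        # start positions in a 100x100 field
vel = (np.random.rand(N, 2) - 0.5) * 2.0  # small random starting velocities

def step(pos, vel, radius=10.0, max_speed=2.0):
    new_vel = vel.copy()
    for i in range(N):
        offsets = pos - pos[i]
        dists = np.linalg.norm(offsets, axis=1)
        near = (dists > 0) & (dists < radius)
        if near.any():
            sep = -offsets[near].sum(axis=0)        # steer away from close neighbours
            ali = vel[near].mean(axis=0) - vel[i]   # match the neighbours' average heading
            coh = pos[near].mean(axis=0) - pos[i]   # drift toward the local centre of mass
            new_vel[i] += 0.05 * sep + 0.05 * ali + 0.01 * coh
        speed = np.linalg.norm(new_vel[i])
        if speed > max_speed:                       # cap speed so the flock stays coherent
            new_vel[i] *= max_speed / speed
    return pos + new_vel, new_vel

for _ in range(100):                                # run a few steps of the simulation
    pos, vel = step(pos, vel)
```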

Relating it back to trash: a problem that arises because we are trying to protect ourselves from inconvenience.

Potential areas of exploration:
Actually using Boids, a computational algorithm by Craig W. Reynolds (1987) that mimics the motion of a bird flock (sketched above)
Cloth/Plastic simulation movement underwater
Rendering of BG – Photogrammetry? Could the BG also be affected by the number of people?

Timeline:
15 Oct – decide on which software, get the boids thingy going
22 Oct – tracking boids positioning to a person, interactivity with others
29 Oct – underwater cloth/plastic simulation, trash models
5 Nov – underwater cloth/plastic simulation, trash models, putting it together for the first prototype showcase (human reflection)

Documentation for SONDER

Currently, the big idea is:

Playing around with Unreal:

 

Wanted to try out the water plugin with Unreal and also the idea of floating actors
However, at this point, everything felt slightly flat, and I wanted helium-balloon float, not water float

Also tried to connect Unreal with TouchDesigner via OSC, potentially for the mic input
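Since the link is just OSC over UDP, here is a minimal sketch of the sending side in Python using python-osc. I'm assuming Unreal's OSC plugin is listening on localhost port 8000, and the /audio/mid address is a placeholder name I chose, not a fixed convention.

```python
# Minimal OSC-send sketch (python-osc). Port 8000 and the "/audio/mid"
# address are assumptions/placeholders, not fixed Unreal settings.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 8000)

def send_mic_level(level: float) -> None:
    # Clamp to [0, 1] before sending so stray values don't leak through
    client.send_message("/audio/mid", max(0.0, min(1.0, level)))

send_mic_level(0.42)
```

In TouchDesigner itself this is just an OSC Out CHOP; the sketch only shows what travels over the wire.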

Space

While building my space, I wanted to just see what I could do with the free packages in the Marketplace.

The following are the ones I used:

And this is what I created:

I retained the water element, but wanted to build on it. During the week of building, I wanted to focus on lighting, perhaps because of this paragraph from a book I was reading (Steven Scott: Luminous Icons):

Obviously I am no Steven Scott, but I really like the idea of experiencing “something spiritual”, and it’s also in line with the ‘zen’ idea of my project. While my project has no direct correlation with “proportion”, I still wanted to try to incorporate this idea by building lights of the ‘correct’ colours (murky blue-ish) and manifestations (light shafts, fireflies).

Projection

You would think just trying to get it up on the screen would be easy; just got to make the aspect ratio the same as the 3 projectors… that’s what I thought too…

Anyway, a summary:
Spout – doesn’t work with 4.26; the latest supported version is 4.25
Syphon – Unreal doesn’t support it
Lightact 3.7 – doesn’t use Spout, but I can’t download it on my Mac nor the school’s PC, because it was flagged as unsafe due to the small number of downloads

BUT I did try nDisplay, which is an integrated feature in Unreal, and I could get it up on my monitor.

I couldn’t get it up on the projectors though; I think it could be due to my config files. I need perhaps another week or so on this.

Interface

With the interface, this was the main idea:

Plan 1 would have made everything seamless because it is all integrated into Unreal. But of course…

So I am back with Plan 2, via OSC.

On TouchDesigner’s end:

I have basically managed to split the audio that comes in via the mic into low, mid and high frequencies. For now, I’m only sending the mid-frequency values over to Unreal because they represent the ‘voice’ values best, but I had to clamp the values because of the white noise present. I might potentially use the high and low values for something else, such as colours? We’ll see.
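Outside of the actual CHOP network, the idea is roughly this, as a plain-Python sketch; the band edges, noise floor and ceiling are placeholder numbers, not my exact settings.

```python
# Sketch of the band split + clamp. Band edges and clamp values are
# placeholders; in TouchDesigner this is done with CHOPs, not FFTs by hand.
import numpy as np

RATE = 44100  # assumed sample rate

def band_levels(samples: np.ndarray):
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / RATE)
    low = spectrum[freqs < 250].mean()                      # rumble, handling noise
    mid = spectrum[(freqs >= 250) & (freqs < 4000)].mean()  # where the voice mostly lives
    high = spectrum[freqs >= 4000].mean()                   # sibilance, hiss
    return low, mid, high

def clamp_mid(mid: float, noise_floor: float = 5.0, ceiling: float = 50.0) -> float:
    # Subtract the white-noise floor, cap it, and normalise to 0..1 for Unreal
    return min(max(mid - noise_floor, 0.0), ceiling - noise_floor) / (ceiling - noise_floor)
```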

For now, one of the biggest issues I feel is the very obvious latency from TouchDesigner to Unreal.

But moving forward first: the next step would be to try getting an organic shape and also spawning the geometry only when a voice is detected.

Since I wanted the geometry to ‘grow’ in an organic way, I realised the best way to go about it would be to create a material that changes according to its parameters, rather than making an organic static mesh, because when a static mesh gets manipulated by the values it’s literally just going to scale up or down.

This is how I made the material:

Using noise to affect world displacement was the main node setup to get the geometry ~shifty~. But I think as you can tell from the video, the latency and the glitch are really quite an issue. And once again moving on: trying to spawn the actor in Unreal when a voice is detected.
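Conceptually, the displacement amounts to pushing each vertex along its normal by a time-varying noise value. Here's that idea as a standalone sketch, with a cheap sine-based stand-in for Unreal's Noise node:

```python
# Sketch of noise-driven displacement: offset each vertex along its normal
# by a time-varying noise value. cheap_noise() is a sine-based stand-in,
# not Unreal's actual Noise node.
import numpy as np

def cheap_noise(p: np.ndarray, t: float) -> np.ndarray:
    return (np.sin(p[:, 0] * 1.7 + t)
            * np.cos(p[:, 1] * 2.3 + t * 0.7)
            * np.sin(p[:, 2] * 1.1 + t * 1.3))

def displace(verts: np.ndarray, normals: np.ndarray, t: float,
             amplitude: float = 0.2) -> np.ndarray:
    # amplitude would be the parameter driven by the incoming mic values
    offsets = cheap_noise(verts, t) * amplitude
    return verts + normals * offsets[:, None]
```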

For that, I am using a Button state in TouchDesigner. When a frequency hits a threshold, which I assume comes from the voice, the button will be activated. When the button is on, ‘1’, the geometry should form and grow with the voice, but when the button is off, ‘0’, the geometry should stop growing.

 

Currently this is the button I have, but it’s still consistently flickering between ‘on’ and ‘off’, so I am starting to think maybe I should change the trigger to ‘when there is a detected change in frequency, the button clicks’ instead.
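Something I could also try before changing the trigger entirely: hysteresis, i.e. separate on/off thresholds plus a short hold time, so brief dips in level don't flick the button off. A rough, untested sketch with placeholder numbers:

```python
# Hysteresis sketch for the flickering button. All thresholds are
# placeholders to be tuned against the actual mic levels.
import time

class VoiceGate:
    def __init__(self, on_level=0.30, off_level=0.15, hold_secs=0.4):
        self.on_level = on_level    # level needed to switch ON
        self.off_level = off_level  # must fall below this to switch OFF
        self.hold_secs = hold_secs  # stay ON through short pauses
        self.state = False
        self.last_loud = 0.0

    def update(self, level: float) -> bool:
        now = time.monotonic()
        if level >= self.on_level:
            self.state = True
            self.last_loud = now
        elif self.state and level < self.off_level:
            # Only drop to OFF once the hold time has passed,
            # so brief dips between words are ignored
            if now - self.last_loud > self.hold_secs:
                self.state = False
        return self.state
```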

To be continued…

Stage Creation/Platform!!!

Step 1: Idea

Step 2: Get materials

Instead of getting a foldable stage, getting crates was just much cheaper and made more sense, since they already had slits with space to insert the bulbs.

Next was trying to work out the electric circuit and the placement of the bulbs. Mainly it was inserting 4 bulbs (2 at the edges, 2 in the middle). Initially I was thinking of doing 3, but the middle part of the crate was blocked with wood. Instead of trying to drill into the wood (which would have been dangerous, since it might then not be able to support the weight it was originally intended for), I decided to just have 2 in the middle.

The parallel circuit mainly consisted of the 4 bulbs, 4 starters and 4 ballasts, plus an additional 12 clips to hold the bulbs in place.

Step 3: Installation

I left the paper sleeve on because 4 bulbs were brighter than I expected.

Underside view of the crate

Since there was too much light penetration through the slits, I think I will cover it up with a cloth to achieve a more diffused look.

Left to do: attach the 2nd crate below to make this platform higher, sand the crate for safety, and attach the cloth.

Interface

Updates from the previous stage: I have made the spawned object move randomly within a bounding box, and experimented with the call to spawn the object.

What I realise is that we don’t speak in one breath; because we break our sentences up as we speak, the pauses in between would cause many, many objects to be spawned.

Instead, I think a more efficient way of doing this would be to have a button participants can press while they are speaking.

Right now, I’m still not too happy with the way it moves. I wanted more of a zero-gravity floating kind of movement, but it’s doing the zoomies now.
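One idea for getting closer to that floaty feel (an untested sketch, all numbers placeholders): ease the sphere toward slowly changing random targets inside the bounding box, instead of giving it fresh velocities.

```python
# Floaty-drift sketch: glide toward random targets inside a bounding box
# at a low constant speed. Bounds and speed are placeholder values.
import numpy as np

BOUNDS_MIN = np.array([-100.0, -100.0, 0.0])
BOUNDS_MAX = np.array([100.0, 100.0, 200.0])

class FloatySphere:
    def __init__(self):
        self.pos = (BOUNDS_MIN + BOUNDS_MAX) / 2.0
        self.target = np.random.uniform(BOUNDS_MIN, BOUNDS_MAX)

    def tick(self, dt: float, speed: float = 5.0) -> None:
        to_target = self.target - self.pos
        dist = np.linalg.norm(to_target)
        if dist < 1.0:
            # Close enough: pick a new slow drift point
            self.target = np.random.uniform(BOUNDS_MIN, BOUNDS_MAX)
        else:
            # Small constant-speed step, slow enough to feel weightless
            self.pos += to_target / dist * min(speed * dt, dist)
```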

Also, the way it spawns is so ugly; it just appears at the target point, but I am looking for more of a blowing-bubbles vibe.

At this point, I was trying to decide how the users were going to hear the recorded audio (the muffled loud noise is me recording – not too sure what’s up with the recording). I decided that the best way was to have the user go near the object. This way, specific audio clips can be played by specific spheres instead of just playing a random one, which might lead the user to hear the same audio twice.

So getting from this stage to the next consisted of:

  • Setting up 2 views: one for the player who is going to listen to the audio, the other for the player recording (will touch on this more later when incorporating the VR portion)
  • Having to spawn both the sphere and the audio into separate arrays and give them IDs, so that I can match the index of the sphere to the corresponding index of the audio (roughly the idea sketched below)

(in Level Blueprint)
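The matching-index idea, in a hypothetical pseudo-Python form of what the Level Blueprint does (the names here are made up for illustration):

```python
# Paired-array sketch: spheres and audio clips are appended in lockstep,
# so index i of one always matches index i of the other.
spheres = []
audio_clips = []

def spawn_sphere_with_audio(sphere, clip) -> int:
    spheres.append(sphere)
    audio_clips.append(clip)
    return len(spheres) - 1        # shared ID valid for both arrays

def clip_for_sphere(sphere):
    return audio_clips[spheres.index(sphere)]
```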

  • Getting the component to float & hover away only when the audio has finished recording

(in Actor Sphere Blueprint)

(in Actor Sphere Blueprint)

  • Also needed to stop the already-spawned audio from recording again when the same button is pressed; if not, they (meaning the audio texture) will overwrite despite already having different file names set (using a ‘Called’ boolean, sketched below)

(in Actor Sphere Blueprint)
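The ‘Called’ guard, again as a hypothetical pseudo-Python version of the Blueprint logic:

```python
# 'Called' boolean sketch: each sphere records at most once, so a second
# press of the same button can't overwrite its audio.
class SphereActor:
    def __init__(self):
        self.called = False            # the 'Called' boolean

    def on_record_pressed(self, start_recording) -> None:
        if self.called:
            return                     # already holds audio; ignore the press
        self.called = True
        start_recording()
```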

  • Setting up the texture/fx to alert the player when they have made contact with the floating sphere

Made using a particle system + ‘Move to Nearest Distance Field Surface GPU’

Virtual Camera test:

At this point, the virtual camera was way too laggy for anything. I then wanted to make it a mobile game instead, but I would need to remove too many things to support the graphics features of the mobile version, so I decided to go with VR.

The biggest challenges I faced at this stage were:
1. Making an asymmetrical multiplayer game – 1 VR, 1 PC
2. Replication – having the server and the client see the same thing happening
3. Setting up a third-person VR view, because usually it’s always first-person

Final

Final Thoughts:

This project formed as it developed. Even in the last week, I was still deciding on the set-up and the method of performing the interaction, whether it be virtual camera, or phone, or VR. What I did know was how the space was supposed to feel, and the fundamental idea of breaking up the steps to having a simple conversation. With the vision of wanting others to find solace in the complexity of life, to understand that everyone out there is on their own journey, I wanted to set everyone back to the ideas of ‘Talking’, ‘Listening’ and ‘Seeing’. Technically, the whole project doesn’t have to be split up into these 3 “stations” and could potentially be just one entire game on one PC. However, I was determined to pull through with the setup in the dance room because of the concept that when times are tough, you need to split things up and intentionally take them one at a time. And in the midst of that, to seek comfort in the realisation that each random passerby is living a life as vivid and complex as your own.

Final Presentation Slides

Links provided by LP during presentation:

Sweet Barrier Reef (2009) by Ken + Yonetani
http://kenandjuliayonetani.com/en/works/sweetbarrierreef/

‘The Lagoon Cycle’ Exhibition by the Harrison Studio.
https://theharrisonstudio.net/the-lagoon-cycle-1974-1984-2

An undersea robot has dispersed microscopic coral larvae
https://www.eurekalert.org/pub_releases/2018-12/quot-rmw121218.php

Going on from here, the main question I need to be certain of is: what type of art am I aiming to produce? The concept is set on plastic debris and its effect on corals, but now I need to do more research on what I wish to do. Heighten awareness? Create a product to help with the collection of waste? And what kind of tone? Cynical, serious, hopeful?

Following on, these questions will guide my thought process and solidify my idea.


2nd Presentation Slides

This presentation is going to be facts-heavy, and it’s going to feel like a marine-biology, science-y presentation. But through my research, I picked out 3 fascinating facts about corals that I want to work with.

We had a previous sharing on rough ideas we wanted to work on for FYP; I presented on coral bleaching, and the follow-up question from that sharing was: what is happening specifically to Singapore’s corals? https://www.sea-circular.org/news/cobsea-and-national-university-of-singapore-launch-database-of-marine-litter-research-from-13-asian-countries/

More on COBSEA: https://www.unep.org/cobsea/

Database found here:
https://docs.google.com/spreadsheets/d/1r4aCVQeCS1cj_Rhip82yVTNNnxkWDgFwbEIHCR_oASk/edit#g
https://cil.nus.edu.sg/research/special-projects/#polllution-from-marine-plastics-in-the-seas-of-asean-plus-three

*wide-ranging generalist coral species are built to survive in deeper waters

This idea is to highlight the problem of plastic waste, but in order to have it focused specifically on corals, I am hoping to incorporate the concepts of Coral Dating or Coral Soundscapes. I am not too sure how I wish to manifest it yet, but here are some artist references:

Artist Mel Chin Floods Times Square With Virtual Reality Art to Sound the Alarm on Climate Change

Fish Hammer

Disruptive Devices

https://thestrawpocalypse.com/


 

Reflections on Designing for The Digital Age Reading

These days, I find that the term ‘UIUX’ has just been randomly thrown around, a buzzword of our time. I had also always just assumed it was only related to the tech field, but after this reading, I have come to realise that this principle can also be applied to physical products and services. Such designs are grounded in the root idea that they serve human needs and goals within certain constraints.

The reading makes mention of Alan Cooper’s Goal-Directed design.

One interesting thing I picked up from the reading is the question under Principles: will it help users minimise work? Work can be cognitive, visual, memory or motor. Truly, this is why user surveys are important; sometimes there are challenges that can’t be seen because they are internal struggles. It seems that the best user experience comes from the smoothest and simplest pathway towards the user’s goal. However, I have come to realise how skewed this idea can be. Speaking from personal experience, sometimes, because an experience is so seamless and almost no “thinking” is needed, I have managed to sign up for brand newsletters that I have no recollection of, and auto-checked boxes have got me signing up for trials that I am not keen on. While reducing the workload would help with the user interaction, going a step further to enrich the experience would be to balance out these manipulative patterns. After all, if I have to take extra steps to be aware of these dark methods used against me, then it backfires on the initial idea of minimising work.

Another interesting pointer shared is framework definition, under Process. It mentions that personas and scenarios are the primary drivers of the basic framework, which is broken down into the interaction framework, the visual framework and the industrial design framework. Being an interaction designer, I sometimes struggle with the idea of “how functionality is grouped and how the personas will accomplish the most critical tasks”. I am starting to feel that there is a streamlining of design styles and everything is starting to look similar. E.g. an old person naturally means bigger fonts, teens mean more new technology to engage them, kids mean nothing sharp and intuitive movements. Hence, I think if I were to follow this scheme of using personas in my FYP, the personas need to be distinctive, unique individuals who cannot be blurred together with everyone else. Completely fictional stories of imaginary people based on little or no research bring no value to the design process and can in fact backfire; poorly constructed personas are very much linked to the credibility of this technique.

I really enjoyed this quote: any good method is a living thing that continues to evolve and grow. Indeed, while there are many recommendations and structures around, I think human behaviour is constantly evolving, and what this means for design is to not just blindly follow trends or get stuck in frameworks, but to consistently innovate in ways that are targeted and well-researched for the user.