Final Video and Pictures for SONDER

Final Thoughts:

This project took shape as it developed. Even in the last week, I was still deciding on the set-up and the method of performing the interaction, whether it would be a virtual camera, a phone, or VR. What I did know was how the space was supposed to feel, and the fundamental idea of breaking up the steps of having a simple conversation. With the vision of wanting others to find solace in the complexity of life, to understand that everyone out there is on their own journey, I wanted to bring everyone back to the ideas of ‘Talking’, ‘Listening’ and ‘Seeing’. Technically, the whole project doesn’t have to be split up into these 3 “stations” and could potentially be just one entire game on one PC. However, I was determined to pull through with the setup in the dance room because of the concept that when times are tough, you need to split things up and intentionally take them one at a time, and in the midst of it, to seek comfort in the realisation that each random passerby is living a life as vivid and complex as your own.

13/4 Update

At this point, I was trying to decide how users were going to hear the recorded audio (the muffled loud noise is me recording; not too sure what’s up with the recording), and decided that the best way possible was to have the user go near the object. This way, a specific audio clip can be played by a specific sphere instead of just playing a random one, which might lead the user to hear the same audio twice.
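As a rough illustration of that proximity trigger, here is a C++ sketch of the idea (the class name AMemorySphere and the AssignedClip property are my own placeholders, not the actual Blueprint): each sphere carries its own clip and plays it when the listening player overlaps its collision sphere.

```cpp
// Hypothetical C++ sketch of the Blueprint logic: each sphere carries its own
// recorded clip and plays it when the listening player walks into its trigger.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SphereComponent.h"
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundBase.h"
#include "MemorySphere.generated.h"

UCLASS()
class AMemorySphere : public AActor
{
	GENERATED_BODY()

public:
	AMemorySphere()
	{
		Trigger = CreateDefaultSubobject<USphereComponent>(TEXT("Trigger"));
		RootComponent = Trigger;
	}

	// Assigned when the sphere is spawned (see the Level Blueprint sketch further down).
	UPROPERTY(EditAnywhere, BlueprintReadWrite)
	USoundBase* AssignedClip = nullptr;

protected:
	virtual void BeginPlay() override
	{
		Super::BeginPlay();
		Trigger->OnComponentBeginOverlap.AddDynamic(this, &AMemorySphere::OnOverlap);
	}

private:
	UPROPERTY()
	USphereComponent* Trigger = nullptr;

	UFUNCTION()
	void OnOverlap(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
	               UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
	               bool bFromSweep, const FHitResult& SweepResult)
	{
		// Play this sphere's specific clip, so the same recording is never
		// handed out twice the way a random pick could be.
		if (AssignedClip != nullptr)
		{
			UGameplayStatics::PlaySoundAtLocation(this, AssignedClip, GetActorLocation());
		}
	}
};
```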

So getting from this stage to the next consisted of:

  • Setting up 2 views: one for the player who is going to listen to the audio, the other for the player recording (will touch on this more later when incorporating the VR portion)
  • Having to spawn both the spheres and the audio clips into separate arrays and give them IDs, so that I can match the index of a sphere to the same index of its audio (see the sketch below)

(in Level Blueprint)
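Roughly, the matching works by keeping the spawned spheres and the recorded clips in two parallel arrays so that index N of one always pairs with index N of the other. A hedged C++ sketch of that Level Blueprint idea (ASphereSpawner, RecordedClips and SphereClass are illustrative names, not the real Blueprint variables):

```cpp
// Hypothetical sketch of the Level Blueprint logic: spawn one sphere per clip
// and pair them by index, so sphere N always plays clip N.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Engine/World.h"
#include "MemorySphere.h" // the sketch above
#include "SphereSpawner.generated.h"

UCLASS()
class ASphereSpawner : public AActor
{
	GENERATED_BODY()

public:
	// Parallel arrays: index i of Spheres corresponds to index i of RecordedClips.
	UPROPERTY(EditAnywhere)
	TArray<USoundBase*> RecordedClips;

	UPROPERTY()
	TArray<AMemorySphere*> Spheres;

	UPROPERTY(EditAnywhere)
	TSubclassOf<AMemorySphere> SphereClass;

	void SpawnAll()
	{
		for (int32 Index = 0; Index < RecordedClips.Num(); ++Index)
		{
			// Spread the spawn points out; the real placement logic would differ.
			const FVector Location = GetActorLocation() + FVector(Index * 200.f, 0.f, 100.f);
			AMemorySphere* Sphere =
				GetWorld()->SpawnActor<AMemorySphere>(SphereClass, Location, FRotator::ZeroRotator);
			if (Sphere != nullptr)
			{
				Sphere->AssignedClip = RecordedClips[Index]; // same index = same pairing
				Spheres.Add(Sphere);
			}
		}
	}
};
```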

  • Getting the component to float & hover away only when the audio has finished recording (see the sketch below)

(in Actor Sphere Blueprint)

(in Actor Sphere Blueprint)
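In rough C++ terms, the hover-away behaviour could look like this: a flag flips when the recording ends, and only then does the sphere start drifting upwards each tick. This is a sketch on top of the hypothetical AMemorySphere above, not the actual graph.

```cpp
// Hypothetical additions to the AMemorySphere sketch: the sphere only starts
// floating away once bHasFinishedRecording is set by the recording logic.
// Assumed members in the class declaration:
//   UPROPERTY(BlueprintReadWrite) bool  bHasFinishedRecording = false;
//   UPROPERTY(EditAnywhere)       float FloatSpeed = 20.f; // cm per second

void AMemorySphere::Tick(float DeltaSeconds)
{
	Super::Tick(DeltaSeconds);

	if (bHasFinishedRecording)
	{
		// Drift slowly upwards to give the "float & hover away" feel.
		AddActorWorldOffset(FVector(0.f, 0.f, FloatSpeed * DeltaSeconds));
	}
}
```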

  • Also needed to stop the already spawned audio from recording again when the same button is pressed, otherwise they (meaning the audio textures) will overwrite each other despite the files already being given different names (using a ‘Called’ boolean; see the sketch below)

(in Actor Sphere Blueprint)
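The ‘Called’ boolean works as a one-shot guard: once a sphere has recorded, further presses of the same button do nothing, so the saved file is never overwritten. A minimal sketch (OnRecordButtonPressed and StartRecording are hypothetical names):

```cpp
// Hypothetical sketch of the 'Called' guard on the sphere.
void AMemorySphere::OnRecordButtonPressed()
{
	if (bCalled)
	{
		return; // this sphere has already recorded; ignore repeat presses
	}
	bCalled = true;

	StartRecording(); // placeholder for whatever kicks off the audio capture
}
```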

  • Setting up the texture/fx to alert the player when they have made contact with the floating sphere

Made using a particle system and the ‘Move to Nearest Distance Field Surface GPU’ module.

Virtual Camera test:

At this point the virtual camera was way too laggy to be usable. Initially I wanted to make it a mobile game, but I would have needed to remove too many things to support the graphics on the mobile version, so I decided to go with VR instead.

The biggest challenges I faced at this stage were:
1. Making an asymmetrical multiplayer game: 1 VR player and 1 PC player
2. Replication, so that the server and client see the same thing happening (see the sketch after this list)
3. Setting up a third-person VR view, because usually it’s always first-person
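For the replication challenge, the usual Unreal pattern is to let the server own the state and mark the properties that clients need to see as replicated. A minimal sketch of that pattern, reusing the hypothetical AMemorySphere flag from earlier (the actual project may replicate different variables):

```cpp
// Minimal sketch of Unreal property replication: the server owns the state
// and marked properties are copied to every connected client.
// In the header:       UPROPERTY(Replicated) bool bHasFinishedRecording = false;
// In the constructor:  bReplicates = true;
#include "Net/UnrealNetwork.h"

void AMemorySphere::GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const
{
	Super::GetLifetimeReplicatedProps(OutLifetimeProps);

	// Both the listening (PC) player and the recording (VR) player should
	// see the same sphere state on their own machines.
	DOREPLIFETIME(AMemorySphere, bHasFinishedRecording);
}
```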

30/3 Update

Stage Creation/Platform!!!

Step 1: Idea

Step 2: Get materials

Instead of getting a foldable stage, getting crates was much cheaper and made more sense since they already had slits, so there was space to insert the bulbs.

Next was working out the electric circuit and the placement of the bulbs. Mainly it was inserting 4 bulbs (2 at the edges, 2 in the middle). Initially I was thinking of doing 3, but the middle part of the crate was blocked with wood. Instead of trying to drill into the wood (which would have been dangerous, since it might no longer be able to support the weight it was originally intended for), I decided to just have 2 in the middle.

The parallel circuit mainly consisted of the 4 bulbs, 4 starters and 4 ballasts, with an additional 12 clips to hold the bulbs in place.

Step 3: Installation

I left the paper sleeve on because the 4 bulbs were brighter than I expected.

Underside view of the crate

Since there was too much light penetration through the slits, I think I will cover it up with a cloth to achieve a more diffused look.

Left to do: Attach the 2nd crate below to make this platform higher, sand the crate for safety, and attach the cloth.

 

Interface update from https://oss.adm.ntu.edu.sg/ho0011an/ill-work-on-the-oss-ltr/

Interface

Updates from the previous stage: I have added movement such that the spawned object moves randomly within a bounding box, and I have experimented with the call to spawn the object.

What I realised is that we don’t speak in one breath; because we break our sentences up as we speak, this would cause many, many objects to be spawned due to the pauses in between.

Instead, I think a more efficient way of doing this would be to have a button that participants can press while they are speaking (sketched below).
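If I go with the button, the logic would be roughly: one press, one spawned object, regardless of how many pauses there are in the sentence. A hypothetical sketch (the ‘Speak’ action mapping, ARecordingPawn and SpawnedObjectClass are placeholder names):

```cpp
// Hypothetical sketch: spawn exactly one object per button press while speaking,
// instead of spawning on every pause detected in the voice input.
void ARecordingPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
	Super::SetupPlayerInputComponent(PlayerInputComponent);

	// "Speak" is a placeholder action mapping (e.g. bound to a controller button).
	PlayerInputComponent->BindAction("Speak", IE_Pressed, this, &ARecordingPawn::SpawnSpeechObject);
}

void ARecordingPawn::SpawnSpeechObject()
{
	// SpawnedObjectClass is an assumed TSubclassOf<AActor> property set in the editor.
	GetWorld()->SpawnActor<AActor>(SpawnedObjectClass, GetActorLocation(), FRotator::ZeroRotator);
}
```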

Right now, I am still not too happy with the way it moves. I wanted more of a zero-gravity, floating kind of movement, but it’s like it’s doing the zoomies now.
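One possible way to get closer to that zero-gravity drift (a sketch of an approach I could try, not what is currently in the Blueprint) is to keep picking random targets inside the bounding box but interpolate towards them slowly, so the object glides instead of dashing:

```cpp
// Hypothetical sketch: glide towards random targets inside a bounding box.
// Assumed members: FBox BoundingBox; FVector CurrentTarget; float DriftSpeed (low, e.g. 0.5f).
void AFloatingOrb::Tick(float DeltaSeconds)
{
	Super::Tick(DeltaSeconds);

	// Pick a new random target once we are close to the current one.
	if (FVector::Dist(GetActorLocation(), CurrentTarget) < 10.f)
	{
		CurrentTarget = FMath::RandPointInBox(BoundingBox);
	}

	// A low interpolation speed gives a slow, zero-gravity style drift
	// rather than the "zoomies" of rushing straight to the target.
	SetActorLocation(FMath::VInterpTo(GetActorLocation(), CurrentTarget, DeltaSeconds, DriftSpeed));
}
```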

Also, the way it spawns is so ugly; it just appears at the target point, but I am looking for more of a blowing-bubbles kind of vibe.