in 20S2-DM3015-TUT-G01-INTERACTIVE ENVIRONMENTS (TUT), Process, Project, Research

13/4 Update

At this point, I was trying to decide how users would hear the recorded audio (the muffled loud noise is me recording; I'm not too sure what's up with the recording). I decided the best approach was to have the user go near the object. This way, a specific audio clip can be played by a specific sphere instead of just playing a random one, which might lead the user to hear the same audio twice.
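The proximity rule above can be sketched in plain C++ (the actual implementation is in Blueprints, so all names here are illustrative, not real engine nodes): each recording is tied to one sphere, and a clip is only selected when the player is within a trigger radius of that sphere.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

float Dist(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Returns the index of the first sphere the player is close enough to,
// or -1 if none is in range. The matching clip shares that index, so the
// player never hears a random (possibly repeated) recording.
int FindSphereInRange(const Vec3& player,
                      const std::vector<Vec3>& spheres,
                      float triggerRadius) {
    for (std::size_t i = 0; i < spheres.size(); ++i) {
        if (Dist(player, spheres[i]) <= triggerRadius) {
            return static_cast<int>(i);
        }
    }
    return -1;  // nothing nearby: play no clip
}
```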

So getting from this stage to the next consisted of:

  • Setting up 2 views: one for the player who is going to listen to the audio, the other for the player recording (will touch on this more later when incorporating the VR portion)
  • Spawning both the spheres and the audio clips into separate arrays and giving them IDs, so that I can match the index of a sphere to the same index of its audio

(in Level Blueprint)
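A rough C++ sketch of the parallel-array idea in the Level Blueprint (names are illustrative, not actual engine types): sphere i and recording i are spawned together and share an ID, so a sphere's index immediately gives the clip to play.

```cpp
#include <string>
#include <vector>

struct Sphere    { int id; };
struct AudioClip { int id; std::string file; };

struct Registry {
    std::vector<Sphere> spheres;
    std::vector<AudioClip> clips;

    // Spawn both at once so the two arrays can never drift out of sync.
    int Spawn(const std::string& file) {
        int id = static_cast<int>(spheres.size());
        spheres.push_back({id});
        clips.push_back({id, file});
        return id;
    }

    // Same index in both arrays -> the clip that belongs to this sphere.
    const std::string& ClipForSphere(int sphereId) const {
        return clips[sphereId].file;
    }
};
```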

  • Getting the component to float and hover away only once the audio has finished recording

(in Actor Sphere Blueprint)
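The gating behaviour can be sketched like this (again illustrative C++, not the actual Actor Sphere Blueprint): the per-tick hover movement is a no-op until a "recording finished" flag is set.

```cpp
struct HoverSphere {
    float height = 0.0f;
    bool recordingFinished = false;

    void OnRecordingFinished() { recordingFinished = true; }

    // Called every tick; hoverSpeed is in units/second.
    void Tick(float deltaSeconds, float hoverSpeed) {
        if (!recordingFinished) return;  // gate: stay put while recording
        height += hoverSpeed * deltaSeconds;
    }
};
```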

  • Also needing to stop already-spawned audio from recording again when the same button is pressed; otherwise the audio files will overwrite each other despite already being given different file names (handled using a ‘Called’ boolean)

(in Actor Sphere Blueprint)
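The ‘Called’ boolean works roughly like this sketch (illustrative names, not the real Blueprint): once a sphere has recorded its clip, further presses of the record button are ignored, so the saved file is never overwritten.

```cpp
#include <string>

struct Recorder {
    bool called = false;   // has this sphere already recorded?
    std::string savedFile;

    // Returns true if a new recording was started, false if ignored.
    bool OnRecordPressed(const std::string& fileName) {
        if (called) return false;  // guard: don't overwrite the clip
        called = true;
        savedFile = fileName;
        return true;
    }
};
```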

  • Setting up the texture/FX to alert the player when they make contact with the floating sphere

Made using a particle system plus ‘Move to Nearest Distance Field Surface GPU’
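The contact-alert trigger itself amounts to an edge-triggered overlap check, sketched here in illustrative C++ (the real version is Blueprint overlap events): the FX should fire once when the player first enters the sphere's radius, not every frame they remain inside.

```cpp
struct ContactAlert {
    float radius;
    bool inside = false;

    // Returns true exactly on the frame contact begins.
    bool Update(float distanceToPlayer) {
        bool nowInside = distanceToPlayer <= radius;
        bool entered = nowInside && !inside;
        inside = nowInside;
        return entered;  // caller spawns the particle FX when true
    }
};
```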

Virtual Camera test:

At this point the virtual camera was far too laggy to be usable. I initially wanted to make it a mobile game, but I would have had to cut too much to support the graphics features on mobile, so I decided to go with VR instead.

The biggest challenges I faced at this stage were:
1. Making an asymmetrical multiplayer game – one player in VR, one on PC
2. Replication – getting the server and client to see the same thing happening
3. Setting up a third-person VR view, because VR is usually first-person