Month: March 2017

Interactive II Progress (28th Mar)


Winzaw and I looked at some tutorials we found on YouTube and tried to mash some of them together.

So the patch records a voice for 8 seconds and plays it back twice with a robotic tone, and this audio playback drives the particles to generate some visual feedback, after which the whole process happens again automatically. I had to create a randomizer to randomize the voice playback so that the voice is altered dynamically each time.
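For anyone curious about the logic outside of Max, here is a minimal Python sketch of the same record-and-randomized-playback loop, assuming the sounddevice and numpy packages. The 8-second window and the play-twice structure come from our patch; the pitch range is an illustrative guess, and the crude resampling shift stands in for whatever robotic effect the patch actually applies.

```python
import random
import numpy as np
import sounddevice as sd

SR = 44100          # sample rate
RECORD_SECONDS = 8  # same 8-second window as the Max patch

def pitch_shift(audio, factor):
    """Crude pitch shift by resampling: factor > 1 raises pitch (and shortens)."""
    idx = np.arange(0, len(audio), factor)
    return np.interp(idx, np.arange(len(audio)), audio)

while True:  # the whole process repeats automatically, like the patch
    print("Recording for 8 seconds...")
    voice = sd.rec(int(SR * RECORD_SECONDS), samplerate=SR, channels=1)
    sd.wait()
    voice = voice[:, 0]

    # Play back twice, each time with a randomized pitch factor,
    # standing in for the randomizer object in the patch.
    for _ in range(2):
        factor = random.uniform(0.7, 1.5)  # illustrative range, not from the patch
        print(f"Playing back at pitch factor {factor:.2f}")
        sd.play(pitch_shift(voice, factor), SR)
        sd.wait()
```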

This is to see what we can do with particles and audio feedback, and how to combine them. We will see how this can be applied to our ideas for the project.


To see the rest of our progress:

21st Mar | 28th Mar | 31st Mar | 4th Apr | 11th Apr | 14th Apr

Final submission

SHARING 11: Nightwalk

“Night Walk” is an interactive narrative project by Google, set in the Cours Julien district of Marseille.

It is a really cool district with a lot of atmosphere and fascinating street art. There is some narration, like an audio guide, and we can wander about to see the neighbourhood, all of which has been captured at night. So with the photos, images, videos, sounds, and interesting facts, we can really become “online tourists” for a while and just enjoy this place from miles across the globe.

Totally cool, so I am sharing it here with all of you. It is best experienced with headphones, by the way!

https://nightwalk.withgoogle.com/en/home 

ASSIGNMENT 4: Alpha Blending Faces

To see other Assignments:

Assignment 1: Magic Mirror  |  Assignment 2: Face Tracking  |  Assignment 3: Selfie Instructor  |  Assignment 4: Alpha Blending  |  Assignment 5: Pixelation Mapping

The assignment requirement for this one was to use Max to blend a face onto my own, so I chose Donnie Yen to morph into.

However, as mentioned in the video, I could not find a way to keep the image looking clear in fullscreen, because in cv.jit there is a need to downsize the input matrix for detection. Hence, this time I presented the final effect and the process together in the same video.

Sensing.

  • Coordinate info from the cv.jit.faces matrix is used to calculate the position of my face with respect to the screen.
  • The cv.jit.faces coordinate info is then used to determine where to put Donnie Yen’s face so that it matches mine.
  • MSP can detect if sounds are made through the microphone. (A rough OpenCV analogue of this sensing step is sketched below.)
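Outside of Max, the face-sensing part can be approximated with OpenCV’s Haar-cascade detector, which returns bounding boxes much like the cv.jit.faces matrix does. This is just a sketch of the idea, assuming opencv-python is installed; the half-size detection mirrors the downsizing cv.jit needs, and the scaled-up coordinates are the kind of values the patch uses for positioning.

```python
import cv2

# Haar cascade bundled with opencv-python, standing in for cv.jit.faces
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect on a half-size image, mirroring the downsizing cv.jit needs
    small = cv2.resize(gray, None, fx=0.5, fy=0.5)
    for (x, y, w, h) in cascade.detectMultiScale(small, 1.3, 5):
        # Scale the box back up to the full frame: these coordinates are
        # what decides where the overlay face should sit over mine.
        x, y, w, h = 2 * x, 2 * y, 2 * w, 2 * h
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("sensing", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```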

Effecting.

  • Donnie Yen’s face follows my face.
  • Donnie Yen’s face fades in upon a sound, like me tapping my computer desk, and fades out on a second tap.

My patch is below, where I have displayed clearly the steps involved: (1) cv.jit.faces, (2) importing and blurring Donnie Yen’s face, (3) alpha blending.
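And here is a hedged Python sketch of steps (2) and (3): blurring the imported face and alpha-blending it with a fade that toggles on a loud tap. It assumes a local image file donnie.png plus the opencv-python and sounddevice packages; the tap threshold is an illustrative guess rather than a value from my patch, and for brevity the sketch blends over the whole frame instead of tracking the face the way the patch does.

```python
import cv2
import numpy as np
import sounddevice as sd

alpha = 0.0          # current fade value, eased every frame
alpha_target = 0.0   # where the fade is heading: 0 = out, 1 = in
cooldown = 0         # blocks to ignore after a tap, a crude debounce
TAP_THRESHOLD = 0.1  # illustrative RMS level for a desk tap

def on_audio(indata, frames, time, status):
    """Toggle the fade target whenever the mic level spikes."""
    global alpha_target, cooldown
    if cooldown > 0:
        cooldown -= 1
    elif np.sqrt(np.mean(indata ** 2)) > TAP_THRESHOLD:
        alpha_target = 1.0 - alpha_target  # first tap fades in, second fades out
        cooldown = 20

# Step (2): import the face image and blur it so the blend looks softer
face_img = cv2.GaussianBlur(cv2.imread("donnie.png"), (21, 21), 0)

cap = cv2.VideoCapture(0)
with sd.InputStream(channels=1, callback=on_audio):
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        alpha += 0.05 * (alpha_target - alpha)  # ease toward the target
        overlay = cv2.resize(face_img, (frame.shape[1], frame.shape[0]))
        # Step (3): alpha-blend my webcam image with the overlaid face
        blended = cv2.addWeighted(frame, 1.0 - alpha, overlay, alpha, 0)
        cv2.imshow("alpha blend", blended)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```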

I learnt quite a lot from this assignment, as it is really the next level of real-time image manipulation. I hope to be able to advance further with the final assignment.

Interactive II Project Proposal (21st Mar)

Our team consists of Winzaw and Fabian.

Here is our progress in chronological order:

21st Mar | 28th Mar | 31st Mar | 4th Apr | 11th Apr | 14th Apr

Final submission

Main Idea:

Our project will be split into two separate endeavors, which will eventually be combined into a seamless interactive experience. We are interested in the distortion of sound and images that respond to each other in a cohesive manner. This video is an example to illustrate.

Aims:

  1. Sound. We want to have the patch constantly recording and playing back things people say to it, so this will probably be the basis for interaction. No buttons or sliders, just purely saying stuff to the patch.
  2. Visual. Based on the pitch and frequency of the sounds playing back, we will get Jitter to generate visuals, either in the form of particle systems or in the form of real-time distortion of the images captured via webcam (see the sketch after this list).
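As a first sanity check of aim 2, here is a tiny numpy sketch of the mapping we have in mind: estimate the dominant frequency of an audio block with an FFT and turn it, together with loudness, into visual parameters. Names like hue and n_particles are purely illustrative; the real mapping would live in Jitter.

```python
import numpy as np

SR = 44100

def analyse_block(block):
    """Map an audio block to visual parameters: pitch -> hue, loudness -> count."""
    spectrum = np.abs(np.fft.rfft(block))
    freqs = np.fft.rfftfreq(len(block), 1.0 / SR)
    dominant = freqs[np.argmax(spectrum)]          # rough pitch estimate
    loudness = np.sqrt(np.mean(block ** 2))        # RMS level
    hue = np.clip(dominant / 2000.0, 0.0, 1.0)     # higher pitch -> warmer hue
    n_particles = int(1000 * min(loudness * 10, 1.0))
    return hue, n_particles

# e.g. a quiet 440 Hz tone should give a lowish hue and a modest particle count
t = np.arange(SR // 10) / SR
hue, n = analyse_block(0.05 * np.sin(2 * np.pi * 440 * t))
print(hue, n)
```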


Timeline:

  • 28th Mar. Working patch for sound recording and playback (with/without distortion).
  • 4th Apr. Working patch for visuals, (i) in terms of particle systems responding to the recorded sound or (ii) in terms of distortion of the webcam-grabbed image (if that is possible).
  • 11th Apr. Connected patch combining the two endeavors, with fine-tuning of the timing and sequencing for interaction.

REFLECTION 3: Learning gameplay and narrative structure from Her Story

A while back we had a lesson where we played Her Story (2015) by Sam Barlow. I must say that the game has an extremely simple interface and yet is so intriguingly composed in its narrative structure.

So you get access to files from a police department which has been interviewing this woman connected to some sort of murder case, I think, and your goal is to find out exactly what happened. You do so by searching the database for keywords connected to the case, or words you think might be clues. Each of these videos is a few minutes long, and you have to read between the lines to know what to search for next in the database. For example, she mentions some dates, names of people, or an item during her interview, and then you can search for those things and see what turns up on the monitor. However, you can only open the first five matching videos for playback.

This is rather connected to the current project that we are working on, and I have also created a search function within our game, although mine is rather primitive compared to Her Story’s. But there is definitely much we can learn from the structure and gameplay elements of Her Story. What I really hope to emulate is the elegant display of information and context. In the computer terminal of Her Story, there is actually some reflection on the monitor screen showing a bit of the lights in the room where, within the context of the game, the user is sitting at the computer. At times, a reflection of the user’s face appears on the monitor screen as well. I found that really immersive and was thrilled to notice this subtle detail, as it certainly completes the look. There is also some fun stuff going on on the side: we can close the program and go to the recycle bin within the game’s terminal to see what files have been deleted, and this adds to the idea that we are really using this terminal to conduct the investigation.

The other thing about Her Story is the cutting of the narrative, almost like jump-cuts in film. Each interview video reveals only a portion of the information, but with just enough to allow you to formulate a certain idea or clue and search for a new piece of evidence. I think this formula is handled very well in Her Story, because throughout the experience of playing it I felt more and more interested in getting to the bottom of the mystery.

I feel that Her Story is certainly inspiring to play and learn from. Although we also wish to devise our own game mechanics and structure, hopefully we can successfully apply some of these learning points to our project as well.

SHARING 9: Looking at Narratives as a Design Problem

We are working on the game now and thinking of the mechanics to create something playable, based on many of the suggestions from the beta-testers we have had, as well as from Prof. Vlad, who is giving us a lot of feedback on his experience of trying to learn to play the game.

A Case Study in Interactive Narrative Design by Carol Strohecker

I would like to share, above, this study I read, where the writer engages in a discussion on looking at these issues as design problems, and how we might then proceed to solve them. She raises some plausible solutions, like the pacing of interactive and non-interactive “chunks”, which is a way of controlling the time and learning curve of the user, as well as creating a dynamic feedback loop which will allow users to check their progress and know that they are sticking to the program.

I think this is similar to some suggestions we have had to create a progress bar of sorts in the game we are making, although I wonder if there can be an elegant representation of that…

SHARING 8: Flying High in Rome’s Interactive Web Experience

http://www.ro.me/film


Rome was originally intended as a concept album for a film.

This interactive narrative experience is powered by WebGL integrated within the Chrome browser. It is a rich graphical interactive experience with a great music soundtrack!

Director Chris Milk is an artist who works primarily with technology to generate emotional resonance. The interactive narrative is inspired by the music of Danger Mouse and Daniele Luppi.


It is a really fun experience. I got to fly around and explore this bizarre yet beautiful world, and everywhere I went there were animals galloping and birds flying around, and plants would just sprout out across the landscape wherever the mouse touched.