MAX Assignment 2: Eye Tracker

Task: Create an eye or movement tracker that follows you as you move left and right

  1. Detection of your face, calculation of the x-coordinate of the midpoint (sensing)
  2. Uploading of eye movement clip, scale the frames of the video to the position of your detected face (effecting)

The first portion of this assignment involves detecting the face, inverting the image, and extracting the coordinates using jit.iter, similar to the Magic Mirror.

[Screenshot: face-detection patch]

New objects/messages used in this assignment: jit.movie @autostart 0, frame_true $1, bang, getframecount

[Screenshots: midpoint calculation (left) and video playback patch (right)]

(left) Calculation of the x-coordinate of the midpoint of the detected face: adding the top-left and bottom-right x-values, then dividing by 2.
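Since Max is a visual language, here is a hypothetical Python sketch of the same arithmetic (the coordinate order and example values are assumptions, not taken from the actual patch):

```python
# Sketch of the midpoint calculation done in the Max patch.
# cv.jit.faces reports a face's bounding box; here it is assumed
# to arrive as (left, top, right, bottom).

def face_midpoint_x(left, top, right, bottom):
    """x-coordinate of the centre of the detected face."""
    return (left + right) / 2

# A face spanning x = 80 to x = 240 has its midpoint at x = 160.
print(face_midpoint_x(80, 50, 240, 210))  # -> 160.0
```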

(right) Loading the video with the read message and jit.movie; playback jumps to a frame within a range determined by scale.
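The mapping from face position to frame number can be sketched in Python. This is only an analog of Max's [scale] object; the 320-pixel camera width and 100-frame clip are assumed example values (in the patch, the frame count comes from getframecount):

```python
def scale(x, in_lo, in_hi, out_lo, out_hi):
    """Linear mapping from one range to another, like Max's [scale] object."""
    return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

def face_x_to_frame(mid_x, cam_width=320, frame_count=100):
    """Map the face midpoint across the camera width to a frame index."""
    return int(scale(mid_x, 0, cam_width, 0, frame_count - 1))

print(face_x_to_frame(160))  # midpoint of a 320-px image -> frame 49
```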

 

[Screenshot: full Assignment 2 patch]

Problems encountered:

The frame jumps abruptly whenever no face is detected.
Proposed solution: if the detected value is ≤ 0, play the middle frame instead.

(To be updated!! Have yet to get this part^ to work)
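As a sketch of the proposed (not yet working) fallback, in Python rather than Max, with the same assumed camera width and frame count as before:

```python
def choose_frame(mid_x, cam_width=320, frame_count=100):
    """cv.jit.faces outputs zeros when nothing is detected, so a
    midpoint <= 0 is treated as 'no face' and the middle frame is held,
    avoiding an abrupt jump."""
    if mid_x <= 0:
        return frame_count // 2
    return int(mid_x / cam_width * (frame_count - 1))

print(choose_frame(0))    # no face detected -> middle frame 50
print(choose_frame(240))  # face on the right -> frame 74
```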

MAX Assignment 1: Magic Mirror

Task: Using Max, create a virtual magic mirror that fades and brightens depending on the distance between the person and the mirror

  1. Detection of the person’s face and the calculation of the size of the face, marked by the green box (sensing)
  2. Inverting the image and changing its opacity, using jit.rgb2luma and prepend frgb (effecting)

What is Max?
Max is a visual programming language that connects objects with virtual patch cords to create interactive sounds, graphics, and custom effects. Like a mind map, sort of.

This is my first experience with Max, and I find it very different from the coding languages we were exposed to last semester. What I like about Max is that the mind-map structure makes it easier to comprehend the function of the programme as a whole. However, the new terms, commands and flow of the programme were a little challenging to grasp.

[Screenshot: basic Max patch]
I started this assignment by learning the basic objects, messages and number boxes, and how to connect them using patch cords.

[Screenshots: coordinate extraction patch (left) and face-detection patch (right)]
(right) The face detection is done using cv.jit.faces, an object that scans a greyscale image for human faces. Hence, it is necessary to place jit.rgb2luma before cv.jit.faces.

(left) The objects jit.iter and unpack separate the coordinates of the detected face into four values. To calculate the area of the detected face (which indicates how close the face is to the screen, since a nearer face produces a larger bounding box), the x and y coordinate pairs are subtracted to obtain the width and height, which are then multiplied together. The area is scaled from its minimum and maximum down to the range 0 to 1 before being input into jit.op, which controls the brightness.
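The same area-to-brightness mapping can be sketched in Python. The minimum and maximum areas here are assumed calibration values, not the ones used in the actual patch:

```python
def face_area(left, top, right, bottom):
    """Width times height of the face bounding box."""
    return (right - left) * (bottom - top)

def area_to_brightness(area, min_area=1000, max_area=40000):
    """Scale the area into the 0..1 range (clipped), as the patch
    does before feeding the value into jit.op."""
    t = (area - min_area) / (max_area - min_area)
    return max(0.0, min(1.0, t))

a = face_area(80, 50, 240, 210)       # 160 * 160 = 25600
print(round(area_to_brightness(a), 3))  # -> 0.631
```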

[GIF: Magic Mirror demo]

Problems encountered:
Couldn’t get the programme to work for a while because I mixed up objects (n) and messages (m), and the integer (i) and float (f) number boxes.

Limitations:
The programme does not work when more than one face is detected on screen.

Overall, it was a great learning experience and I look forward to exploring more features and possibilities with Max! 🙂

Narratives for interaction – Ideation (1)

Topic: Nature/ Natural environment

Concept: Simulation of real life conditions, with educational facts and data

What is it?
An interactive webpage that features two to three different modes, each with its own storyline: forest, sea, and pitch-black mode. Players choose their mode at the start page. Each mode can be narrated in two different ways: text-only or voice-over narration.

Inspirations:

Blindscape is a piece of experimental storytelling that takes place entirely through sound. The narrative is told from the point of view of a man in an authoritarian society who wants to escape his intolerable life by ending it.

I find this game’s story very engaging and immersive, despite its complete lack of visuals. Its focus on sound and narration, paired with interaction along the way (like finding objects in the dark), heightens your senses of hearing and touch and results in a unique player experience.

[Images: Blindscape screenshots]

 

Interactivity:

  1. Motion tracking – swimming gesture using the arms as the story progresses, possible tools/gadgets: camera tracking, gesture tracking software and PIR sensors (detects infrared radiation)
  2. VR surroundings – for a better visual and all-round experience. Players can explore 360 degrees from one spot, to find clues or to learn about their surroundings
  3. Enhanced sound effects and ambient sounds – especially for the pitch-black mode (e.g. rustling of the leaves, clear narration, bubbles underwater)