Task: Using Max, create a virtual magic mirror that fades and brightens depending on the distance between the person and the mirror
Detection of the person’s face and the calculation of the size of the face, marked by the green box (sensing)
Inverting the image, changing its opacity, and processing it with jit.rgb2luma and prepend frgb (effecting)
What is Max?
Max is a visual programming language that connects objects with virtual patch cords to create interactive sounds, graphics, and custom effects. Like a mind map, sort of.
This is my first experience with Max, and I find it very different from the coding languages we were exposed to last semester. What I like about Max is that the mind-map structure makes it easier to comprehend the function of the programme as a whole. However, the new terms, commands and flow of the programme were a little challenging to grasp.
I started off this assignment by learning the basic objects, messages and numbers, and how to connect them using patch cords.
(right) The face detection is done using cv.jit.faces, an object that scans a greyscale image for human faces. Hence, it is necessary to pass the image through jit.rgb2luma before cv.jit.faces.
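Since cv.jit.faces expects a single-plane greyscale matrix, jit.rgb2luma has to come first. The conversion it performs is essentially the standard Rec. 601 luma formula; here is a rough Python sketch of that step (the function name is mine, not part of Max):

```python
def rgb2luma(r, g, b):
    # Standard Rec. 601 luma weights: green contributes most to
    # perceived brightness, blue the least. jit.rgb2luma performs
    # this kind of RGB-to-greyscale conversion in Max.
    return 0.299 * r + 0.587 * g + 0.114 * b
```

For example, a pure white pixel (255, 255, 255) maps to a luma of 255, and pure black stays at 0, since the three weights sum to 1.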
(left) The objects jit.iter and unpack separate the coordinates of the detected face into 4 values. To calculate the area of the detected face (which is directly proportional to how close the face is to the screen), the x and y values are subtracted and multiplied. The resulting area is then scaled from its measured minimum and maximum down to a 0-to-1 range, before being fed into jit.op, which controls the brightness.
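The area-to-brightness step above can be sketched in Python like this. This is only an illustration of the logic, not the actual patch: the function names are mine, and the min/max area values would come from calibrating against your own camera (in Max, the scale or zmap object does this mapping):

```python
def face_area(x1, y1, x2, y2):
    # cv.jit.faces reports the corners of the bounding box;
    # width * height gives the box area, a rough proxy for
    # how close the face is to the screen.
    return (x2 - x1) * (y2 - y1)

def scale(value, in_min, in_max, out_min=0.0, out_max=1.0):
    # Map the measured area range onto 0..1 for jit.op,
    # clamping so faces outside the calibrated range do not
    # push the brightness out of bounds.
    value = max(in_min, min(in_max, value))
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)
```

For example, a 100 x 50 pixel box has an area of 5000; with a calibrated range of 0 to 10000 that scales to a brightness factor of 0.5.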
Couldn’t get the programme to work for a while because I mixed up the (n) objects and (m) messages, and the integer (i) and float (f) number boxes.
The programme does not work when more than one face is detected on screen.
Overall, it was a great learning experience and I look forward to exploring more features and possibilities with Max! 🙂
Concept: Simulation of real life conditions, with educational facts and data
What is it?
An interactive webpage that features three different modes, each with its own storyline – Forest, Sea, and Pitch-black mode. Players choose their mode at the start page. Each mode can be narrated in two different ways – pure text or voice-over narration.
Blindscape is a piece of experimental storytelling that takes place entirely through sound. The narrative is told from the point of view of a man in an authoritarian society who wants to escape his intolerable life by ending it.
I find this game story very engaging and immersive, despite its complete lack of visuals. Its focus on sound and narration, paired with some interaction along the story (like finding objects in the dark), heightens your senses of hearing and touch, and results in a unique player experience.
Motion tracking – a swimming gesture using the arms as the story progresses; possible tools/gadgets: camera tracking, gesture-tracking software and PIR sensors (which detect infrared radiation)
VR surroundings – for a better visual and all-round experience. Players can explore 360 degrees from one spot, to find clues or to learn about their surroundings
Enhanced sound effects and ambient sounds – especially for the pitch-black mode (e.g. rustling of the leaves, clear narration, bubbles underwater)