I am VEJEE: Valerie and shampoo bottles

THE FINAL INDIVIDUAL PROJECT for Interactive 2,

designed to confuse the blur out of me.

FIRST ENCOUNTER: it didn’t work at all. Huh, bummer. At first, none of the patches were sensing me at all.

I tried the different patches thinking they would work, but the red regions stayed highlighted and still did not send bang signals or play the audio input.

Then I remembered that motionregion was a way to help Max distinguish between what was there before and what had just entered the screen input. So I looked into it and, lo and behold, once I refreshed the reference screenshot and adjusted the threshold (to 33), it could tell the background and me apart nicely.
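To make the idea concrete, here is a minimal Python sketch (not the actual Max patch) of what this kind of motion sensing does: compare the current frame against a stored reference frame, and flag the region as active when enough pixels differ by more than a threshold. The threshold value 33 mirrors the one I settled on; the 5% "changed pixels" cutoff is a made-up figure for illustration.

```python
def region_is_active(reference, current, threshold=33, min_changed=0.05):
    """Return True when the fraction of pixels whose brightness differs
    from the reference by more than `threshold` exceeds `min_changed`."""
    assert len(reference) == len(current)
    changed = sum(1 for r, c in zip(reference, current) if abs(r - c) > threshold)
    return changed / len(reference) > min_changed

# Example: a flat background vs. a frame where half the pixels jumped.
background = [100] * 8
with_object = [100, 100, 100, 100, 200, 200, 200, 200]
print(region_is_active(background, background))   # False: nothing moved
print(region_is_active(background, with_object))  # True: big difference
```

Refreshing the reference screenshot corresponds to replacing `reference` with a fresh frame, which is why the patch suddenly started sensing me properly.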

Subsequently, I looked into the more important parts of the project.

Firstly, under pGrabSequence is where the motion boxes were programmed. At first, I saw a list of numbers I could barely put a name to, and realized shortly after that they were the coordinates of the motionregion boxes that would sense the different things I put in front of them!

I then added 4 more boxes, since there were supposed to be 8 music files paired with them.
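A small sketch of what that list of numbers encodes: each motion region box is a set of coordinates, and each box gets paired with one of the 8 audio files. All coordinates and filenames here are made up for illustration, not taken from the patch.

```python
# (left, top, right, bottom) in pixels, one tuple per motionregion box:
# 8 boxes of width 40 laid out side by side across a 320-pixel frame.
boxes = [(x, 0, x + 40, 60) for x in range(0, 320, 40)]

# 8 matching audio files, one per box (hypothetical names).
sounds = [f"sound_{i}.wav" for i in range(8)]

region_to_sound = dict(zip(boxes, sounds))
print(len(region_to_sound))  # 8: one audio file per box
```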

Then came the part where I had to match the motionregion boxes with the music files! I had a very hard time referring back and forth between the trigsound, playsound and motionregion windows, so I decided to remove that function and build it into the main patch itself!

In there, I added one button which would receive the messages sent by motionregion and send a ‘1’ to the audio file whenever there was a threshold difference in the incoming image. However, the audio files were overlapping with each other and getting very messy even when there was nothing in the frame! So the problem was getting the audio to stop running once I had started it.

Then I added a second bang, which turned out to be a great addition: the audio files would then switch off by themselves, as they received a ‘0’ from the bang whenever the motion regions did not detect a threshold difference in the previewed image!
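The start/stop logic the two bangs implement can be sketched like this: when a region reports a threshold difference, send 1 to start its audio file; when it stops reporting one, send 0 so the file switches itself off. This is a hedged Python simulation of the message flow, not the Max patch itself.

```python
class AudioGate:
    """Simulates one motionregion box gating one audio file."""

    def __init__(self):
        self.playing = False
        self.log = []  # messages "sent" to the audio player

    def on_motion(self, active):
        if active and not self.playing:
            self.log.append(1)   # first bang: start playback
            self.playing = True
        elif not active and self.playing:
            self.log.append(0)   # second bang: stop playback
            self.playing = False

gate = AudioGate()
for active in [False, True, True, False, False]:
    gate.on_motion(active)
print(gate.log)  # [1, 0]: one clean start and one clean stop, no overlap
```

Tracking the `playing` state is what prevents the messy overlaps: repeated motion reports no longer re-trigger a file that is already running.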

Hence the final product! I made it with my arms and some spare shampoo bottles I had lying around 🙂 enjoy!

EM3/ IAmMyFather

This class, we learnt how to ‘change our faces’ through face detection in Max/MSP! The patch given seemed fine; however, it wasn’t perfectly smooth and the overlay was not adjusting to my face proportionally, so I tried to fix it!

Problem 1: Resizing.

I tried many things, shifting the numbers in the jit.matrix and jit.movie objects and the boundary steps in the patch.

However, when I tried toggling the boundary step – ($1 $2, dstdimend $3 $4) – and changed it to things like adding to the dimensions ($1 + 50) or bigger numbers like 200, it didn’t really help!

That’s why I tried adding a jit.matrix, as highlighted in the red circle, and it worked better after that! Not perfect, but close enough. So far, I can only think that the jit.matrix helps to limit the space of the image relative to my face!
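My best guess at what the boundary adjustment is trying to do can be sketched as follows: map a source image into a destination rectangle so the overlay scales with the detected face box. Nearest-neighbour sampling here is just a stand-in for whatever jit.matrix does internally; the sizes are made up.

```python
def fit_into(src, dst_w, dst_h):
    """Resize a 2-D list `src` to dst_h rows x dst_w cols (nearest neighbour)."""
    src_h, src_w = len(src), len(src[0])
    return [[src[r * src_h // dst_h][c * src_w // dst_w]
             for c in range(dst_w)]
            for r in range(dst_h)]

face = [[1, 2], [3, 4]]         # tiny stand-in for the overlay image
resized = fit_into(face, 4, 4)  # pretend the face box was detected as 4x4
print(len(resized), len(resized[0]))  # 4 4: overlay now matches the box
```

In this picture, the extra jit.matrix acts like the fixed destination grid: whatever comes in gets confined to its dimensions, which would explain why the overlay stopped drifting out of proportion.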

And sooooo, this is the final product of my face mashed up with my dad’s, and may I say we look alike, with the likeness of a pig :”))  #FAMILY

I hope I can explore this function in the near future and make it applicable to more people on the screen!


FINAL SUBMISSION: DDD DAILY DRINK DILEMMA

Done by: Tan Chloe and Ye Min Valerie

Presenting our Final project powerpoint slides and process video!

 

And a clip of our project during submission day! (CR: Jakeru)

Max was very tricky to work with alongside other apps like Arduino and FaceOSC, but through much troubleshooting in the days before the final submission, we (mostly Chloe) really tried to find out why the applications were not communicating with each other as smoothly as they should. As a result, a lot of delays and different bang buttons were added to the patcher.
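One reason the extra delays helped can be sketched like this: when messages from an external app arrive in rapid bursts, a small debounce window drops the repeats so the patch only reacts once per real event. This is an illustrative Python sketch, and the 200 ms window is a made-up figure, not what our patch used.

```python
def debounce(events, window=0.2):
    """Keep only events that arrive at least `window` seconds after the
    last kept one. `events` is a list of (timestamp_seconds, payload)."""
    kept, last = [], None
    for t, payload in events:
        if last is None or t - last >= window:
            kept.append((t, payload))
            last = t
    return kept

# A burst of three near-simultaneous "face detected" messages, then one more.
burst = [(0.00, "face"), (0.05, "face"), (0.10, "face"), (0.50, "face")]
print(debounce(burst))  # [(0.0, 'face'), (0.5, 'face')]
```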

If we were to do this project again, we hope to find better applications and sensors to assist the patch! And maybe we could even venture into things better suited to it, such as simple sounds or even a full music piece!