To be frank, I wasn't sure how to make this patch unique, so I reused my previous eye-tracking patch with a minor adjustment: the repetitive pattern tiles the eye into a wall of continuous eyes. Since it all happens in real time, it feels like a trypophobe's nightmare.
This still works with parts of the previous patch, by zooming further into the eye and adjusting the darkness of the clip.
In this exercise, I used the location from the face tracker, cropped down to only the eye area, scaled it accordingly, and delayed the video of the eye using jit.matrixset. After that, I mapped it back onto the original location of the face, creating a trippy four-eyed-monster effect.
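Conceptually, the jit.matrixset delay works like a ring buffer of stored frames that you read back a few steps late. Here is a minimal Python sketch of that idea (not Max code; the class and names are just for illustration):

```python
from collections import deque

class FrameDelay:
    """Ring buffer that plays frames back N steps late, similar in
    spirit to writing frames into jit.matrixset and reading an
    older index back out."""

    def __init__(self, delay_frames):
        self.buffer = deque(maxlen=delay_frames + 1)

    def process(self, frame):
        self.buffer.append(frame)
        # Until the buffer fills up, just return the oldest frame we have.
        return self.buffer[0]

delay = FrameDelay(delay_frames=3)
outputs = [delay.process(f) for f in range(10)]  # feed in frames 0..9
# outputs lag the input by 3 frames once the buffer is full:
# [0, 0, 0, 0, 1, 2, 3, 4, 5, 6]
```

The delayed frame is then what gets composited back onto the face position, which is why the second pair of eyes trails behind the real ones.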
There are a few things that I’m unable to solve.
1 – the alpha mask gives a rectangular box, and I'm unable to remove the edge even with blur.
2 – the additional eye is too sensitive and looks like it's vibrating; maybe I could smooth it with [line] afterwards. I added the eye tracker driven by the values from the jit.face box, and while doing it, I realised the effect looks really cool using only jit.face with srcdimstart and srcdimend: it feels like a really old way of filming a drunken scene.
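The eye region itself is just a sub-rectangle derived from the face bounding box, which is what srcdimstart/srcdimend get fed. A rough Python sketch of that mapping (the proportions below are my own assumptions about where eyes sit in a face box, not anything from Max):

```python
def eye_crop_rect(face_left, face_top, face_right, face_bottom):
    """Given a face bounding box (jit.faces-style coordinates),
    return an approximate eye-region rectangle to feed to
    srcdimstart/srcdimend.  The 0.15/0.25/0.45 proportions are
    rough guesses, not defaults from the patch."""
    w = face_right - face_left
    h = face_bottom - face_top
    left = face_left + int(0.15 * w)
    right = face_right - int(0.15 * w)
    top = face_top + int(0.25 * h)
    bottom = face_top + int(0.45 * h)
    return left, top, right, bottom

# Example: a 100x100 face box whose top-left corner is at (50, 40)
print(eye_crop_rect(50, 40, 150, 140))  # -> (65, 65, 135, 85)
```

Because the face box jitters frame to frame, this rectangle jitters too, which is exactly why the extra eye "vibrates" and why smoothing the coordinates would help.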
Through this disastrous week, we focused on getting the final batch of butterflies for the presentation, writing the Max patch and Arduino code, purchasing a 19-inch monitor (which got ruined by a single raindrop), and setting up the "habitat" for the butterflies in the acrylic case we made.
As for the butterflies: since we released all eight of our butterflies last week under the agreement with Mr Foo, the conservationist, we had none on hand for the project. As he had agreed previously, he would give them to us in batches, and this week he directed me to a school teacher, Sarah, who gave me two Leopard Lacewing chrysalises. Since we knew two was not enough, we decided to try our luck catching more butterflies a few days before the presentation, just to make sure the project would at least be visually attractive for the final submission.
Both butterflies are female, although we had hoped for males, which are more visually appealing. One of the two butterflies Sarah gave us has deformed wings and cannot fly very well, but it is otherwise relatively healthy.
We went to Yew Tee Park on three separate occasions. On the first, we bought a short net that was unable to reach the butterflies, so we bought a long extension for the net the following day.
Problem: Not enough butterflies.
Solution: Catch them.
During this week, we also used jit.blob and [jit.op @op absdiff] to compare colours between frames and locate the centre of each blob. The problem was that we could only unpack multiple blobs into a single XYZ value. We looked online, tried many different patches, and finally settled on [jit.spill @plane (number)] -> [unpack 0. 0. 0.] to unpack many blobs into many XYZ coordinates, which lets us track multiple blobs' locations at once.
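The pipeline (absdiff between frames, threshold, then one centroid per blob) can be sketched in plain Python. This is a conceptual stand-in for the Jitter objects, not the patch itself; frames here are 2D lists of grayscale values:

```python
def diff_blob_centroids(prev, curr, threshold=30):
    """Frame-difference blob finder in the spirit of
    [jit.op @op absdiff] -> threshold -> centroid per blob,
    like pulling one XY pair per plane out of [jit.spill].
    Changed pixels are grouped into 4-connected blobs."""
    h, w = len(curr), len(curr[0])
    changed = [[abs(curr[y][x] - prev[y][x]) > threshold
                for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if changed[y][x] and not seen[y][x]:
                # flood-fill one blob
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and changed[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cx_mean = sum(p[1] for p in pixels) / len(pixels)
                cy_mean = sum(p[0] for p in pixels) / len(pixels)
                centroids.append((cx_mean, cy_mean))
    return centroids
```

Two butterflies moving in different corners of the frame would come out as two separate (x, y) centroids, one per blob, which is exactly what we needed for multi-point tracking.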
Using those coordinates, we also followed a tutorial that taught us how to use jit.lcd: we used the coordinates from the blob centroids to draw the lines in the LCD, and we used a timer function to change the RGB value of the lines.
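One simple way to turn a running timer into a changing line colour is to cycle the hue over time. This Python sketch shows that idea; our actual patch may have mapped the timer differently, so treat the cycle length as an assumption:

```python
import colorsys

def line_color(elapsed_ms, cycle_ms=3000):
    """Map elapsed timer milliseconds to an RGB triple by cycling
    the hue once per cycle_ms.  A rough equivalent of driving the
    jit.lcd line colour from a timer (the 3000 ms cycle is made up)."""
    hue = (elapsed_ms % cycle_ms) / cycle_ms
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return int(r * 255), int(g * 255), int(b * 255)

print(line_color(0))     # -> (255, 0, 0): red at the start of a cycle
print(line_color(1000))  # -> (0, 255, 0): green a third of the way in
```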
Up till here, we found out from Prof LPD that it is very hard to map many attractor points for the particle system, as the particles only "move to a destination" rather than being truly "attracted to" a point. So we decided to find another patch to learn from and build ours on.
This is a screenshot of the final patch we used for the presentation.
Just a day before the presentation, we decided not to use the particle system; more isn't always better, and it's the effect we wanted that matters. Basically, we found a way to switch to the external camera we have (vdev/input). Since the external webcam is wider than it is tall, we let it lie on its side, then rotated and scaled it to fit our screen size (jit.rota). We also changed the original patch from a standard (importmovie colorswatch.png bang) to (qmetro 500 -> jit.noise 3 char 6 6), which produces a new 6-by-6 grid of random RGB boxes every half a second, changing the look of the fluid effect.
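For anyone unfamiliar with (jit.noise 3 char 6 6), it just generates a 6x6 matrix with three 8-bit planes of random values every time it's banged. A tiny Python stand-in:

```python
import random

def noise_matrix(width=6, height=6, planes=3):
    """Rough stand-in for (jit.noise 3 char 6 6): a height x width
    grid of random 0-255 values per plane.  In the patch, qmetro 500
    retriggers this every half second."""
    return [[[random.randrange(256) for _ in range(planes)]
             for _ in range(width)]
            for _ in range(height)]

m = noise_matrix()  # 6 rows x 6 columns x 3 channels of random chars
```

Feeding this into the fluid effect instead of a static colour swatch is what keeps the colours churning.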
Next, after we solved the fluid effect, we built a very simple patch that uses (serial c 9600) to send an integer over to the Arduino, which is linked to a relay controlling a 220 V table fan that turns off after 8000 ms.
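The Arduino-side logic is simple: any incoming byte switches the relay on, and the relay drops out once 8000 ms have passed. Here is a Python simulation of that timing logic (class and method names are illustrative, not our actual Arduino sketch, which is written in C++):

```python
FAN_ON_MS = 8000  # matches the 8000 ms cutoff in the real sketch

class FanRelay:
    """Simulation of the relay logic: a value arriving over serial
    turns the fan on, and update() releases the relay once 8000 ms
    have elapsed since the trigger."""

    def __init__(self):
        self.on = False
        self.on_since_ms = None

    def receive(self, value, now_ms):
        if value > 0:
            self.on = True
            self.on_since_ms = now_ms

    def update(self, now_ms):
        if self.on and now_ms - self.on_since_ms >= FAN_ON_MS:
            self.on = False
        return self.on

fan = FanRelay()
fan.receive(1, now_ms=0)
print(fan.update(4000))  # True: still within the 8000 ms window
print(fan.update(9000))  # False: relay released after 8000 ms
```

On the Max side, all the patch has to do is push one integer through (serial c 9600) whenever motion is detected.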
And that's roughly it for the code we wrote this week. We also settled the logistics: a self-made stand so the screen with the generated visuals sits under the butterfly casing, flowers for the butterflies to rest on (which also add to the overall visual appeal of the whole project), and lastly we searched online for the species of butterflies we have and wrote up information about our project and the butterflies.
We purchased the second-hand monitor on Carousell and it was working. WAS. Then, on the way to school in the rain, a drip of rainwater got into the system and killed it the moment we turned it on. Talk about bad luck. So now we had a spoilt monitor and still needed one. We searched online for more monitors, and in the end we borrowed a 22-inch monitor from the IT department, slightly oversized but great nonetheless.
While I was doing this, SuHwee was carefully arranging the flowers. Although it may look easy, getting the balance right while considering the size and the butterflies is quite hard.
We put the screen and the butterfly box together, attached the information board, transferred the butterflies in, tore off the paper protecting the acrylic, and we're done!
We will update/upload the video for our project soon!
This is the first time I've seen a butterfly emerge from its chrysalis. These Lime Butterfly chrysalises were given to us by our friend YaYu.
These three different species were given by the conservationist Mr Foo. There were a total of 7 chrysalises: 5 Leopard Lacewing, 1 Plain Tiger and 1 Lemon Emigrant. All 5 Leopard Lacewings emerged this week, and the other 2 should come out tonight or tomorrow morning.
We did a lot of butterfly-related tasks this week: getting the chrysalises from Mr Foo, setting them up for documentation, waking up way too early to film the eclosion process, laser-cutting and building the final casing for the butterflies and the presentation, researching what butterflies eat and how to feed them, and buying curry leaves from the supermarket, along with vegetables, for the butterflies to live with. We also tried to get a second-hand monitor from an online seller, but the seller did not turn up and did not reply. Our 160-degree wide-angle webcam has arrived from China, and we will test it out with Max really soon.
While I was writing this post, the Lemon Emigrant eclosed! (And I didn't see it or video it!) Now that the Lemon Emigrant is out, the last one to come will be the Plain Tiger.
As for coding, we researched our Max patch further and spent most of our time studying and trying to merge/tweak the standard patch, as well as what other people did for their projects. Some of the patches were too hard to understand until today's lesson, when LPD explained most of what they do; now we understand (slightly) what the patches are doing.
From this week to next week we will focus mainly on the Max patch, aiming for something that roughly works in terms of butterfly/motion tracking, visual generation, and output to control a real-life appliance using UDP or the wireless I-CubeX.
Write up a rough Max patch for motion tracking using the camera, with particle/visual generation from the points/blobs being tracked.
Catch more caterpillars/cocoons from different areas and see how long they take to become butterflies + research more about butterflies.
Actual Milestones Done in Weeks 10-11
Purchased the materials and built the box for the project using acrylic.
Purchased a mini webcam and tested it with Max, the viewing angle is
Tried to find caterpillars in the wild, and also met two community gardeners in different locations who said no caterpillars had been seen recently.
Managed to get two butterfly chrysalises from a friend.
Managed to talk to a conservationist in Singapore, who agreed to give us chrysalises of different butterfly species for our project.
Searched for particle systems and tutorials for Max and have been trying to make them work.
We went to Dama in Ubi to purchase the acrylic to build the box for this project. The box measures 2.5 ft by 1 ft by 1 ft (76 x 30 x 30 cm), and a net will be attached to one side of the box for ventilation.
Helped the chrysalises shift house from the insect container to a proper container for a better documentation process, and to prepare for the timelapse of the emerging process.
The lid of the container was replaced with a net hot-glued in place, for proper ventilation for the chrysalises.
Also, I joined a butterfly community in Singapore and asked for people to help me get caterpillars or chrysalises, and Mr Foo, a conservationist in Singapore, agreed to give us some chrysalises for the project!
During this week, I tried out NPC chat triggered by colliding with NPCs. Unity's Render Texture camera is applied to each NPC so that their "display picture" is rendered in real time and shown in the dialogue. Seeing this, Vladimir suggested that I make the camera mouse-controlled, as most games on the market are mouse-controlled, so it will be more user-friendly. I'll try to figure this out soon.
This will be the basic system needed to complete the game. By talking to one NPC, another will be spawned or despawned. If I place the characters where the spawning of new characters can't be seen, it creates a sense of exploration: users will see different things in the same place even if they pass through the same area more than once. I think users might want to see the different stories in the game. With this spawn/despawn system, I could make a linear game with different story paths depending on which NPC the user talks to, spawning the path for them while closing all other paths by despawning the NPCs for those paths. I think this will be fun, as different users could reach different storylines within one game.
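The branching logic above can be sketched as a tiny state machine. This is language-agnostic pseudologic in Python, not Unity C#, and the NPC names are made up for illustration:

```python
class NPCWorld:
    """Toy version of the spawn/despawn branching: talking to an NPC
    despawns the current set and spawns only the NPCs for the chosen
    story path, closing off the other paths."""

    def __init__(self, branches):
        self.branches = branches     # npc -> list of NPCs to spawn next
        self.active = set(branches)  # start with all root NPCs spawned

    def talk_to(self, npc):
        if npc not in self.active:
            return
        # Close every other path by despawning its NPC...
        self.active.clear()
        # ...and open only the branch the player chose.
        self.active.update(self.branches.get(npc, []))

world = NPCWorld({"guide": ["artist"], "guard": ["thief"]})
world.talk_to("guide")
print(world.active)  # only the 'artist' path remains open
```

In Unity this would translate to calling something like SetActive(false) on the closed paths' NPC objects and SetActive(true) on the chosen path's, driven from the dialogue-end event.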
Exporting to phone is not an easy process, as I needed to download different software like the JDK (Java Development Kit) to make it work. I'm still not sure how it all fits together, since it failed countless times. I also switched my phone into developer mode, and somehow the app appeared on my phone once I'd exported it from my computer. For most of this I followed this video, and I just tried again after each failure.
This is what I have after the export. It is basically too laggy to be played properly; the experience isn't worth the time at this level of lag, and it would be rage-inducing to play.
After this, I decided to scrap the idea of making a phone game and focus on the computer platform, as computers have much higher processing and rendering power.
This is the final model I built for ADM RPG. It took more than five weeks in total to build to this scale, and the only parts not built by me are the railings on the rooftop and on levels 1 and 2 inside ADM. There are still many incomplete areas, like the 2nd-floor Foundation side (the scale is wrong, but everything else works) and the 3rd and 4th floors, but I could not go back to edit them as my SketchUp trial expired. I tried many ways to reinstall SketchUp, but apparently it doesn't work once you've updated to SketchUp 2017. This is an animation I did with SketchUp's animation feature. Since adding detail is a never-ending process (I could add much more, down to every single object in ADM, every table placement, the sprinklers on the rooftop and so on), but I need to proceed to the next step or I will never complete the game, I will leave the model as it is, since it is already playable. IT IS NOT PERFECT BUT IT'S WORKING AND COMPLETED!
This is the process by which I added collision to the fence (it was downloaded and doesn't have a mesh usable as a collider within Unity, so I needed to create my own).
Explanation of what I did in this video:
Create a rectangular box and scale it according to the angle and size of every fence (which took a lot of time; this is a timelapse video at 30x speed)
Place the rectangular box so it covers the fence
Add a Physics "Mesh Collider" component to the rectangular box
Turn off the "Mesh Renderer" so the box disappears from sight
Collision still works after the renderer is turned off
Try the collisions out and spot possible bugs (getting stuck and such)
After this process, I have a full collision system: all the current objects have collision, the walkable areas are walkable, while places like outside the ADM area and the water at the Sunken Plaza are off limits.
Next up, I will add NPCs and interactable objects to the game, write in narrative for them, and refine the game till the end of the semester.
After trying out the tutorial in the last post, I found another one which is much better!
I went through this tutorial twice by accident, as my save file got overwritten by another file when I tried to explore the Asset Store. So yeah, this is the most basic of Unity: the way to move something without animation.
ANDDDDD THE MAIN POINT OF THIS POST!
This is Unity Chan!!!
most probably the character I am going to use for my 3D game. Unity-chan is a free-to-use character model by Unity (I am sure of it, as I've read all of the licensing: the Grant of License, the Conditions of Use, as well as the official FAQ page. *They even allow normal commercial use*)
After this tutorial, it is still insufficient for the basic controls of the game, as many mechanics are lacking (camera panning; the character doesn't even move well), but I still gained important knowledge from it and from trying it out. This tutorial focuses more on animation than movement.
I also did my own scripting and modified it so the character animates and moves slightly better than in the tutorial. So yeah, that's about it: exploring Unity and learning lots of stuff in it. Maybe one day I could be a game producer 😉
After a few more attempts and follow-ups on the project, I managed to make the camera follow and pan according to the character's movement, which I really like; this should be the base for the final project.
This exercise was built on top of the previous one: we reuse the majority of Exercise 1's code and add playback of another video, playing the frame corresponding to the x position of the detected face.
I also added an "if" statement that makes the video "look" at the middle when no face is detected, and a "line" so that the face turns gradually (only somewhat, as I don't have enough frames in my clip to make it really smooth).
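The core mapping, face x position to frame index with a middle-frame fallback, can be sketched like this in Python. The 320-pixel camera width is an assumed dimension, not necessarily what the patch used:

```python
def target_frame(face_x, frame_count, input_width=320, face_found=True):
    """Map the detected face's x position to a frame index of the
    playback clip; fall back to the middle frame when no face is
    found, mirroring the [if] fallback in the patch."""
    if not face_found:
        return frame_count // 2
    x = min(max(face_x, 0), input_width)  # clamp to the camera frame
    return round(x / input_width * (frame_count - 1))

print(target_frame(160, 101))                    # -> 50 (face centred)
print(target_frame(0, 101))                      # -> 0  (face far left)
print(target_frame(999, 101, face_found=False))  # -> 50 (fallback)
```

In the patch, [line] then interpolates between successive frame indices instead of jumping straight to the target, which is what makes the turn gradual.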
In this exercise, I've learnt how to:
play back a certain part of a video
use the "if", "then" and "else" statements properly in Max