
Final progress

Since the last process post in week 7, there have been some pretty drastic changes to my MAN work, so I thought I would do a final update post!

In the previous post (here) I tried adding lighting, shadow and texture to the spheres in Processing. However, I still wasn't happy with the appearance and the lack of movement. Referencing some code that used a movement algorithm derived from the way planets orbit in outer space, I gave the spheres motion; I also scrapped the galaxy background and added a moving starfield background instead, which was simpler and looked nicer. The result was the following:
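For the curious, the orbit idea boils down to a gravitational pull toward a central "sun", with each planet launched sideways so it falls into an orbit. This is a minimal 2D sketch with my own variable names and constants, a simplified reconstruction rather than the actual code I referenced:

```
Planet[] planets = new Planet[6];
PVector sun;
float G = 600;         // illustrative gravitational constant
float sunMass = 20;    // illustrative mass for the central body

void setup() {
  size(960, 540);
  sun = new PVector(width / 2, height / 2);
  for (int i = 0; i < planets.length; i++) planets[i] = new Planet();
}

void draw() {
  background(0);
  noStroke();
  fill(255, 200, 0);
  ellipse(sun.x, sun.y, 20, 20);   // the "sun" everything orbits
  fill(255);
  for (Planet p : planets) {
    p.attractTo(sun);
    p.update();
    p.show();
  }
}

class Planet {
  PVector pos, vel, acc;
  float mass = random(1, 4);

  Planet() {
    // Start at a random distance from the sun...
    PVector offset = PVector.random2D().mult(random(80, 240));
    pos = PVector.add(new PVector(width / 2, height / 2), offset);
    // ...with a sideways velocity roughly sized for a circular orbit
    vel = offset.copy().rotate(HALF_PI).setMag(sqrt(G * sunMass / offset.mag()));
    acc = new PVector();
  }

  void attractTo(PVector sun) {
    PVector force = PVector.sub(sun, pos);
    float r = constrain(force.mag(), 20, 400);   // avoid huge forces up close
    force.setMag(G * sunMass * mass / (r * r));  // F = G * mSun * m / r^2
    acc.add(PVector.div(force, mass));           // a = F / m
  }

  void update() {
    vel.add(acc);
    pos.add(vel);
    acc.mult(0);   // clear the accumulated force each frame
  }

  void show() {
    ellipse(pos.x, pos.y, mass * 4, mass * 4);
  }
}
```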

I was happy with the way the planets orbited and with the appearance of the background. I initially added gradient colour textures to the spheres to give them a more interesting look, but it wasn't working for me, so I eventually reverted to white. I also didn't like how the spheres got stretched out when I put the piece into the Media Art Nexus dimensions. And because the code used a 3D renderer with so many spheres, it ran really slowly every time, which made it a hassle to test and to record videos. So at the last minute, I decided to scrap it almost completely and came up with something that was still similar but simpler, and honestly, to me, a lot nicer.

This was the final video! I was much happier with how it looks, and it was more cohesive and easier to work with as well. I kept the background from the previous versions, but instead of spheres in the 3D renderer I stuck with Processing's default 2D renderer, which let me make the particles smaller and the whole sketch run much more smoothly. Instead of the gravitational orbit movement algorithm, I switched to an attraction/repulsion algorithm, and I much preferred how it looks. I also used some code that mimics the flocking patterns of migratory birds for the pink particles, which represent the Vermillion Phoenix; I thought it was quite fitting, and it added some interesting effects to the composition (a condensed sketch of the flocking rules is below).
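The flocking behaviour follows the classic boids rules: separation (push apart from close neighbours), alignment (match neighbours' headings) and cohesion (drift toward the local centre). The weights, radii and colours below are illustrative, not the values from my final piece:

```
ArrayList<Boid> flock = new ArrayList<Boid>();

void setup() {
  size(960, 540);
  for (int i = 0; i < 80; i++) {
    flock.add(new Boid(random(width), random(height)));
  }
}

void draw() {
  background(0);
  noStroke();
  fill(255, 105, 180);   // pink, for the Vermillion Phoenix particles
  for (Boid b : flock) {
    b.flock(flock);
    b.update();
    b.show();
  }
}

class Boid {
  PVector pos, vel, acc;
  float maxSpeed = 3, maxForce = 0.05;

  Boid(float x, float y) {
    pos = new PVector(x, y);
    vel = PVector.random2D().mult(random(1, 3));
    acc = new PVector();
  }

  void flock(ArrayList<Boid> boids) {
    PVector sep = new PVector(), ali = new PVector(), coh = new PVector();
    int count = 0;
    for (Boid other : boids) {
      float d = PVector.dist(pos, other.pos);
      if (other != this && d < 50) {
        sep.add(PVector.sub(pos, other.pos).div(max(d, 1)));  // push apart
        ali.add(other.vel);                                   // match heading
        coh.add(other.pos);                                   // gather positions
        count++;
      }
    }
    if (count > 0) {
      acc.add(steer(sep.div(count)).mult(1.5));           // separation
      acc.add(steer(ali.div(count)).mult(1.0));           // alignment
      acc.add(steer(coh.div(count).sub(pos)).mult(1.0));  // cohesion
    }
  }

  // Steer toward a desired direction, limited by maxForce (Reynolds steering)
  PVector steer(PVector desired) {
    if (desired.mag() == 0) return new PVector();
    return PVector.sub(desired.copy().setMag(maxSpeed), vel).limit(maxForce);
  }

  void update() {
    vel.add(acc).limit(maxSpeed);
    pos.add(vel);
    acc.mult(0);
    // Wrap around the edges so the flock stays on screen
    pos.x = (pos.x + width) % width;
    pos.y = (pos.y + height) % height;
  }

  void show() {
    ellipse(pos.x, pos.y, 6, 6);
  }
}
```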

While I have exported it as a video for the MAN wall, if you run the code in Processing it is actually an interactive generative art piece: the movement of the particles is mouse-controlled by the audience (a small sketch of that interaction is below)! Given more time and opportunity, I would like to explore how a Kinect could be used as the input for this interaction.
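The mouse interaction works along these lines: particles are attracted toward the cursor from far away but repelled inside a small radius, so they swirl around wherever the audience points. The force constants here are my own illustration, not the ones from the piece:

```
ArrayList<PVector> pos = new ArrayList<PVector>();
ArrayList<PVector> vel = new ArrayList<PVector>();

void setup() {
  size(960, 540);
  for (int i = 0; i < 300; i++) {
    pos.add(new PVector(random(width), random(height)));
    vel.add(new PVector());
  }
}

void draw() {
  background(0);
  noStroke();
  fill(255);
  PVector mouse = new PVector(mouseX, mouseY);
  for (int i = 0; i < pos.size(); i++) {
    PVector p = pos.get(i), v = vel.get(i);
    PVector toMouse = PVector.sub(mouse, p);
    float d = max(toMouse.mag(), 5);
    // Attract from far away, repel inside a 60 px "personal space" radius
    float strength = (d > 60) ? 0.5 : -2.0;
    v.add(toMouse.setMag(strength * 100 / d));
    v.limit(8);     // cap speed so the jolt near the cursor stays controlled
    v.mult(0.95);   // drag, so the motion settles instead of exploding
    p.add(v);
    ellipse(p.x, p.y, 4, 4);
  }
}
```

Swapping the mouse position for tracked hand or body coordinates from a Kinect would be the natural next step for the interaction.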