
Reflection on GL

Before Bin started his presentation, upon hearing the title ‘Automated Utopia’, the first imaginary that popped into my mind was of humans not having to move at all, with everything, and I mean EVERYTHING, controlled by robots. We would only ever feel one emotion, happiness, because we literally would not have to do anything, and that would be just perfect.

After hearing Bin’s lecture, this imaginary has morphed drastically. More importantly, why?

The main takeaway I had from the lecture was how emotions can actually be conveyed through AI, through technology. Personally, I think I had limited my imagination of AI to certain areas. In my head, it was only prominent in warfare and health diagnostics, basically the “boring stuff”. However, through his lecture, I came to realise that there are many deeper questions about the existence of AI and the emotional sentiments behind this topic.

The first piece that struck this chord for me was The Heavenly Creature, a segment of Doomsday Book.

This film centres on a robot repairman who is sent to a Buddhist monastery, where a robot has seemingly achieved enlightenment due to its inherent lack of human flaws. One thing that got me thinking is that no one in the film ever mentions that he is a robot, even though he looks so distinct from everyone else, and most characters (except the manufacturer and the technician) treat him as if he were just like them: human.

So… is he life? Bin posed this question to us: what does it mean to be alive and to exist? Is it limited to a physical body? It seems to me that humans have become too used to associating every emotion we can think of with a facial expression, but Bin challenges us to think further: maybe robots and humans share a kind of symbiotic relationship, where AI could help us reach our true potential, freeing us from the most laborious jobs and allowing us to be more introspective.

This reminds me of Baymax, from the movie Big Hero 6.

Baymax is an inflatable robot designed to serve as a personal healthcare companion to Hiro Hamada, and the two share a comradeship that goes beyond programming, as seen in the ending.

*get ready for spoilers*

At the end, Baymax and Hiro enter a teleportation portal to save a girl who has been trapped inside for a long time. Just as they are about to leave the portal before it closes forever, with the girl in hand, debris hits Baymax and he loses his ability to fly. In perhaps the saddest moment in Disney history, Baymax tells Hiro that he can use his rocket-fist armour piece to get Hiro and the girl to safety, but since Baymax can no longer fly, he will stay behind. Then, the killer moment: Baymax asks Hiro to tell him that he is satisfied with his service so that he can power down. Which. He. Did.

To me, this illustrates the emotional side of AI that Bin is trying to suggest. Baymax, the robot, gave his all for his owner even though it meant he could ‘die’. Yet, even knowing that Baymax is a robot and not a living being, why does this still make for such a sad ending? It also says something about the nature of this technology: it almost seems as if the robots always give and the humans always benefit.

The second piece that made me realise there are emotional qualities in data is Alexandra Daisy Ginsberg’s The Substitute.

This piece explores a paradox: our preoccupation with creating new life forms, while neglecting existing ones. A northern white rhino is digitally brought back to life, informed by developments in the human creation of artificial intelligence (AI). Based on research from AI lab DeepMind, the rhino performs as an artificial agent, an autonomous entity that learns from its environment. A life-size projection, 5m wide, shows the artificial rhino roaming in a virtual world, becoming more “real” as it comprehends the limits of the space.

This sensitive use of data got me thinking: what does it take to truly feel? Why is it that, even though we know and can see that the rhino is formed out of pixels, we still feel for it? My theory is that the pixels themselves may not relate to us, yet the technology is able to produce a theatrical experience that helps us feel.

I would like to draw a parallel to the piece ‘We Live in an Ocean of Air’, which was first showcased in December 2018 at the Saatchi Gallery, London.

The motivation behind this piece is to reconnect humans with nature by showing the invisible, by materialising our symbiotic relationship with the plant kingdom.

In this VR space, we are able to enter the tree, where we see the vast flow of forces within a living tree and how it survives. Of course, what we see is only a figurative representation of photosynthesis; this is not how photosynthesis actually works.

But as mentioned in the book Ways of Seeing, even though what we see might not be objective fact, the testimonial value of images makes them extremely powerful. A write-up on Medium noted that, upon entering the tree, it looked so solid that the writer felt an emotional resistance, which goes to show that the creators have, through VR, successfully engaged participants emotionally and led them to reconsider their position in relation to the plant kingdom. The piece provides an alternative platform from which to view the challenges our planet faces in the 21st century, and, through technology, helps us reflect on our dependence on, and responsibility towards, the organisms we share the earth with.

In conclusion, just as we created the bucket because our hands cannot carry water, and the wheel because our legs are imperfect, I believe we might one day rely on AI to help us express our feelings. In fact, we technically already do, and these artworks are forerunners of that claim. Currently, technology simulates our sight and our hearing, but it has yet to reach the stage where all five senses are engaged. I believe that when this happens, we might be close to the idea of an ‘Automated Utopia’. We might not only be able to communicate ideas we would otherwise bottle up, or decipher when another person is feeling sad; AI could also help simulate worlds beyond the limits of human perception.