Week 11 – The car as a driver

This week’s focus revolved around two sections:

  1. How the car signals its intentions to pedestrians and other cars.
  2. How the car signals its intentions to its passengers.

For section 1, Teague Labs has done research on this very issue; however, their project focuses on a more distant future in which traffic lights have been phased out. This requires the car to perform the job of a traffic light when it interacts with pedestrians at intersections.

https://medium.com/teague-labs/crossing-the-road-in-the-world-of-autonomous-cars-e14827bfa301

My project focuses more on the transitional period, when autonomous cars still share the road with conventionally driven vehicles and existing infrastructure. So I went around the Bugis area to observe traffic. I went on an overcast day, and the reflections on the cars’ windshields obscured my view of the drivers. In a way it almost felt like the future was here: I could only see the car, not the driver. Pedestrians who crossed the street or jaywalked were able to sense the cars in their path, and vice versa. After years of conditioning, we can discern a driver’s intentions just from how much the car is accelerating or decelerating. This led me to believe that the current signaling systems are sufficient to convey intentions for both parties.

Echoing this sentiment, Volvo also intends for its autonomous cars to look normal on the outside, to prevent them from being treated differently.

https://www.theguardian.com/technology/2016/oct/30/volvo-self-driving-car-autonomous

Sketching and thinking while on the street, so as not to look suspicious in public.

There was, however, one situation that might warrant extra help to signal a car’s intentions. For example, when the car spots a pedestrian on a street without traffic lights, it could stop and use extra signaling lights to tell the pedestrian to cross. However, a Reddit user I found while researching discouraged such behavior, saying that it puts the pedestrian in harm’s way: cars in the opposite lane do not know of our intention to let the pedestrian cross, which might result in the pedestrian being run over in the other lane.

For this transitional period, the current signaling systems are enough. The only adjustment needed concerns the car’s quiet electric motor: extra sounds are needed to make pedestrians aware of its presence.
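The idea above can be sketched as a simple rule: the car emits a presence sound when moving slowly (where an engine would normally be audible) or when pedestrians are detected nearby. This is a minimal illustration only; the 20 km/h threshold and the function and parameter names are my own assumptions, not from any real alerting standard or vehicle API.

```python
def should_emit_alert_sound(speed_kmh: float, pedestrians_nearby: bool) -> bool:
    """Decide whether a quiet electric car should play an audible presence sound.

    At low speeds there is little tyre and wind noise, so the car plays a
    sound whenever it is moving slowly, or whenever pedestrians are detected
    nearby regardless of speed. Threshold is illustrative, not regulatory.
    """
    LOW_SPEED_THRESHOLD_KMH = 20.0
    if speed_kmh <= 0:
        return False  # stationary: no alert needed
    return speed_kmh < LOW_SPEED_THRESHOLD_KMH or pedestrians_nearby
```

In practice such rules are governed by regulation rather than left to the designer, but the sketch shows how little extra signaling this transitional period actually demands.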

For section 2, I made a mockup of the car’s windscreen to let the car signal its intentions to its passengers.

Sketch of information categories for windscreen mockup, symbolized by shapes to facilitate communication between passenger and car.
A rough mockup of the car windscreen UI. Original screenshot from the video Driving from Changi Airport Singapore to NUS along ECP, MCE and AYE by Nway Oo Ko on YouTube.

After a while I realized that this UI is meant to be viewed by multiple passengers at once, so it made no sense to overlay UI elements on the real world: parallax means only one viewer would see the elements aligned with the correct area. This gave me the idea of a non-specific UI that highlights only a general area, such as left, middle, or right.
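The non-specific approach can be sketched as a mapping from a detected object's horizontal bearing to one of three coarse windscreen zones, so no passenger-specific alignment is needed. The ±15° band and the function name are illustrative assumptions of mine, not values from the mockup.

```python
def coarse_zone(bearing_deg: float) -> str:
    """Map an object's horizontal bearing to a coarse windscreen zone.

    bearing_deg: 0 = straight ahead, negative = to the left,
    positive = to the right. Because every passenger views the
    windscreen from a different angle, the UI highlights only a
    broad region rather than overlaying the exact object.
    The 15-degree band is an illustrative choice.
    """
    if bearing_deg < -15:
        return "left"
    if bearing_deg > 15:
        return "right"
    return "middle"
```

A merging bus detected slightly to the right, for example, would simply light up the right region of the windscreen, and every seat sees the same thing.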

Rough mockup of a non-specific UI. Original image from the video Driving from Changi Airport Singapore to NUS along ECP, MCE and AYE by Nway Oo Ko on YouTube.

The orange glow at the bottom lets passengers know the car is watching this bus as it merges into the car’s lane. However, flashing this is unnecessary for the passenger, who can already notice the bus if they are looking at the windscreen. And if they aren’t looking at the windscreen, sudden flashes of color may cause unnecessary anxiety rather than reassure them.

Even so, if the UI is persistent, it will become annoying over a long period of time. So I did some more research and found that passengers are mainly worried at the initial stage, before they even get into the car. This article mentions users becoming more comfortable with the car after experiencing it driving itself.

This led me to focus more on what happens before the trip. One possible way is to establish a relationship with the car by chatting with it while it comes to you. This is currently done in the Grab app and can be carried forward into the driverless age. One distinction could be to include a livestream of the car as it makes its way to you.

Chat mockup as car arrives

Closer view of the livestream. An ad appears on top of an HDB block, a possible space for AR advertising.

This allows the passenger-to-be to see what the car is sensing and be more assured.

Next week I’ll be focusing on how the car can help you discover locations.
