This week’s focus was on the dashboard, given its previous role as the driver’s control center. In an autonomous car, the driver’s role can still exist, and the dashboard can also reflect the way the car sees the world.
In this video by Sebastian Thrun and Chris Urmson, Google’s autonomous car uses LIDAR to scan its environment in lines. This reminded me of the contour lines used in maps.
This inspired the design of the dashboard, for which I made a mockup.
The lines give a textured feel when touched and also act as a unifying element that can blend interface elements together.
Another idea was for the car to adjust its stance to signal its intentions to pedestrians, akin to how an animal or person changes its posture before committing to an action. It might not work in practice, but it was a starting point for how the car can interact with pedestrians and other drivers. This aspect of the FYP will need more work.
A peer’s suggestion this week was to also explore how the car can let you discover places. This would let the car function as a chauffeur while also making the passenger more aware of what lies around the usual route.
Another suggestion was to look at different modes of transport to see how passengers behave in those situations, so that different types of information can be prioritized to suit the passenger in the moment.