DOW – Senses: Project Sense

Sense is a concept design by CD&I Associates, created back in 2009. It aims to give users a more immersive experience while shopping online, watching movies or playing games, among other screen-based activities. Sense would provide haptic, thermal and olfactory sensations in addition to the visual and auditory outputs of the common screen.


It would extend the virtual world even further and blur the boundaries between the real and the virtual. Users can be engaged more deeply when feedback reaches more of their senses.


There would be a limit to the variety of sensations that the device could reproduce, and it is unclear whether it would be effective or useful at all.


This would be very interesting if developed for virtual or augmented reality. There is much to explore in the possibilities of creating a truly real-feeling virtual world. However, the interface would have to become more than just a screen or tablet: how do we replicate sensations and movements?


While on the topic of senses, I came across this book called Designing Across Senses by John Alderman and Christine W. Park, whose first chapter is available on Safari.

The senses are the (only) ways that we have of experiencing the world. Understanding how they work is key to designing new interfaces. They can also be extended by technology, with sensors that can go places, stay alert, and perceive things that we can't. These design methodologies expand upon existing practices and introduce some new ones. Both the human capacity and the device capability for multimodal combinations and activities are nearly limitless.

It opened my eyes to the fact that we experience the world through our senses. The information received from our senses is then processed and meaning is extracted from it to take action and make decisions. Every experience is multimodal: understood/processed through a combination of different senses.


Devices also have modes, usually built around specific types of physical information. Beyond visual interfaces, speech, touch, haptic, gestural and even olfactory interfaces are now commonly found in many devices.

Many different kinds of human behaviour can now be used as input and be processed into various kinds of outputs.

Much of interaction design relies on how each sense works individually and on abilities that emerge from how we integrate them together. Mapping sensory modalities to interface modes correctly can mean the difference between a cohesive experience and a disjointed one.
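As a toy illustration of that mapping idea (my own sketch, not an example from the book), pairing each input modality with a sensible feedback mode can be expressed as a simple lookup, with a visual default for anything unmapped:

```python
# Toy sketch: mapping input modalities to feedback modes.
# The pairings below are illustrative assumptions, not rules from the book.
OUTPUT_FOR_INPUT = {
    "speech":  "audio",    # a voice command gets a spoken confirmation
    "touch":   "haptic",   # a tap gets a vibration
    "gesture": "visual",   # a wave moves an on-screen cursor
}

def respond(input_modality):
    """Pick a feedback mode for a given input modality; default to visual."""
    return OUTPUT_FOR_INPUT.get(input_modality, "visual")

print(respond("touch"))   # haptic
print(respond("smell"))   # visual (no mapping defined, so fall back)
```

A real design would of course combine modes rather than pick one, but even this crude table shows how a mismatched pairing (say, answering a spoken question with only a vibration) would feel disjointed.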

After reading this chapter, I realised how important it is for design to revolve around the human senses, since they are what we experience the world through. By engaging different senses in different ways, there is much to explore in human-device interaction, or even human-human interaction mediated by devices.

DOW – Health: Lechal by Ducere Technologies

Lechal is interactive haptic footwear developed by India-based Ducere Technologies. Shaped like an insole, it can be worn with many different kinds of footwear. It provides GPS navigation through vibratory feedback that guides users invisibly but intuitively.

Through Lechal, Ducere Technologies strives to help the blind or visually impaired navigate the world better. Lechal syncs with the user's phone over Bluetooth, where the destination is set. It then shows the way through gentle vibrations, so the user does not have to rely constantly on visual or audio feedback from their phone. This also helps the visually impaired lessen their dependency on walking sticks.
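To make the idea concrete, here is a minimal sketch of how turn-by-turn directions might be translated into left/right insole vibrations. All names and thresholds are my assumptions, not Ducere's actual implementation:

```python
# Hypothetical sketch: mapping a GPS bearing onto insole vibrations.
# The 30-degree "straight ahead" band is an assumed threshold.

def turn_cue(current_heading_deg, bearing_to_waypoint_deg):
    """Return which insole should vibrate for the next turn."""
    # Signed angular difference, normalised to the range (-180, 180]
    diff = (bearing_to_waypoint_deg - current_heading_deg + 180) % 360 - 180
    if abs(diff) < 30:
        return "both"   # e.g. a short double pulse: keep going straight
    return "right" if diff > 0 else "left"

print(turn_cue(0, 90))    # waypoint due right -> right
print(turn_cue(0, 270))   # waypoint due left  -> left
print(turn_cue(10, 15))   # roughly ahead      -> both
```

Even a mapping this simple shows why the feedback can feel intuitive: the cue arrives on the side of the body that should turn, with no screen or voice involved.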


Unlike map applications, which require users to look at their devices or listen to audible instructions, Lechal directs users hands-free. Instead of having to walk with their sticks, the visually impaired could be guided in a whole different way.

Other than the visually impaired, Lechal also works well for sighted users. Instead of spending their days with heads down, or ears cocked for their phone's instructions, users can look up and enjoy the world around them without being too dependent on their phones.

Lechal also works offline. Without data connectivity, it is still able to give users directions anywhere in the world.

Other than navigation, it also has other functions that track workouts and fitness goals.


I'm not sure what happens if users miss a vibration, or whether haptic cues are as effective an affordance as visual or audio ones.

And beyond being a navigation and health-tracking device, I don't know if it completely eliminates the need for walking sticks for the blind or visually impaired.


Maybe, beyond its main functions, Lechal could be developed with distance, motion and audio sensors that would let it detect its surroundings and direct its users better. That way, the blind or visually impaired could navigate the world without having to rely on sticks. And since it already has GPS, it could also replace map applications for the visually impaired.