Third-I: Final Post

Updates and process

After the last update, I bought the materials I needed. A few days later, I got the opportunity to work with Galina to create the headpiece, which I am really thankful for: it looked good, and I wouldn't have been able to make it without her. Thank you!

Quick prototyping with cloth

Moving to Neoprene
Final result!

Testing:

Adding eye tracking:

The eye tracking works, but it is unstable and needs to be constantly pressed down by my hand.

After that, I kind of left the project as it was, since I had to work on other projects. The lack of time really killed me here, as I had over-promised on the device. I still believe the features can be done, just not within the given timeframe.

There weren't any new updates to the device as, conceptually, it is already where I want it to be.

The Code

There are three sketches:

1. Eye: The eye is an ESP8266 Wi-Fi module connected to two 3.6 V lithium-ion batteries. This gives a total of 7.2 V to power the servo motor, the two IR sensors, and the board. I suspect the voltage was still too low (or there was some other complication), as it wasn't working very well.

Auto detection mode: the eye scans for the nearest object within its range and tracks it

Auto tracking mode basically says: if one IR sensor detects a shorter distance, the servo turns in that direction.
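The decision above could be sketched as a small function (a minimal sketch; the function and parameter names are mine, not from the project's code): nudge the servo angle toward whichever IR sensor reports the shorter distance.

```cpp
#include <algorithm>

// Step the servo angle toward the IR sensor reading the closer object.
// Names and the step size are assumptions for illustration.
int trackStep(int angle, int leftDist, int rightDist, int step = 2) {
    if (leftDist < rightDist)      angle -= step;  // object closer on the left
    else if (rightDist < leftDist) angle += step;  // object closer on the right
    // equal readings: hold position
    return std::max(0, std::min(180, angle));      // keep within servo range
}
```

On the Arduino, the loop would read both sensors, call something like this, and write the result to the servo each cycle.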

Sweep mode: motor goes left, then right, and repeat

Sweep mode basically loops the servo left and right.
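The sweep could look something like this (a sketch, with hypothetical names): step the angle back and forth between 0 and 180 degrees, flipping direction at each end.

```cpp
// Advance a sweeping servo by one step, reversing at the limits.
// On the Arduino, loop() would call this and then write `angle` to the servo.
void sweepStep(int &angle, int &direction, int step = 1) {
    angle += direction * step;
    if (angle >= 180) { angle = 180; direction = -1; }
    else if (angle <= 0) { angle = 0; direction = 1; }
}
```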

Eye tracking mode: the wearer’s eye controls the movement of the eye.

Eye tracking mode (code incomplete) basically reacts to the value sent when the participant looks left or right.
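Since the eye-tracking code was left incomplete, the intended mapping is only a guess, but it might look like this (all names and angles here are hypothetical): convert the left/centre/right value sent by the eye tracker into a target servo angle.

```cpp
// Hypothetical gaze values as sent by the eye tracker.
enum Gaze { LOOK_LEFT, LOOK_CENTRE, LOOK_RIGHT };

// Map a gaze direction to a servo angle; the specific angles are assumptions.
int gazeToAngle(Gaze g) {
    switch (g) {
        case LOOK_LEFT:  return 45;   // assumed left-of-centre position
        case LOOK_RIGHT: return 135;  // assumed right-of-centre position
        default:         return 90;   // face forward
    }
}
```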

2. Head (Wi-Fi): The Wi-Fi module on the head receives information from the eye and sends it to the main Arduino via Wire (I2C communication).

These lines of code basically send the state data to the eye over Wi-Fi, using Adafruit.io (this does not work yet due to lack of time).

This is for receiving the info from the eye, which is transmitted to the main Arduino via Wire.

3. Head (main): The head contains the eye tracker, vibration modules, and buzzer. The code is simple: if it receives a certain value, it buzzes a specific vibration module and rings a specific tune. Unfortunately, for reasons I couldn't work out, all the vibration modules run at the same time, and I had no time to troubleshoot.
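One possible shape for a fix to the all-motors-at-once bug (the 4-motor count and the helper name are my assumptions, not the project's): compute the on/off state of every motor from the received value, so everything except the one matching motor is explicitly switched off.

```cpp
#include <array>

// Return the on/off state of each vibration motor for a received value.
// Only the matching motor is on; all others are forced off, so a stale
// HIGH pin can't keep another motor running.
std::array<bool, 4> motorStates(int received) {
    std::array<bool, 4> on{};   // all motors off by default
    if (received >= 0 && received < 4)
        on[received] = true;    // enable only the matching motor
    return on;
}
```

On the Arduino, each loop iteration would then write all four states with `digitalWrite`, driving the unselected pins LOW.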

Final Device

The device, as it stands now, is only a prototype. It is sad that I couldn't get it working in time. Still, I am proud that I got the eye working, which is really cool, as it genuinely works wirelessly.

Further Improvements

  1. I will continue working on this project over the holidays to finish it, so I can be proud to own this device and have it in my portfolio.
  2. I would 3D print the eye part so it looks more polished and I can hide all the wires.
  3. Fix the eye, as it keeps turning left for no reason (this has been happening since the earliest stage of testing).
  4. Fix the vibration and incorporate the buzzer, as I haven't really tried using it.
  5. Get the eye tracking up and running, as that should honestly be the main part of this device, next to the movable third eye.
  6. Hide all the wires and make this thing look cool.
  7. If the automatic eye movement still doesn't work, I'll just stick to eye tracking, as it seems to be the mode that makes the most sense for this concept for now.

Lessons and Reflections

I really over-promised everything, and that's bad. I'm sorry to disappoint everyone with my failure. The important lesson is to be less optimistic when it comes to coding. Much of my ambition comes from the optimism that things will all go according to plan; mostly, they don't. Also, the hardest part of coding is making everything run together smoothly. When there are a lot of parts, the system becomes very complex and hard to handle.

Despite this, I really learnt a lot, including many aspects of coding that made my project work. I will try to bring this knowledge forward next time. After this, I'm keen to learn other ways of coding that aren't as clunky as what I used in this project.

I also learnt that we are designers, not engineers, so my project should focus more on the experience than the tech. I was so focused on the tech and on trying to combine everything that I became overwhelmed. I should really take things one at a time.


Updates:

So I simplified everything, and it is all connected by wires now. The flaw is that there is now an ugly hose connecting everything together. Still, I'm proud of this!!! This is still just a prototype, and I'm sure there's more to improve in this project. I hope I can continue it somehow through other modules or my FYP, as I'm thinking of doing something with a similar theme!!!

Switch

Jeff taking a look!

Below are videos of it working:

Auto tracking:

Eye tracking:

Pitch Proposal : THIRD-I

https://oss.adm.ntu.edu.sg/a150133/category/17s1-dm3005-tut-g01/

https://unnecessaryinventions.com


Updates:

Design:

Tests:

The Interaction

  • Tracking mode
    • IR sensors will detect differences in distance
    • The motor will turn to track the object, prioritising whatever is closest in its field of view
    • The motor angle and IR distance info will be sent wirelessly to Adafruit.io
    • This info will be received by the cowl, which vibrates the corresponding vibration motor to indicate the sensor's location
    • * A pitch will be heard on either side of the ear to indicate proximity
  • * Sweep Mode
    • IR sensors will scan the surroundings like a radar
    • The motor will turn 180 degrees to and fro
    • The motor angle and IR distance info will be sent wirelessly to Adafruit.io
    • This info will be received by the cowl, which vibrates the corresponding vibration motor to indicate the sensor's location
    • * A pitch will be heard on either side of the ear to indicate proximity
  • Eye movement Mode
    • The motor will turn in the direction of the eye, stopping when the eye is looking forward
    • The IR sensor detects proximity
    • The motor angle and IR distance info will be sent wirelessly to Adafruit.io
    • This info will be received by the cowl, which vibrates the corresponding vibration motor to indicate the sensor's location
    • * A pitch will be heard on either side of the ear to indicate proximity

Communication flow (1 cycle)

  • The cowl's Arduino sends the mode info to its ESP8266
  • The cowl's ESP8266 sends the mode info to the eye's ESP8266
  • The eye's ESP8266 subscribes to the mode info
  • The eye's ESP8266 sends the info to its slave Lilypad, which acts accordingly
  • The Lilypad reads the motor and IR distance info and sends it to the eye's ESP8266
  • The eye's ESP8266 publishes the motor and IR distance info wirelessly
  • The cowl's ESP8266 reads the info and sends it to its slave Arduino
  • The Arduino drives its vibration motor and pitch accordingly
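The post doesn't say how the motor-angle and IR-distance readings travelling back to the cowl were formatted, so this encoding is purely an assumption: one byte each, combined into a single 16-bit value that could be published to Adafruit.io on the eye side and unpacked on the cowl side.

```cpp
#include <cstdint>
#include <utility>

// Pack a servo angle (0-180) and an IR distance in cm (0-255) into one value.
// The byte layout is a hypothetical example, not the project's actual format.
uint16_t packReport(uint8_t angle, uint8_t distanceCm) {
    return (uint16_t)((angle << 8) | distanceCm);
}

// Recover {angle, distance} on the receiving side.
std::pair<uint8_t, uint8_t> unpackReport(uint16_t report) {
    return { (uint8_t)(report >> 8), (uint8_t)(report & 0xFF) };
}
```

Packing both readings into one message keeps the angle and distance in sync, which matters when each hop in the chain above adds its own delay.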

What’s next?

  1. Receive the ezbuy parts so I can continue
  2. Settle the ESP8266 communication
  3. Settle the master-slave communication
  4. Get the necessary materials for a new, bigger eye
  5. BUILD THEM! And simplify the difficult parts