IDev | Semester Project: Low Fidelity Prototype

After presenting my revised Semester Project Pitch and receiving feedback and approval, I proceeded to carry out Phase 1: Low-Fidelity Prototyping, due 20th October (Week 10).

Here is the planned production schedule:

0. Milestone for Low-Fidelity Prototype (Arduino interaction + material light test):


Below is the documentation of how I arrived at the completed milestone (above) for Low-Fidelity Prototyping.

1. Purchasing of materials for low-fidelity prototype
I went to SGBotic to purchase the components needed for the first phase of the project. At the same time, I ordered online the clear PLA material needed to 3D print the housing for my finished product.

2. Establishing Master-Slave I2C on Arduino Uno
I thought it would be difficult, or require a ton of coding, to get the Arduinos to communicate with each other, so I started my prototyping with this crucial component first to make sure it was up and running well.

As it was my first time using such a connection, I was slightly confused about how to establish it correctly. Through consultation, I learned that the code portion was minimal; the main thing was knowing which Arduino board (Master or Slave) each part of the code ran on and transmitted to.

Here is the basic Master/Slave I2C code on Arduino Uno:

3. Controlling LED light with 1 switch button on I2C Arduino Uno

After establishing and understanding what goes on in an I2C connection, I attempted a basic setup: 1 switch button controlling 1 LED on the Master + 1 LED on the Slave.

Here is the final outcome of the try-out:

Through this code, I learned even more about what goes into I2C communication. Testing it with a simple circuit first was a good decision: I could fully understand what was going on before piling on more inputs and components.

Here is the code for the above outcome:

4. Controlling LED light with 1 Touch sensor on I2C Arduino Uno
This step is similar to the previous one, but I replaced the switch button with a Touch sensor, as it is an input I intend to use in the final outcome as well. This double-checks that the touch-to-light-up-LED interaction works.

Here is the final outcome of the try-out:

Since the touch sensor input works similarly to the switch button, there were minimal changes to the code.

Here is the code snippet with changes made for the above outcome Master (L) / Slave (R):

5. Controlling LED lights with 1 Touch sensor, 3 switch buttons on I2C Arduino Uno

This step combines everything I tried out above. Although everything worked well individually, I suspected something might go wrong or conflict once the codes were compiled together. I was right, and below is the documentation of how I rectified it.

Problem identified: the Master only sends one "If ___ is activated" state. The connection transmits 1 byte, instead of the 4 different states needed for the 4 possible inputs. The Slave likewise only checks one condition, "If ___ is activated, light up LED". Since nothing differentiates which input maps to which LED, pressing any button activates more than one LED rather than solely the corresponding one.

Here’s the initial (unsuccessful) interaction process:

Here’s the initial (unsuccessful) code for the above outcome:
In the code above on the left, the Master sends 1 byte to the Slave (highlighted). Changing it to 4 lets it read and send the states of all 4 inputs on every loop. In the Slave code on the right, a single "is activated" state currently triggers all the LEDs; instead, each LED should have its own "is activated" state.

After identifying the issue and consultation in class, here is the successful and rectified outcome:

Here is the rectified code:

The changes made above: the Master now sends 4 bytes, and each input state is stored as its own boolean that toggles its corresponding LED on the Master. The same is specified on the Slave.

6. Materials testing
Apart from code, I also explored the materiality of the user-facing device that houses my components. As I will be building the forms with a 3D printer, I tested clear and opaque white PLA to see which diffuses a single LED's light better and more convincingly.

Here’s a quick look at how the 3D printing process is done:

Here is a side-by-side comparison of the visual differences between the 2 materials:

Below is a test using opaque white PLA material:

Note: The variance in luminance is because the video was taken during the day and the image at night.

The opaque white material could produce distinct features of the face, BUT the luminance was minimal and not well diffused. Instead, the light was concentrated, and it comically resembled a medical chart / body-ache-zone model.

Here is a test using clear PLA material:
The clear material could produce very bright luminance with diffused, even light distribution, BUT the features of the model were not distinct.

With both materials, the light concept could essentially be replicated as intended. It is now a matter of preference, and of which pros and cons align better with the concept.

//End of Low-Fidelity Prototyping documentation
