Light Distance Away: Final Presentation [Yr3| InteractiveDev| Final Project]

Project done by: Emma, Wei Lin, Natalie, Wan Hui

Slides Here

Context

In our current pandemic situation, where loved ones cannot easily be with each other physically, our device (and setup) aims to simulate your loved one’s walking movements in your room. It seeks to highlight distant bodies through the absence of the body. 

How the device works

As one walks around the space, their footsteps are translated into a path of light in a distant room. In both rooms, a path generated by a loved one begins to form, and a pair of shoes follows it. Watching the shoes move, one sees the distant movement transported by light into the room one is currently present in. The presence of another body either haunts or comforts us through its absence, represented by the shoes. The physical distance between them is shortened to a lighted path as both distant bodies now move together in their respective rooms.

Light Distance Away device (footwear)

Light Distance Away (footwear)

DEMO


Light Distance Away: Mid-Fidelity Prototype [Yr3| InteractiveDev| Final Project]

This is our progress for our Semester Project, Light Distance Away.

Concept Sketch
Light Distance Away (footwear)

Our goals for mid-fidelity stage 

    • Prototype 
      • Track light and draw lines and curves in Processing
      • Project the line onto the floor 
      • Stable LDR tracking system on motors that follows the projected lines
      • Setting up the area with the Pixy camera, Arduino & projector at the ceiling

Following our low-fidelity prototype, we needed to research how our motor could follow the XY coordinates it received. After some consideration, we thought about using light as our method of moving the motors of the automated footwear.

Luckily for us, we had already spent time drawing a line in Processing, which had initially only been there to ensure our pathway was being tracked. Knowing that light could be a viable option, we borrowed top-down projectors and flipped our code’s background and line colours to display a line of light that drew itself as one walked around under the Pixy camera. 

Thus, we began our prototypes and trials, but not before improving our Pixy camera setup.


Light Distance Away: Low Fidelity Prototype [Yr3| InteractiveDev| Final Project]

Project done by: Emma, Wei Lin, Natalie, Wan Hui

Our semester project is called Light Distance Away. Through our setup and device, it aims to send comfort via the hint of another person’s presence despite their absence. A camera tracks User 1’s movements when they are within the display area, and a white line is projected into User 2’s room based on those movements. User 2 then sees the personless footwear moving along the projected line in real time, ironically creating a rather haunting feeling of an absent presence while comforting the user with the reminder of that presence. The physical distance between them is shortened to a lighted path as both distant bodies now move together in their respective rooms.

We started with 2 separate development paths after researching possible tracking methods in depth: 

    1. Camera to track object 
    2. Motor to follow path

Our goals for low-fidelity stage: 

    • Research
      • How to track a person’s movement 
        • Indoor Positioning System
          • Image tracking
          • Bluetooth
          • Heatmap
        • Color
        • Machine Learning
        • Gyro and accelerometers
        • Ultrasonic sensors
      • Possible software to use
        • Processing + OpenCV 
        • Arduino + Pixycam 
      • What kind of hardware to control the motor
        • DC Motors + Motor Driver Shields
        • Consider dry hacking remote control cars
      • How to send tracked data to the motors?
        • Adafruit.IO
        • Projecting out the tracked data?
    • Prototype 
      • Stable tracking system which collects x and y coordinates, then maps them out in Processing 
      • Dirty prototype car that moves 

Research Process for tracking:

Indoor Positioning System

Researching existing indoor positioning systems led us to understand that the cost of such a system is high, with many high-accuracy models costing from hundreds to thousands of dollars. 

Considering those price points, there was little chance of us using such a system, as we simply could not afford it. 

Systems based on Bluetooth or image tracking involved either Bluetooth beacons or high-quality cameras paired with in-depth machine-learning code. 

Thus we decided to forgo this idea and move on.

Gyro and Accelerometers

Gyroscopes and accelerometers had proven to be rather accurate and suited what we hoped to achieve; yet the code involved complex mathematics that few people online were open to sharing. 

After several attempts at differentiating and integrating the physics equations, we found ourselves at a loss as to how to combine the two into an efficient solution to our problem. 

It was a shame to be unable to use this method effectively, since it had been used before and was said to work well; however, we had limited time and could not spend hours figuring out 1% of a long process towards our goal. 

Given more time, we would definitely aim to attempt this method once more.
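
To make concrete why this defeated us: recovering position from an accelerometer means integrating the readings twice, and any noise or bias gets integrated along with them. A naive sketch of the idea on an Arduino (readAccelX() is a hypothetical stand-in for a real IMU driver, not a library call):

```cpp
// Naive dead reckoning along one axis: acceleration -> velocity -> position.
// Any noise or bias in ax is integrated twice, so px drifts within seconds.
float vx = 0;            // estimated velocity (m/s)
float px = 0;            // estimated position (m)
unsigned long lastMs;

float readAccelX() {
  return 0.0;  // hypothetical: return acceleration in m/s^2 from an IMU library
}

void setup() {
  Serial.begin(9600);
  lastMs = millis();
}

void loop() {
  unsigned long now = millis();
  float dt = (now - lastMs) / 1000.0;  // seconds since the last sample
  lastMs = now;

  float ax = readAccelX();
  vx += ax * dt;   // first integration: velocity
  px += vx * dt;   // second integration: position
  Serial.println(px);
}
```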

Ultrasonic Sensors

Ultrasonic sensors initially seemed like a good idea; yet we soon realised the system had faults, such as needing a completely unobstructed path between the sensor and the tracked object, and requiring the object to stay at the same level as the sensor at all times. 

In our case, where people would be lifting their feet off the ground as they walked, this system would have a high chance of failure. Hence, we decided against this method.

Colour Tracking

Finally, we thought about colour tracking, realising that instead of putting a sensor on a person or using deep machine learning, we could use a simple colour-tracking program to follow a colour we had tagged to a user. 

One thing that immediately came to mind was the Pixy camera, which comes with a colour tracking system by default. As we considered this method further, we started to search for multiple ways to do colour tracking, with Pixy as our safety net. 

Deeming this the most suitable and efficient method, we started our prototypes.

Research Process for motor:

DC Motors + Motor Driver Shields

This was the more direct method, because we could change the rotation of the motors easily in code.

 

Dry hacking remote control cars

This method was slightly more complicated because we needed to dismantle the toy cars to figure out how they worked internally. The challenging part is that different toy cars have slightly different internal components, so to keep things consistent we would have needed 2 sets of identical cars, which we did not have access to.

Prototype Trials:

Camera Tracking

Pixycam with Arduino 

The Pixycam is attached to an Arduino Uno board. Data is sent to the Arduino over the serial connection, while the Pixycam’s settings are controlled in its own software, Pixymon. Colour tracking was rather jittery in the beginning, but we tweaked the settings in Pixymon to make colour detection as stable as possible. Some of the changes we made to the settings included: 

  • Changing camera brightness to accommodate our dark room. If camera brightness is not increased, whatever colour we detect will be too dark and risk blending into the surroundings 
  • Changing max blocks and max blocks per signature to 1, so that only one light is detected at a time
  • Changing min block area to 1 so that we do not need a huge light source to be able to track users 

While its colour tracking was rather good, we realised the camera quality was pretty bad, with a resolution of only 319 × 199 pixels. As such, we decided to test out another camera and use Processing along with it. 

Open Source Computer Vision Library (OpenCV) in Processing

We used a Logitech C930e webcam in Processing to track a specific colour, coding it so that when the specified colour is detected in frame, it forms a rectangular area (blob), and only the largest area of that colour is registered. Compared to the Pixycam, the webcam quality is much clearer, and colour detection is more stable and less jittery. However, we realised that better image quality might not be necessary for our project. What we really needed to know was which camera could track better from a further distance, as our camera would eventually be mounted overhead and thus had to work some distance away. 

As such, we decided to test the two side by side. We found out that despite having blurry camera quality compared to the webcam, the Pixycam’s colour tracking was unaffected: it was actually able to track the colour from a further distance than our Processing setup. Additionally, setting the colour we wished to track was much more convenient on the Pixycam. Pixymon lets users quickly set a colour signature by taking a screenshot of what is on the camera screen, then dragging to select the area they wish to detect. Conversely, in Processing we would need to manually enter the (R, G, B) values of the colour we wanted to detect, which made it much harder to accurately track our intended colour unless we input the correct (or very similar) RGB values. As such, stray values of the same colour could end up being detected in Processing but not in Pixymon, which was something we were trying to avoid. 

Lastly, in our Processing code we were able to get the size of the area of the colour detected, but had trouble coding the extraction of the centre coordinates of the detected area. With the Pixycam, it was very straightforward to extract the x and y coordinates and send that info to Processing. 
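
For illustration, here is a much-simplified stand-in for that Processing side (using the built-in video library rather than the OpenCV library we tried, and a bounding box over every matching pixel rather than proper blob detection); the target colour and threshold are illustrative values. It shows how centre coordinates fall out of a bounding box:

```java
import processing.video.*;

Capture cam;
color target = color(255, 0, 0);  // the tagged colour; set to your marker's RGB
float threshold = 60;             // how far a pixel may stray from the target

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);

  // Bounding box of every pixel close enough to the target colour.
  int minX = width, minY = height, maxX = -1, maxY = -1;
  cam.loadPixels();
  for (int y = 0; y < cam.height; y++) {
    for (int x = 0; x < cam.width; x++) {
      color c = cam.pixels[y * cam.width + x];
      if (dist(red(c), green(c), blue(c),
               red(target), green(target), blue(target)) < threshold) {
        minX = min(minX, x); maxX = max(maxX, x);
        minY = min(minY, y); maxY = max(maxY, y);
      }
    }
  }

  if (maxX >= 0) {  // something matched
    noFill();
    stroke(0, 255, 0);
    rect(minX, minY, maxX - minX, maxY - minY);
    // The centre is simply the midpoint of the bounding box:
    ellipse((minX + maxX) / 2.0, (minY + maxY) / 2.0, 8, 8);
  }
}
```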

 

Camera Conclusion (Lo Fidelity)

Realising that the Pixycam was much more stable and convenient for what we needed, we decided to stick with it, and continued on with connecting it to the Arduino. 

To begin, to track one’s position we would need the user’s x and y values. Thus, the code was written to send values from the Pixycam to Processing via the Arduino’s serial port, roughly as sketched below.
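
A minimal sketch of this Arduino side, assuming the standard Pixy (CMUcam5) Arduino library over SPI and a simple “x,y” line per reading (our exact output format may have differed):

```cpp
#include <SPI.h>
#include <Pixy.h>

// Pixy (CMUcam5) connected over SPI; prints the first detected block's
// centre as an "x,y" line that Processing can read from the serial port.
Pixy pixy;

void setup() {
  Serial.begin(9600);
  pixy.init();
}

void loop() {
  uint16_t blocks = pixy.getBlocks();  // number of colour blocks detected
  if (blocks > 0) {
    // With max blocks set to 1 in Pixymon, blocks[0] is our tracked light.
    Serial.print(pixy.blocks[0].x);    // 0-319
    Serial.print(',');
    Serial.println(pixy.blocks[0].y);  // 0-199
  }
  delay(50);  // ~20 updates per second is plenty for a walking pace
}
```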

 

Process of Arduino to Processing for Camera

  • The Arduino receives X and Y coordinates from the Pixy
  • It sends the X and Y coordinates to Processing 
  • Using curveVertex(), a vector line is drawn

While we initially started with a straight line, we then tried changing the line into a smooth curve using curveVertex(). Following this, we tried to make the lines as smooth and clean as possible. This was done by ensuring that new points were only added to the curve when they were a good distance away from the last, preventing an accumulation of points, and by removing older points so the screen would not become cluttered. 
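
A minimal Processing sketch of this receive-and-draw loop (the serial port index, the 10-pixel spacing threshold and the 100-point cap are illustrative values, not necessarily our exact ones):

```java
import processing.serial.*;

Serial port;
ArrayList<PVector> path = new ArrayList<PVector>();

void setup() {
  size(640, 400);
  // Assumes the Arduino is the first listed serial device; adjust as needed.
  port = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  // Read any complete "x,y" lines sent from the Arduino.
  while (port.available() > 0) {
    String line = port.readStringUntil('\n');
    if (line == null) break;
    String[] xy = split(trim(line), ',');
    if (xy.length != 2) continue;
    // Map Pixy's 0-319 / 0-199 range onto the window.
    PVector pt = new PVector(map(float(xy[0]), 0, 319, 0, width),
                             map(float(xy[1]), 0, 199, 0, height));
    // Only add points a good distance from the last, so the curve stays smooth.
    if (path.isEmpty() || PVector.dist(pt, path.get(path.size() - 1)) > 10) {
      path.add(pt);
      if (path.size() > 100) path.remove(0);  // clean out older points
    }
  }

  background(0);   // for projection we flipped to a dark background...
  stroke(255);     // ...with a bright line
  noFill();
  beginShape();
  for (PVector p : path) curveVertex(p.x, p.y);
  endShape();
}
```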

Motor

Conclusion on hardware for moving the shoes: We did try to get cheap RC cars online, but delivery took some time, and when the cars arrived we still had to spend time figuring out their hardware. We ended up using the DC motors because we already had a few available, and we could start on the Arduino programming right away.

DEMO: Motors moving in opposite directions & Motor with wheels

With the DC motors we had on hand, we first tested them to make sure they worked. For this, we wrote a simple sketch that controlled both motors via keypresses (using the WASD layout). For the shoes to turn left, we made the left motor stop and the right motor rotate clockwise, and vice versa for the shoes to turn right (i.e. right motor stopped, left motor rotating clockwise).
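
A sketch of that test, assuming an Adafruit Motor Shield (v1) with the AFMotor library and keypresses arriving as single characters over serial; the shield ports and speed are illustrative:

```cpp
#include <AFMotor.h>  // Adafruit Motor Shield v1 library

AF_DCMotor motorL(1);  // left motor on shield port M1 (illustrative)
AF_DCMotor motorR(2);  // right motor on shield port M2 (illustrative)

void setup() {
  Serial.begin(9600);
  motorL.setSpeed(200);  // 0-255
  motorR.setSpeed(200);
}

void loop() {
  if (Serial.available() > 0) {
    switch (Serial.read()) {
      case 'w': motorL.run(FORWARD); motorR.run(FORWARD); break;  // straight ahead
      case 'a': motorL.run(RELEASE); motorR.run(FORWARD); break;  // stop left -> turn left
      case 'd': motorL.run(FORWARD); motorR.run(RELEASE); break;  // stop right -> turn right
      case 's': motorL.run(RELEASE); motorR.run(RELEASE); break;  // stop both
    }
  }
}
```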

 

Links to Documentation

Project Proposal

Low-Fidelity Prototype

Mid-High Fidelity Prototype

Final Presentation

 

Research references:

Interactive Devices

Indoor Positioning System

LEDRoom ColorEmo + Rainbow Switcher [Yr3| InteractiveDev-Sketch 2]

Brief

Creating an interactive work that involves communication between ZigSim on mobile, and Arduino and Processing on the desktop: gestures made with the phone are translated into the room lighting. This is an individual work by Emma 🙂
Developing from the previous sketches (link below), I have created two different sets of programs to show ways of changing the LED strip colour.

https://oss.adm.ntu.edu.sg/ycheuk001/ledroom-yr3-interactivedev-sketch-2-Process/

1. ColorEmo

      Concept:

      How do you feel now? Are you feeling happy? Draw a happy smile to brighten up your room in RED light! Most of the time, people tend to hide their feelings. Instead of showing your sad face, you could draw a sad face (a curved line facing downwards) to represent “I am feeling blue”, with blue lighting. To express surprise, you could draw an O to light up the room in yellow!

      Instructions:

      1. Draw a smile to change to red – happy 😊
      2. Draw a sad face to change to blue – I am feeling blue 🙁
      3. Draw an O to represent surprised – 😮
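
On the output side, a minimal sketch of how the Arduino could set the strip colour, assuming a NeoPixel-style strip on pin 6 and Processing sending a single byte per recognised gesture ('R', 'B' or 'Y') — this wiring and protocol are illustrative, not my exact setup:

```cpp
#include <Adafruit_NeoPixel.h>

#define LED_PIN    6   // data pin for the strip (illustrative)
#define LED_COUNT  60  // number of LEDs on the strip (illustrative)

Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);

// Paint the whole strip in one colour.
void fillStrip(uint32_t c) {
  for (int i = 0; i < LED_COUNT; i++) {
    strip.setPixelColor(i, c);
  }
  strip.show();
}

void setup() {
  Serial.begin(9600);
  strip.begin();
  strip.show();  // start with all pixels off
}

void loop() {
  if (Serial.available() > 0) {
    switch (Serial.read()) {
      case 'R': fillStrip(strip.Color(255, 0, 0));   break;  // happy -> red
      case 'B': fillStrip(strip.Color(0, 0, 255));   break;  // sad -> blue
      case 'Y': fillStrip(strip.Color(255, 255, 0)); break;  // surprised -> yellow
    }
  }
}
```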


Distancing Chairs [Yr3| InteractiveDev-Sketch 1]

Project Brief

  • Make a low-tech device that enables / enforces / suggests / alerts / …about social distancing.
  • Analog – it achieves distancing by construction or by the way it is operated.

Research & Ideation

Previous research / brainstorming / ideation: (wearable devices)

https://oss.adm.ntu.edu.sg/ycheuk001/social-distancing-ideation/

Developing something other than the wearable devices

Due to the Covid-19 situation, people are required to maintain a safe social distance of at least 1 metre. Moving into post-circuit-breaker phase 2, most activities have resumed, yet group-size and capacity limits remain in place. People are not allowed to eat out at restaurants in large groups (capacity limits apply), to prevent the spread of saliva.

Yet, in real life, most people will just ignore safety measures. To practise social distancing in a fun way, I am suggesting the game of “Distancing Chairs”. 

The Distancing Chairs Demo video

Distancing Chairs – Play it everywhere/anywhere 

RECAP OF THE CONCEPT

A rope is tied to two chairs placed on opposite sides of a table. 

When a person pulls a chair back to sit on it, the rope pulls the chair on the other side towards the table, leaving no space for another person to sit on that chair.

This suggests the idea of “One Table, One Seat”: no more than 1 person is allowed to sit at a table.  

(inspired by the Musical Chairs Game)


Arduino Interactive Device: Flick to Win [ Yr2 | IM II ]

FLICK TO WIN: SHOWCASE

Collaboration with Chai Mei Shan, Emma Cheuk, How Yee Teng, Loh Wan Hui

FLICK TO WIN

Flick to Win is an interactive device developed with Arduino software and the following hardware:

– Wood panels
– Acrylic (2mm & 3mm thickness)
– 25x 10mm LEDs
– 25x Toggle Switches
– 3x Servo Motors
– Arduino Mega

IDEA (Assigned Theme: Obsession)

We want to show people’s obsession with winning when there is an incentive to win (in this case, the money). However, our machine is wired in a way that makes it hard to win, which spurs people to keep playing and emphasises the “obsession” part.


The Freebie. a free kiss machine [Y1: Experimental Interaction Final Project]

💋 a collaboration with 

The Freebie is a provocative object that aims to raise awareness about free stuff. Everyone loves free stuff; however, without realising it, people often pay a price to receive things that are not technically free. As the word “free” is the best way to capture people’s attention, we decided to use it in our product.

“Nothing in life is free”

“kiss is”

This product gives away unlimited free kisses, but the people receiving these kisses have to pay by embarrassing themselves in public and sharing their kiss experience with others. 

Concept: A kissing sticker from WeChat, Copyright: suano

The Freebie Video


Disobedient Objects [Y1: EI. Micro-Project 4]

SHAKING UP: Trailer 

Initial Sketches

Introduction

Inspiration

Final design

Documentation

 

Trying On

Self Reflection

How does your hacked object behave in a way you least expect it to?

What are some reactions you observed from your participants when they interacted with the object?

What are the challenges involved and how did you overcome them? What problems still exist? How might you overcome them eventually?

 

EMMA

  1. Expectation

Our hacked object, the “vibrating bracelet”, works exactly as I expected it to: it vibrates after a certain period of time. At first, my partner Hamimah and I considered adding a time display, just like this. After discussing, we decided not to, because it would distract the audience. If we had added the time display, our product would look like a normal watch with a vibrating function. Therefore, we stuck with the idea of just having a vibration motor.

During class, my classmates expanded the discussion further: should it have a watch face or not? Somebody said yes, so that wearers can know the time. I personally think that not having the watch face makes the product work better: if I knew the time, I would go back to napping for a while. The strong vibration is enough to “shake” me up.

  2. Reactions

Our 1st participant was confused about how to interact with our object. She tried it on and noticed the object vibrating. The 2nd and 3rd participants observed the 1st participant and got the idea of our object’s function (it vibrates, then stops; vibrates, then stops). Overall, participants didn’t get the meaning of our object after experiencing it. Nevertheless, after we explained and presented our object, people gave positive feedback, saying that it really works as a “Shaking Up” object. People became even more interested in our object when they were told the strength of vibration can be adjusted.

  3. Challenges

When we first came up with this idea, we were very lost, as we hadn’t yet learnt how to use a motor. We struggled a lot: with which electronic and electrical parts we needed, with how to write the code, and with how to set up the Arduino with the breadboard. We also had to choose a material for attaching the vibrating motor: a bracelet, an elastic band, etc. We chose a watch strap, as it is lightweight and comfortable to wear, perfect for sleeping. We thought of placing the vibrating motor on top of the bracelet, but instead placed it at the back so the vibration felt stronger to the user.

The questions for consulting:

  1. How to connect the timer with the vibration motor?
  2. What is const int motorPin? Why motorPin = 3? (we were confused)
  3. What does it mean for the speed of motor to be 0 to 255?
  4. How do we control the strength of the vibration?

The problems were mostly solved during the consultation. I am pleased we had a slot for consulting our lecturer. The ‘sewing department’ helped us lengthen the wire of the vibrating motor, and Lei helped us find a sample showing how to place the electronics. We learnt that a 1 Ω resistor isn’t the same as a 5 Ω one, and how to control the strength of the vibrating motor (values 0–255; the larger the value, the stronger the vibration). We also learnt that Arduino pin 3 (a PWM pin) can be used to drive motors.

An hour before the Trying On lesson began, we still couldn’t solve the coding problem. We wanted it to vibrate a few times and then stop, just like how an alarm clock functions. We thought of writing a countdown code, but it was too hard for us to construct workable code, so we ended up using the code sample we had originally researched (changing the delay time). We copied and pasted some code, but there were errors. Finally, we solved them by double-checking the connection to the right port and decreasing the number of times we wanted it to vibrate (from 5 times to 3 times).
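
A minimal sketch of that final behaviour: wait, then pulse the motor three times and stop. It assumes the vibration motor is driven through a transistor on PWM pin 3; the delays are illustrative stand-ins for our actual timings:

```cpp
const int motorPin = 3;  // PWM pin driving the vibration motor (via a transistor)

void setup() {
  pinMode(motorPin, OUTPUT);
  delay(10000);  // stand-in for the nap timer; ours used a longer delay
  for (int i = 0; i < 3; i++) {
    analogWrite(motorPin, 200);  // 0-255: the larger the value, the stronger the vibration
    delay(1000);
    analogWrite(motorPin, 0);    // stop between pulses
    delay(500);
  }
}

void loop() {
  // nothing to do once the alarm has fired
}
```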

  4. Reflection

The experience could be improved by including a sensor or more output devices, for example an LED light, a pressure sensor, or a buzzer. These could create a more ‘interactive’ experience for the participants. I would say we both learnt a lot from this micro-project: about collaborating with a partner, constructing an interactive product, and knowing the materials and electronics needed to create a working Arduino object. I hope one day I can actually write the code and put the electronics in the right place without copying code and following a sample.  

 

HAMI

Both of us understood the brief differently, and because of that there wasn’t any unexpected behaviour: the whole layout was set up to achieve the expected results. Since our final outcome was a little different from the rest, our first participant was extremely clueless about how to interact with our object. She kept asking questions about how to approach it, and one of us had to step in to help her. The other two participants were quite alright, since they had picked up the basics from watching the first. The overall feedback was that everyone was at a loss with our object; however, after our explanation, everyone seemed to be aware of what we were trying to achieve. There was also a moment of chaos when they learnt they could control the strength of the vibration. Overall, a few of the class thought this vibrating bracelet was a cool idea; however, a handful thought it was a useless idea, which is quite sad.

The whole process was a challenge for me and Emma, since both of us were completely lost throughout the making process. We were lucky to get a slot for the consultation, which helped a lot. Like, seriously, a lot. The challenges we faced were:

  1. Controlling the strength of the vibrating motor
  2. The parts we needed for the vibrating motor to work
  3. Getting the right code for the alarm

We overcame the first and second challenges with Lei’s help; discovering how to control the motor’s strength cleared up most of the confusion. Sadly, we still haven’t solved the alarm code. Our lack of understanding of the code (while / for loops) made it impossible for the alarm to work, so we decided to play around with the timer code instead (the easier way to deal with time) by creating a longer delay. The remaining problem is perhaps that we are making an alarm object without using alarm code. I guess the only way to overcome this is to learn how to make interactions that deal with time.

Comparing Micro-Project 4 with 1, 2 and 3, Micro-Project 4 was way beyond difficult, but I think it’s mostly our fault for being quite ambitious and trying something that hadn’t been taught before. Perhaps we should have tried dealing with an LED light and a sound sensor instead. The vibration motor was quite a challenge to deal with: firstly because of the different parts we had never heard of before, and secondly because it deals with time. We learnt a lot of things in this project, but the main takeaway is that interactions that deal with time are… let’s just say difficult is an understatement. To this day we are still confused about the whole code.