For my Interactive Devices project, I set out to make a little buddy that turns to face people when they come near.

Initially, the idea was a fibre-optic tree, so the visuals would be interesting when it spun around.

But I realized I was essentially making a turret, so I looked into the various forms it could take.

Form aside, I also weighed IR sensors against ultrasonic sensors, and settled on ultrasonic because I found a (much) cheaper deal on them.

This project was an immense learning experience with my two main electronic components, the ultrasonic sensors and servos.

As stated at the end-of-semester presentations, and as the rest of the class and I probably learnt over the course of this project: nothing is ever clean, digitally speaking.

The bulk of my problems came from trying to get “proper” readings from my ultrasonic sensors.

I started with the NewPing Arduino library for ultrasonic sensors, along with its example code for running multiple sensors in an array.
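
For reference, a minimal sketch of what that kind of setup looks like with NewPing; the pin numbers and sensor count here are placeholders, not my actual wiring:

```cpp
#include <NewPing.h>

#define SONAR_NUM 3      // number of sensors (placeholder)
#define MAX_DISTANCE 200 // maximum range to report, in cm

// Each NewPing takes (trigger pin, echo pin, max distance).
NewPing sonar[SONAR_NUM] = {
  NewPing(2, 3, MAX_DISTANCE),
  NewPing(4, 5, MAX_DISTANCE),
  NewPing(6, 7, MAX_DISTANCE)
};

void setup() {
  Serial.begin(9600);
}

void loop() {
  for (uint8_t i = 0; i < SONAR_NUM; i++) {
    delay(33); // space the pings out so echoes don't cross-talk
    Serial.print(i);
    Serial.print(": ");
    Serial.print(sonar[i].ping_cm()); // distance in cm; 0 means no echo
    Serial.println("cm");
  }
}
```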

The above gif is my first success: the ultrasonic sensors running, and the servo running my “noBody();” function, which is the behaviour for when the sensors don’t detect anyone. At this stage, though, these were two separate events running side by side; the sensor readings weren’t actually affecting the servo yet.
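
I no longer have the original sketch on hand, but the idle behaviour was essentially a slow back-and-forth sweep, something along these lines (the pin, angles and timing are illustrative):

```cpp
#include <Servo.h>

Servo head;

void setup() {
  head.attach(9); // servo signal wire on pin 9 (placeholder)
}

// Idle behaviour for when the sensors detect no one: sweep slowly.
void noBody() {
  for (int angle = 0; angle <= 180; angle += 2) {
    head.write(angle);
    delay(15); // give the servo time to reach each position
  }
  for (int angle = 180; angle >= 0; angle -= 2) {
    head.write(angle);
    delay(15);
  }
}

void loop() {
  noBody();
}
```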

An issue that popped up at this stage, and that I didn’t figure out for a while, was that the servo’s turning blocked my ultrasonic sensors from taking readings.

And with the readings not being clean, odd numbers were especially prevalent at this stage. Very, VERY often, the sensors would return erratic values, sometimes reading 0 (depending on the code). I borrowed everything from an iCubeX to an Arduino Mega to try my device on, but to no avail.
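
One thing that helped take the edge off (though it never fixed the root cause) was filtering the readings: discarding the zero “no echo” results and using NewPing’s built-in median ping. A sketch of the idea, with placeholder pins:

```cpp
#include <NewPing.h>

NewPing sonar(2, 3, 200); // trigger pin, echo pin, max distance in cm

// Take the median of 5 pings to smooth over one-off erratic readings.
// ping_median() returns the echo time in microseconds; 0 means no echo.
unsigned int readDistanceCm() {
  unsigned long echoTime = sonar.ping_median(5);
  return sonar.convert_cm(echoTime);
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  unsigned int cm = readDistanceCm();
  if (cm > 0) { // treat 0 as "no reading", not as "0cm away"
    Serial.println(cm);
  }
}
```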

This was a really big problem for the interaction, as these odd readings would trigger things that I didn’t want triggered.

Eventually I whittled the problem down to the delay. The delay was what stopped my device from moving along smoothly AND was making my readings odd, for reasons I still don’t fully understand. I came to this conclusion by first reducing the delay from 500 down to 33, which was a big breakthrough because the readings suddenly became much smoother; at that point I took it out completely, and at first it was like magic how smooth the readings were.
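
In hindsight, the fix amounts to the standard Arduino pattern of replacing delay() with millis()-based timing, so the pings and the servo never block each other. A rough sketch of the pattern (pins and interval are placeholders):

```cpp
#include <NewPing.h>
#include <Servo.h>

NewPing sonar(2, 3, 200); // placeholder pins
Servo head;

unsigned long lastPing = 0;
const unsigned long PING_INTERVAL = 33; // ms between pings (~29ms minimum)

void setup() {
  Serial.begin(9600);
  head.attach(9);
}

void loop() {
  // Only ping when enough time has passed; never stall the loop with delay().
  if (millis() - lastPing >= PING_INTERVAL) {
    lastPing = millis();
    unsigned int cm = sonar.ping_cm();
    Serial.println(cm);
  }

  // Servo logic runs every pass through loop(), unblocked by the pings.
}
```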

M A G I K S

At this stage it was working decently, but this was not to last; subsequent additions to the code seemed to have broken it once again.

The servos were another big problem. My bigger (INSERT MODEL NUMBER) servo wasn’t working very well, and I was very afraid that I had a power issue on my hands, or that I had a faulty servo.

Fortunately, I had bought two! Neither of which worked. This was another big issue and learning point for me in the project. I scoured the web and eventually found a way to test these servos.
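
The test itself was essentially the standard sweep sketch; a healthy positional servo should move crisply to each angle and hold it (the pin is a placeholder):

```cpp
#include <Servo.h>

Servo testServo;

void setup() {
  testServo.attach(9); // signal wire on pin 9 (placeholder)
}

void loop() {
  // Step through known angles; a working servo tracks them cleanly.
  testServo.write(0);
  delay(1000);
  testServo.write(90);
  delay(1000);
  testServo.write(180);
  delay(1000);
}
```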

I discovered that both servos could spin continuously, which was not something I had expected. The problem was that they would go off centre very easily and then be unable to spin again, leaving me unable to control them with the precision I needed.

Fortunately (a constant word with these projects), I was able to borrow servos from a fellow classmate.

But the reading issues were now getting very tricky: with most of the hardware settled, the code was still somehow not working.

I scrapped my initial base code and rewrote it without the array; that way I could address the individual sensors directly, in a way I understood better than having them in an array.
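
In practice, that just meant giving each sensor its own named NewPing object instead of indexing into an array, roughly like this (pins are placeholders, and the pings are spaced with short delays purely for illustration):

```cpp
#include <NewPing.h>

// One named object per sensor: easier for me to reason about than sonar[i].
NewPing sonarLeft(2, 3, 200);
NewPing sonarCentre(4, 5, 200);
NewPing sonarRight(6, 7, 200);

void setup() {
  Serial.begin(9600);
}

void loop() {
  unsigned int left = sonarLeft.ping_cm();
  delay(33); // let each echo die down before the next sensor pings
  unsigned int centre = sonarCentre.ping_cm();
  delay(33);
  unsigned int right = sonarRight.ping_cm();
  delay(33);

  Serial.print(left);
  Serial.print(" / ");
  Serial.print(centre);
  Serial.print(" / ");
  Serial.println(right);
}
```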

My next fix was using DC power for the Arduino, which actually made quite a big difference to my ultrasonic sensor readings and seemed to get everything running smoothly again.

The next step was putting it all together, and once again something wasn’t quite right. This portion is still a bit of a mystery, but it sorted itself out.

While my final form was quite janky, I was pleasantly surprised with how well it worked.

The project brought with it quite a number of learning points.

It helped me fine-tune my troubleshooting abilities and taught me to work within the limitations of the “dirty” nature of electronics and its sensitivity to interference (and power supply).

And with that, DM3005 draws to an end. Thank you very much~

(I’m quite pleased with the video, especially with that prime moment at 0:38)

 

 

For the semester project, I am planning to make a tree that looks at people who approach it.

I intend to use the iCubeX with a small geared motor to facilitate the free rotation of the tree.

Optical or distance sensors with an effective range of 80cm should be adequate to detect nearby people.

The diagram below is a simplified version of the device.

I’m currently having second thoughts about the fibre optics, as I’m not sure of the benefits of adding them; they currently feel like a very arbitrary complexity and appear to be more of an aesthetic aspect of the project.

Lastly, I’ve considered adding auditory feedback, as it gives the user feedback on the device’s different states of function: searching for people, finding a target, and being locked on and facing a target.
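
If I do add it, the three states map naturally onto a simple switch; a rough sketch of what I have in mind (the buzzer pin and tones are placeholders, nothing is decided yet):

```cpp
enum State { SEARCHING, FOUND, LOCKED_ON };
State state = SEARCHING;

const int BUZZER_PIN = 8; // placeholder pin

void setup() {
  pinMode(BUZZER_PIN, OUTPUT);
}

// A different audio cue for each state, using Arduino's built-in tone().
void announceState() {
  switch (state) {
    case SEARCHING:  tone(BUZZER_PIN, 440, 100);  break; // short low beep
    case FOUND:      tone(BUZZER_PIN, 880, 100);  break; // higher beep
    case LOCKED_ON:  tone(BUZZER_PIN, 1320, 300); break; // long high beep
  }
}

void loop() {
  announceState();
  delay(1000); // repeat the current state's cue once a second
}
```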

The Surface Dial is an additional tool that works primarily with the Microsoft Surface Studio PC.

The dial is a minimalist knob that can turn freely to provide the user with a myriad of functions, from tilting the canvas and switching between tools to changing colours on the fly; it can even adjust the volume (wow!). The dial can be clicked like a big button and also has haptic feedback.

sorry, but it doesn’t make the back of the pc translucent

Placing the Surface Dial on the MS Surface PC brings up a variety of utilities around the dial itself. Coupled with the touchscreen and the pen for use on the Surface PC’s screen, it gives users a very smooth and innate way to bring their concepts and designs to the digital platform without feeling like the technical know-how of the software is getting in the way.

The Surface Dial shows just how intuitive alternate modes of control can be, as opposed to shortcuts or functions stuck behind layers of drop-down menus. It also brings to mind how the current tools of mouse and keyboard can limit the user experience, and in turn possibly shape software controls to be more cumbersome than they need be. It also doesn’t hurt that the graphics and animation around the dial are stunning and pleasing to the eye.

The Surface Dial, however, isn’t the first of its kind. Previously there was the Griffin PowerMate, from as early as 2002. On Kickstarter there’s even a ‘Rev-O-mate’ from Japan priced at USD$75, as opposed to the Surface Dial’s USD$99; that’s not including the price of the Surface PC setup, which runs into the thousands (though the dial can still be used with regular computers, without the on-screen functionality).

The PowerMate and Rev-O-mate might not be the stunner that the dial is in conjunction with the Surface PC, but they hold their own for artists looking for cheaper alternatives for more intuitive control over colour correction, audio engineering and digital painting. The Surface Dial, however, takes things a step further with its abilities, additional visual information and ease of use when placed upon the Surface PC itself, making something that has actually been around for a while break new ground.

Hearing is something that, once lost, is nigh impossible to recover fully, which is why it’s so important to preserve our ability to hear as much as we can. With concerts on the rise in Singapore, and music and loud sounds being part of any event, people often end up unprepared in these high-volume situations, without any protection for their ears.

The Here Ones are a pair of earbuds that can help with that, amongst a myriad of other things they can do. Besides playing music, these earbuds can augment the sound around you in real time. This is handy for music events where the user would like to protect their hearing and still enjoy the music without the muffling that cheap earbuds can cause.

The Here Ones are controlled through a phone app, where you can fine-tune a whole multitude of variables. You can EQ both your music and the world around you, and block out specific frequencies (e.g. if you want to tone down a certain instrument in the band, you can attenuate the frequencies that instrument lies in); you can even add effects to your world for fun, like reverb, distortion and even flanging.

The downside, however, is its battery life: 2 hours. It comes with a carrying case that doubles as a charger, with a charge time of an hour. Some would argue that situations where you’d use the earbuds usually wouldn’t last much longer than 2 hours, and that a larger battery would only mean a longer charge time as well. This brings to mind an important part of making devices: a conscious decision has to be made to strike a balance between usability trade-offs.

There are also limits to its control over real-world audio. If the volume is too low and the seal between the buds and the ear canal is too loose, then sound leaks back in past the buds. This, however, can be solved with appropriately sized or custom-fit buds, and perhaps by tweaking the noise-cancelling software.

Lastly, the Here Ones aren’t the only ones on the market; the iQbuds are an often-compared competitor, and of course there are pros and cons to both. I think either is a suitable purchase for any hearing protection/augmentation needs in this increasingly loud world.

For people plagued with disorders such as Parkinson’s, hand dexterity is heavily hindered by the tremors that arise from the disorder. This alters any activity requiring even the most basic coordination, making it much harder or even near impossible, and causing those affected to require more time, or even assistance, with these tasks.

Fortunately, there are people who have worked to make a spoon with stabilising technology that drastically reduces the effects of the tremors on the spoon, allowing those afflicted to once again be independent and capable of feeding themselves in a cleaner and more efficient manner.

The spoon works through an algorithm that detects the movement of the hand, decides whether the movement is intentional or unintentional, and then compensates for unintentional movement by moving in the opposite direction. This has eliminated up to 70% of the movement brought about by the various conditions causing these tremors.
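
This is obviously not Liftware’s actual algorithm, but the general idea of splitting tremor from intentional motion can be sketched with a simple low-pass filter: slow movements are assumed intentional and passed through, while the fast oscillation is cancelled. A toy simulation under those assumptions:

```cpp
// Toy illustration of tremor cancellation; NOT Liftware's real algorithm.
// Intentional motion is slow, while tremor is a fast oscillation, so a
// low-pass filter can estimate the intentional part; the spoon head then
// follows that estimate, cancelling the high-frequency remainder.
#include <cmath>
#include <cstdio>

int main() {
  const double PI = 3.14159265358979;
  const double dt = 0.01;    // 100 Hz sample rate (assumed)
  const double alpha = 0.05; // smoothing factor for the low-pass filter

  double slowEstimate = 0.0; // running estimate of intentional motion

  for (int i = 0; i < 500; ++i) {
    double t = i * dt;
    double intentional = 0.5 * t;                   // slow, deliberate reach
    double tremor = 0.2 * std::sin(2 * PI * 6 * t); // 6 Hz hand tremor
    double hand = intentional + tremor;

    // Exponential moving average acts as a simple low-pass filter.
    slowEstimate += alpha * (hand - slowEstimate);

    // Counter the estimated tremor by moving opposite to it, so the
    // spoon head ends up following only the intentional component.
    double spoon = hand - (hand - slowEstimate);

    if (i % 100 == 0)
      std::printf("t=%.2f hand=%+.3f spoon=%+.3f\n", t, hand, spoon);
  }
  return 0;
}
```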

The act of eating is something that those without mobility or dexterity issues probably don’t give much thought to, concentrating more on the food itself. These sorts of issues only make themselves apparent once we lose the ability to function at a more regular level. Thankfully, with advancements in technology, much like prosthetic replacements, we can help those affected regain their independence, and that makes a whole world of difference to them on a physical, mental and emotional level.

 

PS. They also now have the Liftware Level, which caters more towards those with limited arm mobility.

The field trip to the ArtScience Museum was a pretty fun one. Aside from seeing examples in class, it’s really cool to go and see some of them in person and actually be able to interact with a few. It gave me a sense of the wider range of considerations that go into making an interactive experience, especially in the logistics department. Having only done interactive work in a school setting, the length and breadth of things in the real world appears vastly deeper, though not in any bad way; it’s more of an eye-opening perspective on the management, running and transport involved in a show like this (and I can personally see myself enjoying this sort of thing).

Speaking of the works themselves, I immensely enjoyed what was on display: from Stelarc’s mixing of electronics, body and the senses, to modified babies, and even silkworms modified with spider DNA to make a super-strong hybrid silk.

But for now I’ll be touching upon Neil Harbisson’s contribution to the exhibit. I first found out about him through a TED Talk he did previously, so I was quite excited about his work being part of the show. Neil is colourblind and uses an over-the-head antenna attached to his skull to convey colour through sound; I thought this was pretty fascinating on its own.

Now, in the ArtScience Museum, there is an extension of his senses: a replica bust of his head, with an antenna on it as well.

This sends sounds to his own device through the internet (his device has internet capabilities as well).

The concept of having an extension, replacement and/or addition to the senses sets me pondering the various aspects of the whole case.

First off, having an extension of a sense is not something I’d considered before; being physically present with your senses never felt like it had an alternative. Now Neil has five different places where his sense of colour can be; he even has one in space. What if you could smell the place where you spent your childhood? No doubt it’s not the same thing, but it would send memories flooding back. For the disabled and the immobile, perhaps this is fertile ground to explore the next best thing. Mobile devices have made our world a smaller place by drawing everything we can think of to our fingertips, but our senses are pretty much untapped. Movies and theme parks have also tried to bring smell and touch into the visual experience, but nothing has reached the everyday consumer level (even with movies).

It’s exactly because of the complexity of our senses that they’re so hard to replicate and package to send around. In Neil’s case, he mentioned that it took a while before his brain and the software lined up, but it has now come to the point where his brain has actually changed in its processing; he even dreams in colour, with his mind making up the sounds rather than the device, and he says this was the point at which he felt like a cyborg. It was no longer a separate device, but an extension.

Another aspect would be the more “legal” side of things. Who’s to say when an extension is permitted? Should there be any constraints in the first place? What kinds of extensions should be allowed? Even finding a willing surgeon to carry out the procedure no doubt took him a while. Moon Ribas, whose work is featured right next to Neil’s, has a seismic arm implant that lets users experience the earthquakes she feels in real time. She is the co-founder, along with Neil, of the Cyborg Foundation, which deals with cyborg rights and promotes being a cyborg as socially acceptable.

And yet, at first glance, all the exhibit is, is a statue with a receiver of information.

The gravity of the work, with all its complexities, nuances, issues, conflicts and the life stories of those involved, is tucked away, out of the quick gaze of the exhibit’s patrons.

The MoleScope is a medical tool to assist in the processing and screening of people at a preventive and/or post-melanoma stage.

With 1 in 5 Americans being diagnosed with skin cancer, watching for changes or the appearance of precursory symptoms is an important field to survey.

However, the medical world struggles to give everyone enough time and attention to stay vigilant for these symptoms. Screening everyone intensively would create longer wait lists for all, overworked doctors, and higher medical fees.

The MoleScope is a great tool for indicating to the untrained eye whether or not there is cause for concern.

This, however, is not the first product in the world of patient skin care to visually record and catalogue moles.

A previous example would be SkinVision, an app that uses the phone’s camera to take the snap. While it also analysed and tracked moles, its shortcoming was in using only the phone’s camera, which, while good, lacks the sub-dermal imaging capabilities the MoleScope provides with its dedicated device that attaches over the phone’s camera.

Images taken by the MoleScope are sent to a doctor, who reviews them and sends a recommended course of action back to the user.

The MoleScope is also great in its comprehensive functionality: there is a side for patients AND for healthcare professionals, through their DermEngine web platform.

This provides both the doctor and the patient a place to go over the current state and the progression of any notable lesions, and to assess the situation. This is an immense ease on the doctor, who can go over the information at a moment’s notice, and it helps eliminate the queues and all the administrative aspects that go along with a hospital visit. Patients can also feel at ease knowing they can get medical attention as soon as something noteworthy shows up; additionally, being able to review their own progress helps the patient keep a good grasp of the situation, which in turn puts them at ease too (plus transport costs and time are virtually gone).

In 2011, a much earlier device from another company, called MelaFind, was reviewed to be a little tricky in its diagnoses. While it missed only 2% of biopsy-proven melanomas, it also had a high false-positive rate (LINK to article). But the article also states that “…in the same clinical trial, a panel of dermatologists who did not use the device had an even higher false-positive rate.”

It’s been 6 years since then. MelaFind was a rather large handheld device resembling an oversized hairdryer; technology has no doubt progressed since, and the MoleScope is now merely a clip-on device for the user’s phone.

Herein lies another problem: the camera mount is phone-specific. At its initial release, it was only available for the iPhone 5c to 6+ models, with Android support coming shortly after. With the large variation in phone form factors, this is ultimately going to leave some people unable to use the device.

Another noteworthy point I found in an article was that Android support is critical for health apps “due to key health demographics disproportionally utilizing the Android platform.”

Through this one device, I’ve come to realize that in some instances a device’s capabilities extend far beyond its physical aspects; there are also secondary and even tertiary levels to consider, such as a stable website and platform, plus server and database needs. Additionally, as good as an idea might seem, it requires its intended service sector to adopt it, which means spending money and resources on new technology, something the industry might not find easy to immediately jump on or commit to.