by Putri Dina and Hannah Kwah


Particle Videos

Neutral State

Human Detected State







EditorX: http://infusionsystems.com/catalog/product_info.php/products_id/403


Select the USB port

Switch to the sensor input that we are using. When the touch sensor is pressed, a waveform graph shows the touch response. Multiple sensors can be used at the same time (up to 8 sensors).
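Since each of the eight inputs reports a raw value, a simple threshold check can tell a real touch apart from idle noise. A minimal Python sketch of that idea (the threshold of 10 and the example readings are assumptions for illustration, not EditorX specifics):

```python
# Map raw sensor readings (0-127, the MIDI value range) to touched/untouched
# states for up to eight inputs. THRESHOLD is an assumed noise floor.
THRESHOLD = 10

def touched_channels(readings):
    """Return the indices of channels whose reading exceeds the threshold."""
    return [i for i, value in enumerate(readings) if value > THRESHOLD]

# Example: channels 2 and 5 are being pressed, the rest are idle noise.
readings = [0, 3, 90, 1, 0, 64, 2, 0]
print(touched_channels(readings))  # [2, 5]
```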

Click the edit button to manage that specific input.

After clicking the edit button, a drop-down will appear, where we can optionally edit the values.

Touch v1.5: http://infusionsystems.com/catalog/product_info.php/products_id/135


Max Patch

Experimenting with sending the values from EditorX into Max.
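One way to experiment with the incoming values is to normalize the 0–127 MIDI range to 0.0–1.0 before mapping it onto the visuals, much like Max's scale object. A small Python sketch of that normalization (the 0–127 range is an assumption about what EditorX sends):

```python
# Scale a raw 0-127 sensor value into the 0.0-1.0 range, clamping
# out-of-range input, before mapping it to a visual parameter.
def normalize(value, lo=0, hi=127):
    value = max(lo, min(hi, value))
    return (value - lo) / (hi - lo)

print(normalize(0))    # 0.0
print(normalize(127))  # 1.0
```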


Feedback from LPD

  • Make the interaction richer
  • Decide what happens when two sensors are touched at the same time
  • The project is currently at entry level; make it more complex.
    E.g. when the user strokes both cheeks, the cat will purr; however, after a long period of time, it will get annoyed and respond differently.
  • The video can be more dynamic, since it currently focuses only on levels of emotion. Add elements that surprise the user.
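The richer interaction suggested above can be sketched as a small state function: stroking both cheeks makes the cat purr, but if the stroking goes on too long the cat gets annoyed. This is only a Python sketch of the logic; the sensor names, the "alert" state, and the five-second limit are assumptions:

```python
# Sketch of the two-cheek interaction: purr while both cheeks are stroked,
# switch to annoyed if the stroking lasts too long. The 5-second limit is
# an assumed value to be tuned during prototyping.
ANNOYED_AFTER = 5.0  # seconds of continuous two-cheek stroking

def cat_state(left_cheek, right_cheek, stroke_duration):
    """left_cheek/right_cheek: True if that sensor is touched.
    stroke_duration: seconds both sensors have been held simultaneously."""
    if left_cheek and right_cheek:
        return "annoyed" if stroke_duration >= ANNOYED_AFTER else "purr"
    if left_cheek or right_cheek:
        return "alert"      # human detected on one side only
    return "neutral"

print(cat_state(True, True, 2.0))    # purr
print(cat_state(True, True, 6.0))    # annoyed
print(cat_state(False, False, 0.0))  # neutral
```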


Moving forward
  • Finalize concept – what will happen when multiple parts are triggered
  • Get the patches to work with multiple touch sensors
  • Get values for each sensor
  • Prototype & buy materials