Research Study: B

Themes: Gestural Input, Interpretation, Collaboration(?)

In this project, I want to explore the separation of the body in space and how effectively movements (both small, detail-oriented ones and grandiose ones) can be translated by a machine to create art. I plan to use a soft sensing glove and a sponge ball, along with a motion-tracking camera that targets the gloved hand in front of the participant. The primary input would be hand movement. Several feet away, a robot with a paint tube at its core would move over a flat canvas based on the participant’s hand movements and gestures. The paint would drip at a consistent rate, but when the person squeezes the ball with the gloved hand, the pressure sensor would trigger a corresponding, gradual squeeze of the paint tube in the robot.
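The mapping described above could be sketched as two small pieces of control logic: one that maps the tracked hand position from camera coordinates to robot canvas coordinates, and one that smooths the glove’s pressure reading into a gradual squeeze command on top of the constant drip. This is a minimal illustration, not an implementation plan; every name, constant, and unit here is a hypothetical assumption.

```python
def scale_to_canvas(hand_x, hand_y, cam_w=640, cam_h=480,
                    canvas_w=1.0, canvas_h=1.0):
    """Map a tracked hand position (camera pixels) to a point on the
    robot's canvas (assumed here to be measured in meters)."""
    return hand_x / cam_w * canvas_w, hand_y / cam_h * canvas_h


class PaintController:
    """Turns a raw, normalized pressure reading into a gradual squeeze
    of the paint tube, layered on top of a constant base drip rate."""

    def __init__(self, base_flow=0.1, gain=0.9, smoothing=0.2):
        self.base_flow = base_flow   # constant drip when the ball is untouched
        self.gain = gain             # extra flow added at full squeeze
        self.smoothing = smoothing   # low-pass factor: smaller = more gradual
        self.level = 0.0             # smoothed squeeze level in [0, 1]

    def update(self, pressure):
        """pressure: sensor reading normalized to [0, 1].
        Returns a flow command in [0, 1] for the tube-squeezing motor."""
        pressure = min(max(pressure, 0.0), 1.0)
        # Exponential smoothing: a sharp squeeze of the sponge ball
        # produces a gradual, ramping squeeze of the paint tube.
        self.level += self.smoothing * (pressure - self.level)
        return min(self.base_flow + self.gain * self.level, 1.0)
```

In use, each camera frame would feed `scale_to_canvas` to reposition the robot, while each sensor sample would feed `PaintController.update` to set the paint flow; the smoothing constant is what makes the squeeze feel gradual rather than instantaneous.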


Jung, B., & Sukhatme, G. (2010). Real-time motion tracking from a mobile robot. International Journal of Social Robotics, 2, 63–78. doi:10.1007/s12369-009-0038-y.

Huang, C. M., Andrist, S., Sauppé, A., & Mutlu, B. (2015). Using gaze patterns to predict task intent in collaboration. Frontiers in Psychology, 6, 1049.

Markovic, I., Chaumette, F., & Petrovic, I. (2014). Moving object detection, tracking and following using an omnidirectional camera on a mobile robot. Proceedings of the IEEE International Conference on Robotics and Automation. doi:10.1109/ICRA.2014.6907687.

Hammond, F. L., Mengüç, Y., & Wood, R. J. (2014). Toward a modular soft sensor-embedded glove for human hand motion and tactile pressure measurement. 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA, 4000–4007. doi:10.1109/IROS.2014.6943125.
