Day 5 (Tue Sep 9, Week 3): Research Paper Review 1
Notes for 2025-09-09.
Notes from Day 4
Highlights of Precedent Work
Interactive Robotic Painting Machine, Ben Grosser, contextual interaction, co-creation with audience and environment, gestural personality
Double-Taker (Snout), Golan Levin, social robot, general audience, gestural personality
Colored Sculpture, Jordan Wolfson, kinetic performance, gallery audience
The Blind Robot, Louis-Philippe Demers, social robot, gallery audience, tactile interaction, deception
hitchBOT, David Harris Smith and Frauke Zeller, social robot, general audience, human actuation
other art examples mentioned in class: Helpless Robot
other research examples mentioned in class: Mechanical Ottoman [23], Rock-Paper-Scissors [22], grocery-aisle passing [21]
New Assignments
New assignment, due by the start of class on Thursday (Sep 11): create a single-slide project pitch and submit it to the Assignment 3 Shared Folder. Please include, at minimum: a title, a sketch, a brief interaction description, and a brief context description.
Administrative
Please remember: if you can’t make any of the usual EH&S fire extinguisher training sessions to qualify for laser access, IDeATe is hosting two local sessions, no sign-up required:
Tue, Sep 9, 2:30 PM in HL A4 Near (near vending machines)
Wed, Sep 10, 10:00 AM in HL 106C (Studio B in library lobby)
The first FriDeATe event (social hour with free food) will be Friday, Sep 12, 5-6 PM in the basement open area.
Agenda
Research Paper Discussion
[23] D. Sirkin, B. Mok, S. Yang, and W. Ju, “Mechanical Ottoman: How Robotic Furniture Offers and Withdraws Support,” in 2015 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Mar. 2015, pp. 11–18. Accessed: Aug. 22, 2025. [Online]. Available: https://ieeexplore.ieee.org/document/8520640
[22] E. Short, J. Hart, M. Vu, and B. Scassellati, “No fair‼ An interaction with a cheating robot,” in 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Mar. 2010, pp. 219–226. doi: 10.1109/HRI.2010.5453193.
[8] E. Dula, A. Rosero, and E. Phillips, “Identifying Dark Patterns in Social Robot Behavior,” in 2023 Systems and Information Engineering Design Symposium (SIEDS), Apr. 2023, pp. 7–12. doi: 10.1109/SIEDS58326.2023.10137912.
Comments on approaches to academic literature.
General discussion prompts:
How can we characterize the specific experimental process?
How would we characterize the specific machine-human interaction?
How are gesture and movement used to create deception?
What aspects of the work are general or theoretical?
How similar is the research process to your own discipline?
Were the implementation details familiar?
What surprised you most?
Is there a principle we can borrow for our work?
Resource Review
computing
Linux PC
Raspberry Pi
Raspberry Pi Pico
Arduino
small-scale sensing (mostly proximity and user interface)
sonar
light beams
proximity
capacitive touch
switches
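As a concrete illustration of the sonar option above: an ultrasonic rangefinder (an HC-SR04-style module is assumed here, not a course-specified part) reports an echo pulse whose duration covers the round trip to the target, so the timing math is just a scale and a halving:

```python
# Convert an ultrasonic echo pulse to a distance.
# Assumes an HC-SR04-style sensor and ~343 m/s speed of sound at room temperature.

SPEED_OF_SOUND_CM_PER_US = 0.0343  # centimeters per microsecond

def echo_to_distance_cm(pulse_width_us: float) -> float:
    """Return distance in cm from an echo pulse width in microseconds.

    The pulse spans the round trip (out and back), so divide by two.
    """
    return pulse_width_us * SPEED_OF_SOUND_CM_PER_US / 2

# A 1000 us echo corresponds to roughly 17 cm.
print(round(echo_to_distance_cm(1000), 2))  # → 17.15
```

The same two-line conversion applies whether the pulse is timed on an Arduino, a Pico, or a Linux PC with a GPIO adapter.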
vision
machine vision camera: Blackfly S Mono 1.6 MP GigE Vision (Sony IMX273)
webcams
LLM with image processing (e.g. gemma3)
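To make the "LLM with image processing" option concrete: a local Ollama server exposes a `/api/generate` endpoint that accepts base64-encoded images alongside a prompt. A minimal sketch of building that request (the model name `gemma3` comes from the notes; the server address and response handling are assumptions about a default local install):

```python
import base64
import json

def build_vision_request(prompt: str, image_bytes: bytes, model: str = "gemma3") -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    Ollama expects images as base64-encoded strings in an "images" list.
    """
    return {
        "model": model,
        "prompt": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,
    }

# Sending it to a locally running Ollama server (assumed at the default port):
#
#   import urllib.request
#   body = json.dumps(build_vision_request("Describe this scene.",
#                                          open("frame.jpg", "rb").read()))
#   req = urllib.request.Request("http://localhost:11434/api/generate",
#                                data=body.encode(),
#                                headers={"Content-Type": "application/json"})
#   print(json.loads(urllib.request.urlopen(req).read())["response"])
```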
actuators
small-scale: gearmotors, hobby servos
large-scale: window gearmotors, linear actuators, pneumatics
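For the small-scale actuators, a typical hobby servo is driven by a 50 Hz PWM signal whose pulse width sets the angle. A sketch of that mapping, scaled to the 16-bit duty value MicroPython's PWM uses on a Raspberry Pi Pico (the 1000-2000 µs endpoints are a common convention, not a guarantee for any particular servo):

```python
# Map a servo angle to a PWM pulse width, assuming a common hobby-servo
# convention: 1000 us at 0 degrees, 2000 us at 180 degrees, 50 Hz frame (20 ms).

MIN_PULSE_US = 1000   # assumed pulse width at 0 degrees
MAX_PULSE_US = 2000   # assumed pulse width at 180 degrees
FRAME_US = 20000      # 50 Hz PWM period

def angle_to_pulse_us(angle_deg: float) -> float:
    """Linearly map 0-180 degrees onto the servo's pulse-width range."""
    angle_deg = max(0.0, min(180.0, angle_deg))  # clamp out-of-range requests
    return MIN_PULSE_US + (angle_deg / 180.0) * (MAX_PULSE_US - MIN_PULSE_US)

def pulse_to_duty_u16(pulse_us: float) -> int:
    """Scale a pulse width to a 16-bit duty value (0-65535) over one frame."""
    return round(pulse_us / FRAME_US * 65535)

# On a Pico this value would be passed to machine.PWM(...).duty_u16(...).
print(pulse_to_duty_u16(angle_to_pulse_us(90)))  # → 4915
```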
platforms
AI Makerspace: Cozmo, Misty, Kinova, Pepper
potentially: RC car, etc.
machine learning and language generation
Python libraries, e.g. scikit-learn
local Ollama model
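The scikit-learn option could look like this in practice: a tiny nearest-neighbor classifier labeling interaction gestures from sensor features. The feature names, values, and labels below are invented for illustration, not project data:

```python
from sklearn.neighbors import KNeighborsClassifier

# Toy training data: pretend each row is (proximity_cm, touch_duration_s)
# and each label names an interaction gesture. Values are made up.
features = [[5, 0.1], [8, 0.3], [60, 0.0], [75, 0.0]]
labels = ["touch", "touch", "approach", "approach"]

clf = KNeighborsClassifier(n_neighbors=1)
clf.fit(features, labels)

# A close, brief contact should classify as a touch gesture.
print(clf.predict([[6, 0.2]])[0])  # → touch
```

The same fit/predict pattern extends to any of the sensing options listed above once readings are collected into feature rows.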
Breakout Activity
For each project we need to identify:
purpose of interaction (e.g. functional, play)
type of interaction (e.g. transient, deliberate)
location
audience
the nature of deception (e.g. affect, secrecy)
the purpose of deception (e.g. social integration, human manipulation, information extraction)
the overt and covert text
a physical context: form and gesture
behaviors, either autonomous or puppeted