Day 5 (Tue Sep 9, Week 3): Research Paper Review 1

Notes for 2025-09-09.

Notes from Day 4

Highlights of Precedent Work

New Assignments

New assignment, due by the start of class on Thursday (Sep 11): create a single-slide project pitch and submit it to the Assignment 3 Shared Folder. Please include at minimum: a title, a sketch, a brief interaction description, and a brief context description.

Administrative

  1. Please remember: if you can’t make any of the usual EH&S fire extinguisher training sessions to qualify for laser access, IDeATe is hosting two local sessions, no sign-up required:

    • Tue, Sep 9, 2:30 PM in HL A4 Near (near vending machines)

    • Wed, Sep 10, 10:00 AM in HL 106C (Studio B in library lobby)

  2. The first FriDeATe (social event with free food) will be Friday, Sep 12, 5–6 PM in the basement open area.

Agenda

Research Paper Discussion

[23] D. Sirkin, B. Mok, S. Yang, and W. Ju, “Mechanical Ottoman: How Robotic Furniture Offers and Withdraws Support,” in 2015 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Mar. 2015, pp. 11–18. Accessed: Aug. 22, 2025. [Online]. Available: https://ieeexplore.ieee.org/document/8520640

[22] E. Short, J. Hart, M. Vu, and B. Scassellati, “No fair‼ An interaction with a cheating robot,” in 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Mar. 2010, pp. 219–226. doi: 10.1109/HRI.2010.5453193.

[8] E. Dula, A. Rosero, and E. Phillips, “Identifying Dark Patterns in Social Robot Behavior,” in 2023 Systems and Information Engineering Design Symposium (SIEDS), Apr. 2023, pp. 7–12. doi: 10.1109/SIEDS58326.2023.10137912.

  1. Comments on approaches to academic literature.

  2. General discussion prompts:

    • How can we characterize the specific experimental process?

    • How can we characterize the specific machine-human interaction?

    • How are gesture and movement used to create deception?

    • What aspects of the work are general or theoretical?

    • How similar is the research process to your own discipline?

    • Were the implementation details familiar?

    • What surprised you most?

    • Is there a principle we can borrow for our work?

Resource Review

  1. computing

    1. Linux PC

    2. Raspberry Pi

    3. Raspberry Pi Pico

    4. Arduino

  2. small-scale sensing (mostly proximity and user interface)

    1. sonar

    2. light beams

    3. proximity

    4. capacitive touch

    5. switches

  3. vision

    1. machine vision camera (Blackfly S Mono 1.6 MP GigE Vision, Sony IMX273 sensor)

    2. web cams

    3. OpenCV

    4. LLM with image processing (e.g. gemma3)

  4. actuators

    1. small-scale: gearmotors, hobby servos

    2. large-scale: window gearmotors, linear actuators, pneumatics

  5. platforms

    1. AI Makerspace: Cozmo, Misty, Kinova, Pepper

    2. Dobot Magician Lite robot arm

    3. Pololu Balboa

    4. potentially: RC car, etc.

  6. machine learning and language generation

    1. Python libraries, e.g. scikit-learn

    2. local Ollama model
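As a point of reference for item 6, the classical machine-learning tools listed above (e.g. scikit-learn classifiers) boil down to pattern classifiers over sensor or feature data. Here is a toy pure-Python nearest-neighbor sketch of that idea; the sensor readings and labels are invented for illustration, and a real project would likely use scikit-learn's KNeighborsClassifier instead:

```python
import math

def nearest_neighbor(samples, labels, query):
    """Return the label of the training sample closest to `query`.

    A toy stand-in for the kind of classifier scikit-learn provides;
    illustrative only.
    """
    best = min(range(len(samples)),
               key=lambda i: math.dist(samples[i], query))
    return labels[best]

# Hypothetical readings: (sonar distance in cm, light level) -> "near"/"far"
samples = [(10, 0.8), (12, 0.7), (90, 0.2), (100, 0.1)]
labels = ["near", "near", "far", "far"]
print(nearest_neighbor(samples, labels, (15, 0.6)))  # → near
```

The same shape of problem (labeled examples in, predicted label out) carries over directly to the scikit-learn API.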

Breakout Activity

For each project we need to identify:

  1. purpose of interaction (e.g. functional, play)

  2. type of interaction (e.g. transient, deliberate)

  3. location

  4. audience

  5. the nature of deception (e.g. affect, secrecy)

  6. the purpose of deception (e.g. social integration, human manipulation, information extraction)

  7. the overt and covert text

  8. a physical context: form and gesture

  9. behaviors, either autonomous or puppeted
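One lightweight way to record these nine attributes per project is a simple structured record. The sketch below uses a Python dataclass; all field names and the example values are invented for illustration, not part of the assignment:

```python
from dataclasses import dataclass

@dataclass
class ProjectProfile:
    """One row of the breakout checklist; fields mirror the nine items above."""
    purpose_of_interaction: str   # e.g. "functional", "play"
    interaction_type: str         # e.g. "transient", "deliberate"
    location: str
    audience: str
    deception_nature: str         # e.g. "affect", "secrecy"
    deception_purpose: str        # e.g. "social integration"
    overt_text: str               # what the interaction appears to say
    covert_text: str              # what it is actually doing
    physical_context: str         # form and gesture
    behaviors: str                # "autonomous" or "puppeted"

# Hypothetical example entry
profile = ProjectProfile(
    purpose_of_interaction="play",
    interaction_type="transient",
    location="library lobby",
    audience="passersby",
    deception_nature="affect",
    deception_purpose="social integration",
    overt_text="friendly greeter",
    covert_text="lures people toward the exhibit",
    physical_context="soft ottoman form, approach-and-retreat gesture",
    behaviors="puppeted",
)
```

Filling one of these out per breakout group makes the projects easy to compare side by side.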