Demo 4: Puppet Pet
The fourth demo emphasizes combining scripted and reactive behavior to create a personality. The objective is for each student pair to build a single device that can interact with a human through touch or motion, then develop individual scripts, each presenting a different affect.
Please keep your goals simple and commensurate with your skills. But please also help teach your partner skills if you are more experienced.
The pair will need to work together on a device with physical structure, actuation, and sensing. Pairs may also share core programming and motion primitives, but each student will be responsible for demonstrating an individual behavioral performance.
As before, the primary deliverable is a live in-class demo at the start of class on the due date, along with a brief blog entry.
Objectives
- work with a partner to develop a common physical platform for automated expression
- combine scripted and reactive behaviors in a single performance
- apply low-latency event-loop programming style
- develop reusable behavior primitives
- optional: develop custom notational system for efficient programming
Deliverables
- In-class demo at the start of class on the due date.
- Brief blog entry including:
- One or more embedded video clips of a person interacting with the mechanism.
- A brief paragraph outlining the intended behavior.
- Original CAD files as a zipped attachment (please, no Google Drive links; SolidWorks preferred).
- Arduino code (please use legible indentation and correct syntax highlighting).
Prompts
As with all these assignments, the threshold for success is set low to give beginners room to succeed, yet the full potential is limited only by your time and interest. Please focus on insightful solutions rather than brute-force development or implementation time.
The physical form makes expression and interaction possible, but should be kept simple so the emphasis stays on programmed behavior. Our options for human interaction include switches, sonar, accelerometers, light, radar, and capacitance sensing (in increasing order of difficulty). In all cases, it is important that the sensing process evoke the semiotics of touch rather than user input. That is, the form should invite human gesture rather than data entry. For example, reaching out to pet a cat is associated with comfort and care, unlike pressing a control on a panel, even if the underlying technical means is identical.
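One way to make a raw sensor reading feel like touch rather than a button press is to low-pass filter it and apply hysteresis, so the device responds to a sustained gesture instead of a momentary blip. The sketch below is illustrative, not prescriptive: the thresholds and smoothing constant are assumptions you would tune for your own sensor, and on an Arduino the raw value would come from `analogRead()`.

```cpp
// Sketch: smoothing a raw analog touch/light reading and detecting the
// start of a "touch" gesture. Thresholds (600/500) and the smoothing
// constant (0.2) are illustrative assumptions to be tuned per sensor.

struct TouchDetector {
    float smoothed = 0.0f;   // exponentially smoothed sensor value
    bool touched = false;    // current debounced touch state

    // Feed one raw reading (e.g. from analogRead, 0..1023).
    // Returns true exactly once, when a new touch begins.
    bool update(int raw) {
        smoothed += 0.2f * (raw - smoothed);   // low-pass filter
        bool now = smoothed > 600.0f;          // turn-on threshold
        if (touched && smoothed > 500.0f)      // hysteresis: release lower
            now = true;
        bool rising = now && !touched;
        touched = now;
        return rising;
    }
};
```

Because the filter needs several consecutive high readings before the threshold is crossed, a single noise spike will not trigger the behavior, while a deliberate pet or press will.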
Similarly, actuation can be kept simple. Even a one-DOF device has an infinite-dimensional space of expression available by using time and trajectory. Even a single hobby servo can produce a variety of gestures if attention is paid to tempo, ictus, and shape.
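To make the tempo-and-shape point concrete, here is one minimal way to shape a single-servo gesture: define it by start angle, end angle, and duration, and apply a smoothstep easing so the motion accelerates and decelerates organically rather than snapping. The function names and parameters are illustrative; on an Arduino the result would be passed to a servo write call.

```cpp
#include <algorithm>

// Smoothstep easing: maps 0..1 to 0..1 with zero velocity at both ends,
// giving a gesture an organic accelerate-decelerate shape.
float ease(float t) {
    t = std::min(std::max(t, 0.0f), 1.0f);
    return t * t * (3.0f - 2.0f * t);
}

// Angle (degrees) at time now_ms for a gesture that began at start_ms
// and sweeps from from_deg to to_deg over duration_ms.
float gestureAngle(unsigned long now_ms, unsigned long start_ms,
                   unsigned long duration_ms, float from_deg, float to_deg) {
    float t = float(now_ms - start_ms) / float(duration_ms);
    return from_deg + (to_deg - from_deg) * ease(t);
}
```

Varying only the duration and the easing curve turns the same two endpoints into a lazy stretch, a quick flinch, or a hesitant nod.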
A useful strategy is the formulation of motion primitives: individual gestures that can be chained together into a performance. These might be programmed as individual subroutine functions, provided there is a means of interrupting them in response to an interaction input.
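The interruption requirement is why the event-loop style matters: a primitive written as a blocking subroutine full of `delay()` calls cannot notice a touch until it finishes. One common alternative, sketched below under assumed names and timings, is a small state machine advanced one step per loop pass. On an Arduino the time argument would come from `millis()`; here it is passed in explicitly so the logic can be exercised off-board.

```cpp
// Sketch: motion primitives as an interruptible, non-blocking state machine.
// Each call to step() does a small amount of work and returns immediately,
// so a sensor event can redirect behavior with low latency.
// Mood names and durations are illustrative assumptions.

enum class Mood { IDLE, PURR, STARTLE };

struct Pet {
    Mood mood = Mood::IDLE;
    unsigned long moodStart = 0;   // when the current primitive began

    // Called once per event-loop pass; `touched` is the sensor input.
    void step(unsigned long now, bool touched) {
        if (touched && mood != Mood::PURR) {   // interrupt whatever is running
            mood = Mood::PURR;
            moodStart = now;
            return;
        }
        switch (mood) {
        case Mood::PURR:                       // purr for 3 s, then settle
            if (now - moodStart > 3000) { mood = Mood::IDLE; moodStart = now; }
            break;
        case Mood::STARTLE:                    // brief flinch, then settle
            if (now - moodStart > 500) { mood = Mood::IDLE; moodStart = now; }
            break;
        case Mood::IDLE:
        default:
            break;
        }
    }
};
```

Each mood would drive its own gesture trajectory inside the loop; because `step()` never blocks, the transition out of any primitive happens within one loop pass of the triggering touch.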
The behavioral properties we are exploring are shared by numerous artworks, toys, and therapy devices. Most of these are considerably more complex than time allows, so the challenge is abstracting the essence of responsive behavior into a simple form. A few examples:
- PARO
- PLEO
- companion pets
- Tickle Me Elmo
- My Real Baby, a robot doll by iRobot and Hasbro
- Helpless Robot
- Rock Hard
Criteria
- The key behavioral requirement is that the interaction feel responsive: the performance should change quickly but coherently after a human touch or gesture.
- You may use either the hobby servos or DC motors, but getting experience with the DC motors is recommended.
- The sensor system must be coherently integrated with the form rather than presented as a user interface. Even a one-switch interface is fine, but it should be incorporated into an interaction with the device rather than presented as a control. This is less of an issue for the non-contact sensors (sonar or light) as we don’t have established user-interface conventions for them.
- The devices must actually use the sensing to control behavior; no time-based simulation.
- The behavior must be able to continue indefinitely; no one-shot exchanges.
- The use of glue should be limited; I generally want to see sturdy construction that can be disassembled and repaired.
- Other rules are the same as before: live demo in class; cite any sources; properly embed video; make links active; properly format inline code.