Class Notes: 27 August, 2019

Class Theme – “Accessibility”

We’re looking at how we can make physical things more accessible.
The goal is to improve the human condition by using tangible interaction design to improve living spaces.

Nathan Shedroff’s list of interaction components:

  • Assist
  • Enhance
  • Improve
  • Qualify
  • Sense

Introduction to Tangible Interaction

Reaction vs. Interaction

Classic thermostat (temperature sensor and on/off switch) vs. smart thermostat (PID controller or AI Google hive mind)
Tangent: how a PID controller differs from a simple sensor + relay
How machine learning differs from PID
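The tangent above can be sketched in code. This is a minimal, hypothetical comparison (names and gains are illustrative, not from the lecture): a sensor + relay just flips the heater on or off around the setpoint, while a PID controller produces a continuous output from the current error, its accumulated history, and its rate of change.

```python
def bang_bang(temp, setpoint, hysteresis=1.0):
    """Classic thermostat (sensor + relay): heater is simply on or off,
    with a dead band so it doesn't chatter right at the setpoint."""
    if temp < setpoint - hysteresis:
        return True   # heater on
    if temp > setpoint + hysteresis:
        return False  # heater off
    return None       # inside the dead band: keep the previous state

class PID:
    """PID controller: output is Proportional to the current error,
    plus an Integral of past error (removes steady-state offset),
    plus a Derivative term (damps overshoot)."""
    def __init__(self, kp, ki, kd, dt=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, temp, setpoint):
        error = setpoint - temp
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

The difference in kind: bang-bang can only oscillate around the setpoint, while PID can drive a variable output (e.g. a valve or fan speed) smoothly toward it. Machine learning goes one step further and replaces the fixed control law entirely with behavior learned from data.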


  • What if we had a smart (AI) thermostat?
  • change heating/cooling controls based on history
  • change temperature related to outside environment
  • react to weather changes
  • modulate temp based on who is in the house: I like it warm, spouse likes it cold
  • modulate temp based on predicted activities: “they always stay up late on Friday”
  • error control: “never let the house go below 50F” to prevent pipes from freezing
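The bullet points above could be sketched as simple rules. This is a hypothetical illustration, not a real thermostat's logic: the per-person preferences, the away temperature, and the schedule-offset idea are all assumptions; only the 50F safety floor comes from the notes.

```python
SAFETY_FLOOR_F = 50.0  # "never let the house go below 50F" (frozen pipes)

# Assumed per-person setpoints: I like it warm, spouse likes it cold.
PREFERENCES_F = {"me": 72.0, "spouse": 66.0}

def choose_setpoint(occupants, schedule_offset=0.0):
    """Pick a setpoint from who is home, plus a learned schedule offset
    (e.g. "they always stay up late on Friday" -> delay the night setback)."""
    if occupants:
        # Compromise between everyone at home: average their preferences.
        base = sum(PREFERENCES_F[p] for p in occupants) / len(occupants)
    else:
        base = 55.0  # nobody home: save energy
    # Error control: never command a temperature below the safety floor.
    return max(base + schedule_offset, SAFETY_FLOOR_F)
```

The last line is the key "error control" idea: whatever the learned behavior asks for, a hard constraint clamps it to a safe range.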

Short History of Tangible and Interaction Design

Physical computing and tangible interaction design are recently created fields, but there is a history of how we got here. The key point is the size (scale) of computing hardware.

Water powered tools and windmills
beginning of the PID idea: centrifugal governors to maintain speed in grain mills

Industrial revolution
early punch-card computing
steam engines that can react to malfunctions
sophisticated PID for steam engines

Transistors as second industrial revolution
first computers that didn’t fill buildings

Beginning of HCI
1976: first use of “human-computer interaction” in a published paper
1983: The Psychology of Human-Computer Interaction brings the concept to the general computing community

Early arcade games with haptic output/feedback (a helicopter game that shakes when you are shot)

Modems and broadband access
Interaction moves from an isolated software package to a network of software packages
People can interact with other people at distant locations
People can interact with systems
Still no physical interaction

Mobile phones
contains sensors, CPU, network access
has output in the form of image, sound, and vibration

first affordable, usable embedded controller
opened up a market of input/output hardware
set the stage for the Raspberry Pi (RPi), BeagleBone Black (BBB), etc.

Five years from now
the timeframe we’re thinking about in this class

What can we do in this class?
Study physical computing and interaction
Look at near future concepts
Design, build, and demonstrate physically interactive devices and systems
