Class Notes: 19 September, 2019, reading and listening assignments

Story of Your Life, by Ted Chiang.  A short story about an alien language, told using only the alphabet.  The movie based on this story, “Arrival”, is the visual version of a foreign alphabet, “Armied up” so it has more tension.  Christopher Wolfram shows how he created the language for the movie, with links to his code (it’s long and probably boring if you don’t think in Mathematica).  A much shorter, more philosophical piece uses “Arrival” as the basis for a question about the meaning of language.

Two listening pieces, one based on “Deafspace” architecture, the other on wayfinding (which we haven’t really discussed in class yet).

Deafspace

Walk This Way

 

 

 

Class Notes: 17 September, 2019

Visualization of Sound and Language

Visualization of sound by audio frequencies

Visualization of simple tones

Visualizing music in real time using sound

Visualizing music using sheet music

Posters visualizing songs based on the sheet music for the song.

Bach’s Cello Suite No. 1 as a poster, as sheet music, and performed by Yo-Yo Ma.  Compare the visualization of the poster with the instructions in the sheet music.

Visualization of language

Visual ASL dictionary with full demonstrations of each word.

Written ASL (which we did not discuss in class), a notation system of translating ASL hand motions to marks on paper.

Compare ASL visualization of words to Braille’s visualization of letters and word components.  Note that the shapes of the letters in Braille do not map to the shape of the letters used in print.

Class notes: 12 September, 2019

State machine transitions

Documenting a state machine: Omnigraffle (mac) vs. SmartDraw (win10) vs. ??? (linux) vs. whiteboard.  One nice thing about mobile phones is that you can now do “save as” on a whiteboard simply by taking a picture.

Finite state machine

A finite state machine (FSM) needs states, transitions, and actions, which are operations that reach outside of the state machine.  We use the word “finite” to mean that a machine has only a certain number of states and can only be in one state at a time.

This is a valid FSM:

automobile_door:  open_unlocked, closed_locked, closed_unlocked

This is an invalid FSM:

automobile_door:  open, closed, locked, unlocked

Ask yourself why one is valid and how the other could be invalid.
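The valid machine above can be sketched in a few lines of code.  This is a minimal, illustrative JavaScript version (the `transitions` table and `step` helper are my own naming, not anything shown in class):

```javascript
// A minimal sketch of the automobile_door FSM.
// The three states are mutually exclusive, which is what makes the first
// list valid; "open, closed, locked, unlocked" are not mutually exclusive,
// since a door can be both closed and locked at the same time.

const transitions = {
  open_unlocked:   { close: "closed_unlocked" },
  closed_unlocked: { open: "open_unlocked", lock: "closed_locked" },
  closed_locked:   { unlock: "closed_unlocked" },
};

function step(state, event) {
  const next = transitions[state]?.[event];
  // Ignore events that have no transition from the current state.
  return next === undefined ? state : next;
}

let door = "closed_locked";
door = step(door, "unlock"); // closed_unlocked
door = step(door, "open");   // open_unlocked
door = step(door, "lock");   // no transition from open_unlocked; unchanged
```

Because the machine is just a lookup table, adding a state or transition is a one-line change, and there is no way to be in two states at once.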

Multiple FSMs

Say you have two state machines for two actors in a game, “Barney” and “The Monster”.  Each will have its own set of states, but a change in one state (The Monster goes to “is_visible”) sends a signal to Barney to change to the state “on_patrol” so it is looking for The Monster.

Where do Barney and The Monster exist?  They could be child state machines of The Encounter Room, which has states of lights_on, lights_off, emergency_alert, and fire_sprinklers_activated.  Barney and The Monster can have transitions that also notify The Encounter Room and allow it to make decisions on changing its own state or sending state-change messages to Barney and The Monster.
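One way to wire this up is to let each machine publish its transitions to listeners.  This is a hypothetical sketch, not a required design; the `Machine` class and its method names are my own:

```javascript
// Two communicating FSMs, using the Barney / Monster example.
// Each machine owns its state; a transition in one can notify others
// (a sibling machine, or a parent like The Encounter Room).

class Machine {
  constructor(name, initial) {
    this.name = name;
    this.state = initial;
    this.listeners = [];
  }
  onChange(fn) { this.listeners.push(fn); }
  setState(next) {
    const prev = this.state;
    this.state = next;
    // Notify anyone who registered interest in this machine's transitions.
    this.listeners.forEach((fn) => fn(this.name, prev, next));
  }
}

const monster = new Machine("The Monster", "hidden");
const barney  = new Machine("Barney", "idle");

// The Monster becoming visible sends a signal that puts Barney on patrol.
monster.onChange((name, prev, next) => {
  if (next === "is_visible") barney.setState("on_patrol");
});

monster.setState("is_visible");
// barney.state is now "on_patrol"
```

A parent machine such as The Encounter Room would just be another listener on both children, free to change its own state or push new states back down.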

How can we define an “action” and not a “transition”?

Movie transitions that also add to the plot:

Alerts of state changes that let you know an otherwise undetectable FSM has changed states:

  • microwave ding when it’s finished heating
  • countdown timer to start an event (from waiting -> running)
  • RFID EZPass validation light
  • elevator alerts for current floor and direction

When can visual interaction replace sound or motion?

  • baby monitor that translates sound to video
  • GFCI lights that indicate status of interrupt
  • replace sound warning with video flash, Mac Terminal

What sounds are important in FSM and what sounds are simply decorations?

Turning keypress sounds on and off on phones and keyboards

Dialer tones (DTMF) in response to pressing buttons on a mobile that doesn’t use DTMF

a range of car horns for different listeners

faked car sounds to impress the driver and passengers

Class Notes: 10 September, 2019

State machines in our world

Reactive vs. interactive state machines

Reactive state machines

  • Egg timer – set time, start, stop, reset
  • Arcade game – attract / play / reward
  • Advanced arcade games that change what they offer or charge for a game
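The egg timer is small enough to write out as a reactive FSM.  A minimal sketch (the API names here are illustrative, assuming a one-second `tick()` driven from outside):

```javascript
// The egg timer as a reactive FSM: set time, start, stop, reset.
// States: "idle" -> "running" -> "done"; events drive every transition.

function eggTimer() {
  let state = "idle";
  let remaining = 0; // seconds
  return {
    get state() { return state; },
    get remaining() { return remaining; },
    setTime(seconds) { if (state === "idle") remaining = seconds; },
    start() { if (state === "idle" && remaining > 0) state = "running"; },
    stop() { if (state === "running") state = "idle"; },
    tick() { // called once per second by a clock outside the machine
      if (state === "running" && --remaining === 0) state = "done"; // ding!
    },
    reset() { state = "idle"; remaining = 0; },
  };
}

const t = eggTimer();
t.setTime(2);
t.start();
t.tick(); // 1 second left, still running
t.tick(); // 0 left -> "done"
```

Note that the machine is purely reactive: nothing happens unless an event (a button press or a clock tick) arrives from outside.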

Interactive state machines

Student examples in class (post a comment if I forgot your example):

  • Murphy bed attached to lights for getting up in the morning
  • microwave that can adjust cooking time based on humidity
  • automatic systems that open blinds when sun is out to save money on lighting

Discuss how and why of state changes

What if state machines were smart(er)?  What are predictive state machines?

  • auto-aiming / target tracking — not military, self-driving cars and obstacle avoidance
  • catch a Frisbee — dogs know calculus!
  • When you order pizza you usually order beer; could the system predict that?
  • You’re running low on gas, stop to refuel/recharge before you go to the grocery store

  • When you go from home to studio, your default playlists change

Visual changes in state:
  • Doppler shift in nature
  • lives / health in a video game
  • traffic slowdowns in Waze
  • intensity of light => time of day

Mini-assignment for Thursday

– Install IFTTT, look at existing apps
– Post some interesting state machines to Looking Outward

Class Notes: 3 September, 2019

Visual display of information

The fewer kinds and the smaller the amount of visual feedback you give, the better.

A clock on the classroom wall needs hours and minutes, but does it need seconds? Days?  Months?  Does your monthly wall calendar have entries for the time of day?  How about the day calendar on your desk?

Fundamental types of visual information include:

  • color
  • motion
  • intensity
  • type of display: LED, projection on a wall, display on a screen

Complex types of visual state are based on the fundamental types:
  • typeface
  • language
  • icons
  • images

Visual skeuomorphism — a look that contains nonfunctional design cues, like a calendar application that looks like a paper wall calendar.

Why they are called “radio buttons”.

A soda machine that just dispensed a drink:

Icons mean different things in different cultures, does your car have email and bacon?

Class Notes: 3 September, 2019

Class stuff

Alternative Technologies Maker’s Fair

My office hours this semester will be Thursdays, 4:30-6:30pm in Hunt A10.  If you need to meet at some other time we can probably work out a Skype call.

Setting up the SparkFun RedBoard Turbo

Installation instructions

Official Sparkfun hookup guide.

If you have a working board, please read the details of how this is an improvement on the Arduino.

Debugging

Test things in this order.

  1.  Do you have a good USB cable?  If you plug in a USB cable and the RedBoard LEDs don’t light up, it’s a bad cable.  In my studio I just tried 8 cables and 1 did not light up the RedBoard unless I forced it into an odd position.  If the cable connection isn’t super-snug between the cable and your RedBoard, try other cables.  My good cables lit up all of the RedBoards, including ones that didn’t work in class.
  2. Reboot your system after installing the SparkFun updates to the Arduino IDE.
  3. Can your laptop “see” the RedBoard over USB?   This is a bit trickier, but on Win10 the Arduino IDE should show “SparkFun RedBoard Turbo” in the text on the list of ports.
  4. If you’re getting a lot of red text that looks like compilation errors, try compiling the script in the installation instructions.  It should sequence the LEDs near the USB connector.
  5. Serial() doesn’t work on the Turbo; use SerialUSB() instead.

Setting up p5.js / serial

There are two steps here.

First, set up p5.js using these instructions.  Don’t use the web editor, download the p5.js package and use a local editor.  Run a few “Hello World” sketches to verify that things are working correctly.

Second, install p5.serialcontrol.  There’s a set of instructions at ITP for making the connection between p5.js and the Arduino, but if it doesn’t work for you on the first go, don’t waste a lot of time trying to sort it out.  We’ll do that on Thursday at the start of class.

Class Notes: 29 August, 2019

Physical Interaction History

We have a history of interacting with things we don’t understand, starting with keeping animals and early science.  As new technology is developed we find new, unintended uses and create new arts and sciences.

  • Beekeepers – explain the complexity of a hive and how we’re only just now (past 20 years) discovering how bees vote to make hive-wide decisions
  • Alchemists – trying to make things happen with substances they don’t understand.  If you don’t know about elements, and that lead and gold are elements, what decisions are you making when you interact with substances?
  • New music and dance styles based on evolution of technology.  Electric guitars led to massive rock concerts and new methods of performance.  Early hip-hop was created with record players and mics.  Early synths were insanely expensive and shared at fancy studios; cheaper samplers and drum machines led to hip-hop, techno, house, ambient, etc.  See Ishkur’s Guide to Electronic Music.
  • Console games with physical interaction, Dance Dance Revolution
  • Driving assistance with vehicles: AI? Interaction? What if the car won’t let me turn, start, or stop?
  • Flight assistance software in commercial aircraft. Is autopilot interactive?
  • Flight assistance for military aircraft: self-guiding drones, incoming missile warnings for helicopter pilots. Interaction or reaction?

Near-future Interaction

The focus of this class – we’re prototyping for five years out

Think about interacting with intelligent systems that we don’t completely understand and that can make decisions against our will or with results we don’t like:

  • car that thinks you’re too intoxicated, sleepy, or incompetent to drive
  • home automation system that won’t open the doors and let you leave because the particulate count in the air is hazardous
  • police equipment that won’t let you fire at unarmed civilians
  • fire engines that won’t engage fires that cannot be contained
  • entertainment systems that can filter content as part of mental-health care
  • pop culture’s favorite near future: MCU movies are always set a few years from now

Learning from pop culture

Blade Runner 2049 features practical effects used as input devices to imagined systems

Adam Savage gets a tour of the Blade Runner 2049 prop room, no spoilers

Industrial design from the 50s has interaction design in the kitchen but it’s marketing fantasy to build the corporate brand:  Design for Dreaming

Good drama is about storytelling. What if interactive things are part of the story?
2001: HAL 9000
ST:TNG: DATA — a walking mobile phone smarter than spacecraft computers?  What if all the spaceships were as smart as DATA?
Colossus: The Forbin Project (1970) our super computer meets the Soviet supercomputer and they plan our future
War Games (1983) — would you like to play a game? the only way to win is to not play the game.
Terminator movies don’t count — killing spree, not interaction
Farscape — Moya is a semi-intelligent, organic spacecraft and agrees with commands as she wishes.  (IMHO This is one of the best SF TV shows ever, really worth watching over winter break.)
Alien — MOTHER, a semi-intelligent computer that has its own direction

Class Notes: 27 August, 2019

Class Theme – “Accessibility”

We’re looking at how we can make physical things more accessible.
We do this by improving living spaces, and with them the human condition, through tangible interaction design.

Nathan Shedroff’s list of interaction components.

  • Assist
  • Enhance
  • Improve
  • Qualify
  • Sense

Introduction to Tangible Interaction

Reaction vs. Interaction

Classic thermostat (temperature sensor and on/off switch) vs. smart thermostat (PID controller https://en.wikipedia.org/wiki/PID_controller or AI google hive mind)
Tangent: explain how PID is different from a sensor + relay
Explain how machine learning is different from PID
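The difference between a sensor + relay and a PID controller fits in a few lines.  This is a rough sketch for discussion, not a tuned controller; the gain values are made up:

```javascript
// Sensor + relay ("bang-bang") thermostat vs. a PID controller.
// The relay is all-or-nothing; PID scales its output using the error (P),
// the accumulated error (I), and the error's rate of change (D).

function bangBang(setpoint, temp) {
  return temp < setpoint ? 1 : 0; // heater fully on or fully off
}

function makePid(kp, ki, kd) {
  let integral = 0;
  let prevError = 0;
  return function pid(setpoint, temp, dt) {
    const error = setpoint - temp;
    integral += error * dt;                      // I: accumulated error
    const derivative = (error - prevError) / dt; // D: rate of change
    prevError = error;
    return kp * error + ki * integral + kd * derivative;
  };
}

const pid = makePid(0.5, 0.1, 0.05); // illustrative gains
// The relay slams between 0 and 1 and tends to overshoot;
// the PID output shrinks smoothly as temp approaches the setpoint.
```

Machine learning is different again: instead of a fixed control law, the controller’s behavior comes from patterns learned from history (occupancy, schedules, weather), which is what the “smart thermostat” questions below are probing.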

Questions:

  • What if we had a smart (AI) thermostat?
  • change heating/cooling controls based on history
  • change temperature related to outside environment
  • react to weather changes
  • modulate temp based on who is in the house: I like it warm, spouse likes it cold
  • modulate temp based on predicted activities: “they always stay up late on Friday”
  • error control: “never let the house go below 50F” to prevent pipes from freezing

Short History of Tangible and Interaction Design

Physical computing and tangible interaction design are recently created fields, but there is a history of how we got here.  The key point is the size (scale) of computing hardware.

Water powered tools and windmills
beginning of the PID idea, centrifugal governors to maintain speed in grain mills

Industrial revolution
early punch-card computing
steam engines that can react to malfunctions
sophisticated PID for steam engines

Transistors as second industrial revolution
first computers that didn’t fill buildings

Beginning of HCI
1976: first use of “human-computer interaction” in a published paper
1983: The Psychology of Human-Computer Interaction brings the concept to the general computing community

Early arcade games with haptic outputs/feedback (helicopter game that shakes when you are shot)

Modems and broadband access
Interaction moves from an isolated software package to a network of software packages
People can interact with other people at distant locations
People can interact with systems
Still no physical interaction

Mobile phones
contains sensors, CPU, network access
has output in the form of image, sound, and vibration

Arduino
first affordable, usable embedded controller
opened up a market of input/output hardware
set the space for Rpi, BBB, etc

Five years from now
where we’re thinking in this class

What can we do in this class?
Study physical computing and interaction
Look at near future concepts
Design, build, and demonstrate physically interactive devices and systems