Assignment 4: Burning Down the House

I have to admit, I kinda thought this particular solution was a little cliché at first, but then just this week I accidentally left a burner on low and walked away for an hour. Luckily nothing got too badly damaged, but I’ve gained a new respect for practical solutions to everyday problems.

Here’s the plan:

Attach a sensor to the knob to know when it’s not in the off position. This could even be a simple switch (today we’re using a potentiometer in case we someday want to know how high the burner is set).

From there we add a sensor to tell when the cook has walked away. We don’t really want a visual indicator that’s always on when the burner is on, or we’ll learn to ignore it. Today I’m using the HC-SR04 provided in class.

From there it’s just a matter of selecting a timeframe, and an indicator. For the purposes of the demo, we’ll use 5 seconds, but in real life something like 5 minutes is probably about right. For an indicator, I’ve gone with a red LED for the demo, but perhaps a text message or IFTTT notification on my watch would be more practical long term.
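The plan above amounts to a small state machine. A minimal sketch of just the transition logic, with the Arduino pin handling omitted (state names, function name, and all threshold values are placeholders for the demo, not measured values):

```cpp
#include <cassert>

// Burner-watch states for the plan described above.
enum BurnerState { IDLE, BURNER_ON_ATTENDED, BURNER_ON_UNATTENDED, ALERT };

// Placeholder thresholds: pot reading above which the knob counts as "on",
// HC-SR04 distance (cm) beyond which the cook counts as "away", and how
// long (ms) the cook may be away before alerting (5 s for the demo).
const int  POT_ON_THRESHOLD = 50;
const long AWAY_DISTANCE_CM = 100;
const long AWAY_TIMEOUT_MS  = 5000;

// Pure transition function: given the current state, the sensor readings,
// and how long the cook has been away, return the next state.
BurnerState nextState(BurnerState s, int potValue, long distanceCm, long awayMs) {
  bool burnerOn = potValue > POT_ON_THRESHOLD;
  bool cookAway = distanceCm > AWAY_DISTANCE_CM;
  if (!burnerOn) return IDLE;                   // knob off: nothing to watch
  if (!cookAway) return BURNER_ON_ATTENDED;     // cook present: away-timer resets
  if (awayMs >= AWAY_TIMEOUT_MS) return ALERT;  // away too long: light the red LED
  return BURNER_ON_UNATTENDED;                  // away, but still inside the window
}
```

In the real sketch, `loop()` would feed this function the pot reading, the ultrasonic distance, and a `millis()`-based away timer, and drive the LED whenever the state is `ALERT`.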

Below I’ve laid out the state diagram, wiring of the demo, and a picture of it in action. There’s a link to the zipped code at the bottom.

Let me know what you think!

 

burner.zip

Honk Detector

Problem:

Not being able to hear when another vehicle (or something else) is honking at you, or to tell which direction the honk is coming from.

A General Solution:

A device that visually notifies the driver that someone is honking and shows which direction the honk is coming from.

Proof of Concept:

An Arduino with sound detectors and LEDs (or another visual interface). When a sound above a certain volume is picked up by one of the sound detectors, the corresponding LED lights up. If the same volume (within some tolerance) is picked up by both sound detectors, both LEDs light up. At a more realistic level, there would be detection for every direction around the “vehicle”, which would require more sensors.
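The comparison logic described above can be sketched as a pure function, separate from the Arduino I/O (the function name, threshold, and tolerance values here are placeholders, not calibrated values):

```cpp
#include <cassert>

// Possible honk classifications from the two sound detectors.
enum HonkDirection { NONE, LEFT, RIGHT, BOTH };

// Placeholder values: minimum analog level (0-1023) that counts as a honk,
// and how close the two readings must be to count as "both directions".
const int THRESHOLD = 400;
const int TOLERANCE = 100;

// Classify a honk from the two analog sound-detector readings.
HonkDirection classifyHonk(int leftLevel, int rightLevel) {
  bool leftLoud  = leftLevel  > THRESHOLD;
  bool rightLoud = rightLevel > THRESHOLD;
  // absolute difference without relying on library abs()
  int diff = leftLevel > rightLevel ? leftLevel - rightLevel
                                    : rightLevel - leftLevel;
  if (leftLoud && rightLoud && diff <= TOLERANCE) return BOTH;  // both LEDs
  if (leftLoud && (!rightLoud || leftLevel > rightLevel)) return LEFT;
  if (rightLoud) return RIGHT;
  return NONE;  // below threshold: no LED
}
```

In `loop()`, two `analogRead()` calls would feed this function and the result would select which LED(s) to write HIGH.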

Fritzing Sketch:

The Fritzing sketch shows how the sound detectors would feed input to the Arduino, as well as how the LEDs (in this case) would be connected to the output pins. I could not find a part for the sound detectors in the Fritzing library, so I replaced them with potentiometers: like the sound detectors, they provide analog input, which is more useful than digital input in this scenario. Not pictured: the Arduino would also need to be connected to a battery source.

Proof of Concept Sketches:

The flow of data starts with a car honking at the user’s vehicle. The honk reaches a sound detector, which feeds an analog input into the Arduino; the Arduino decides which direction the sound is coming from and turns on the corresponding LED or visualization to notify the driver that someone is honking at or near them.

Proof of Concept Videos:

Prototype:

Tolerance Study:

Files:

Assignment_4_Final

Loudspeaker Announcements For Those Who Can’t Hear Them – Assignment 4

Premise

Imagine you miss a flight announcement because it was only made over the loudspeaker intercom. Or you’re at a train station and they announce that your train’s platform has changed, but you didn’t hear it. Oftentimes a screen is not available and/or not updated in time, because people assume everyone heard the announcements.

Proposal

I am proposing a program that uses the built-in mic in your phone, laptop, watch, or any smart device with a microphone to do two things:

  1. Recognize when an announcement is happening and let the user know there’s a loud sound.
  2. Use speech recognition to transcribe what is being said onto your screen.

Proof Of Concept

The program starts off by showing the volume level as a ball moving up and down the screen. If the volume surpasses a threshold, the screen turns red to alert the user. The user can then start the speech recognition by touching the screen; touching the screen (or clicking the mouse) again clears the transcription and starts over.
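The interaction described above reduces to a small three-state machine. A sketch of just the transition logic (the actual project is a browser sketch; the state names and function name here are mine, written in C for illustration):

```cpp
#include <cassert>

// Three modes of the interaction: watching the volume ball, the red alert
// screen, and active speech recognition. Names are my own, not the project's.
enum Mode { MONITORING, ALERT, RECOGNIZING };

// Transition function: loud = volume over threshold, touched = screen touch
// or mouse click since the last update.
Mode nextMode(Mode m, bool loud, bool touched) {
  switch (m) {
    case MONITORING:  return loud ? ALERT : MONITORING;     // screen turns red
    case ALERT:       return touched ? RECOGNIZING : ALERT; // touch starts speech rec
    case RECOGNIZING: return RECOGNIZING;  // another touch clears and restarts
  }
  return m;  // unreachable; keeps compilers happy
}
```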

For some reason I could not get the entire program to work in one browser. The speech recognition would only work in Chrome, while the volume meter would work in any browser but Chrome. I think the reason for this is Chrome’s built-in security and blocking of certain kinds of information access, and I was unable to figure out how to get around that. If you have any ideas, please leave a comment… my code and the libraries I used can be found in the zip file below.

galsanea_speechRec

Assignment #4 – Cast Iron

Problem: I cook almost exclusively on cast iron. It takes a while to heat up, but retains that heat super well. There are often times where I have to step away from the skillet for a while to let it cool down. I also moved in with new roommates recently, who promptly touched the handle when it was still scalding hot. It is difficult to tell how hot the pan is without holding the back of your hand up to it; there is a small visual indicator of heat coming off it, but it is minute. The primary feedback is physical. A further issue is our individual heat tolerance: I can take the pan to the sink, wash it, and reseason it when it’s hotter than my roommates can or want to. It is impossible to tell just how hot it is without actually touching it, thermometer excluded.

Chance Assignment 4

Solution: Basically, heat indicators built into the pan. This would allow a user to know at a glance whether the pan is hot, instead of using their hand. This would be accomplished by a thermometer sampling the surface of the pan. Because cast iron pans heat fairly evenly, there is a fairly large margin of error in terms of where it samples from. (No idea how thermometers actually work, so that may be wrong.)

The embedded LEDs range from green to red, fairly common stop-and-go indicators. Red fairly clearly means hot when used on cookware as well. I see the lights lighting up linearly from Not to Hot based on temperature, so only one LED would be lit when the pan is cold, and all of them when it’s Hot.

I think this would at least ease the use of ambiguously hot pans, in a similar way to the audio feedback project by another student from Assignment #2.

Proof of Concept:

Chance Assignment 4

Chance Assignment 4

Fairly straightforward code that lights up LEDs in a row based on how large a value is read from a pot, acting as a thermometer stand-in. I am pretty sure I fried one of my pots, but this does work fine, so hope for that demo in class!

// Chance, Assignment 4. Not really digital states, but approximations of analog ones

#define Serial SerialUSB // SAMD boards route USB serial through SerialUSB

const int buttonPin = 2;  // momentary button (unused in this sketch)
const int greenLED = 13;  // green LED
const int green2LED = 12; // green LED
const int yelLED = 11;    // yellow LED
const int yel2LED = 10;   // yellow LED
const int redLED = 9;     // red LED
const int red2LED = 8;    // red2 LED
const int potPin = A1;    // pot standing in for the thermometer

const float LEDcount = 6.0f; // number of LEDs in the bar
const int startPin = 8;      // lowest LED pin; the bar occupies pins 8-13

void setup() {
  Serial.begin(9600);
  // initialize the LED pins as outputs:
  for (int i = red2LED; i <= greenLED; i++) {
    pinMode(i, OUTPUT);
  }
  // no pinMode needed for the analog input
}

void loop() {
  // read the pot / thermometer stand-in (0-1023)
  int val = analogRead(potPin);
  Serial.println(val);

  // light LED i when the reading clears that LED's share of the 0-1023
  // range, so the bar fills linearly from pin 8 up to pin 13
  for (int i = LEDcount - 1; i >= 0; i--) {
    if (val >= (((float)i / LEDcount) * 1023)) {
      digitalWrite(i + startPin, HIGH);
    } else {
      digitalWrite(i + startPin, LOW);
    }
  }
}

Visualizing the State of Headphones

Problem:

In the not-so-distant future when everyone is walking around with wireless headphones of some sort, how will people know who they can/can’t interact with? In some situations, I’ve tried to get the attention of someone wearing headphones to no avail; yet other times, I have no difficulty. In addition, sometimes people wear headphones because they don’t want to be bothered, while other times it’s simply a matter of wanting to be able to listen to music or podcasts.

Proposed solution:

I propose implementing a visual system (using LEDs on the side of the headphones) to let others know whether or not the headphone user can/should be bothered. In the example that I made, a simple green LED indicates a low volume of music and no light indicates that loud music is playing. I think that this simple system has the potential to be a universally adopted method to differentiate the states of peoples’ headphones.

Proof of Concept:

I created a relatively simple Arduino circuit to prototype this interaction. It consists of a sound sensor as an input and a green LED as an output device. When the sound sensor reads above a certain threshold, the LED turns off—indicating that a user is listening to loud music and doesn’t want to be bothered. In practice, this was actually a bit harder to do, and I needed to implement a smoothing function to account for the variability of music as well as the specific sound sensor I was using.
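A minimal version of the smoothing mentioned here is a moving average over the last few sensor readings, so a single loud beat doesn't flip the LED. (The window size and function name are my own choices, not from the original sketch.)

```cpp
#include <cassert>

// Moving average over the last N sound-sensor readings.
// N = 8 is an arbitrary placeholder window size.
const int N = 8;
int samples[N] = {0};  // ring buffer of recent readings
int idx = 0;           // next slot to overwrite
long total = 0;        // running sum of the buffer

// Push a new analog reading (0-1023) and return the smoothed value;
// the LED threshold comparison would use this instead of the raw reading.
int smooth(int reading) {
  total -= samples[idx];   // drop the oldest sample from the sum
  samples[idx] = reading;  // store the new one
  total += reading;
  idx = (idx + 1) % N;     // advance the ring buffer
  return (int)(total / N);
}
```

Note the average starts low while the buffer fills with zeros, which also conveniently suppresses a false "loud" reading at power-up.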

Arduino sketch, Fritzing, and video demonstration

Assignment 04: Visualizing Door Bell While Taking A Shower

Problem: 

Even though I do not have any problems with my hearing, I become temporarily deaf in some situations. For example, when I am listening to music through my headphones (especially ones with noise canceling), I cannot even hear my friend talking right beside me or my phone ringing. When I am taking a shower, I cannot hear someone pressing the doorbell. I decided to focus on these problems.

 

General Solutions: 

I tried to sketch some similar situations and decided to design a system that could visualize sound in many ways. For example, I came up with using motors to change shapes, or using vibrators to show the movement of water ripples (which could also be a type of visualization). Other than using lights, there would be various ways to visualize the sounds. I decided to focus on the situation where I cannot hear the bell ringing while taking a shower. For this situation, I realized that using lights would be the best way, because there are not many other channels a user can pay attention to while taking a shower.

 

Proof of Concept:

The concept is pretty simple. When the bell is pushed, the intensity of the bathroom light changes slightly, so that a user can notice the change easily. I tried not to use additional LEDs or lights inside the bathroom because they would be burdensome and raise water-related problems. I designed the light to change slightly in specific patterns, so that a user won’t misperceive the change as something wrong with the bathroom lighting.
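One way to sketch the "slight change in a specific pattern" idea is as a function from time-since-bell-press to a PWM brightness level (all levels and timings here are placeholder guesses, not the values from the demo):

```cpp
#include <cassert>

// PWM duty cycles for the bathroom light: dip noticeably but don't flash
// fully off, so the change reads as a signal rather than a fault.
const int NORMAL = 255;  // full brightness
const int DIMMED = 180;  // placeholder "noticeable dip" level

// Given milliseconds since the bell was pressed, return the duty cycle to
// send with analogWrite(). The pattern is two identical 1 s cycles (dim for
// 400 ms, restore for 600 ms), then the light returns to normal.
int bellBrightness(long msSinceBell) {
  if (msSinceBell < 0 || msSinceBell >= 2000) return NORMAL;
  long phase = msSinceBell % 1000;
  return (phase < 400) ? DIMMED : NORMAL;
}
```

A deliberate double dip like this is hard to mistake for a flickering bulb, which is the misperception the design is trying to avoid.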

 

Arduino Code & Demo Video:

Jay_Assignment_04

I tried to use P5 JS Serial Control to visualize on my computer, but it didn’t work well. 🙁 I need some help to solve the problem! (and I hope to learn JS as well.)

 

Assignment 4: Did I leave that state machine on?

State Machine(s): Kitchen Appliances

Problem: How many times have you made dinner one night and then woken up to a warmer apartment and/or kitchen the next day? Maybe a slight odor of gas? I can distinctly remember doing this twice… because I forgot to turn the stove off.

My electric stove has a knob to turn it on and specify the level of heat. Underneath the knob is supposed to be a helpful indicator light that reminds the cook that the stove is on. Finally, there is an oven timer the cook can set to remind them to take their food off the stove. While the indicator light is a great idea, it does not help a whole lot when the cook is away from the stove. Along with the slight buzz of the stove being on and the heat radiating off it, all of the stove’s feedback is useful for a cook at the stove; however, it is less than useless for a cook who has left the kitchen.

General solution: In the smart house of five years from now, doorways could house monitoring systems for various appliances or systems that would ensure that someone walking around the house can see if things in other rooms are left on.

Proof of Concept: I wired up a potentiometer to represent a stove or oven knob and an LCD screen (that includes a potentiometer) to represent a monitoring system to the microcontroller board. Essentially, the LCD screen reads out an “on” or “off” state of whatever appliance is connected to it. This system could allow for multiple appliances to be held accountable by the monitoring system; it also could allow for different users/homes to set preferences for various alerts/notifications for various states (flashing screen, no text when things are off, etc.).
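The LCD read-out logic reduces to a tiny mapping from the knob reading to a status string (the threshold and function name are assumptions for illustration, not from the original sketch):

```cpp
#include <cassert>
#include <cstring>

// Knob readings above this count as "on". A small non-zero threshold is a
// guess at where the knob leaves its off detent, allowing for ADC noise.
const int OFF_THRESHOLD = 20;

// Map the appliance knob's analog reading (0-1023) to the string the LCD
// monitoring system displays.
const char* applianceStatus(int knobValue) {
  return knobValue > OFF_THRESHOLD ? "ON" : "OFF";
}
```

With multiple appliances, `loop()` would call this once per input pin and print one line per appliance; per-home alert preferences (flashing, blanking when off) would layer on top of the same mapping.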

Assignment 4 Files: Fritzing Diagram, Arduino Sketches, Prototype Video

 

Assignment 4: Communicate Changes in State

Use visual information to communicate a change in state

We discussed examples in class of how a visual indicator can show a change in state. My microwave and washing machine both make noises when they have completed an action and are ready for me to respond. How could I be visually notified of a state change if I had hearing limitations, or if I were on the wrong floor and couldn’t hear the notifications with normal hearing?

Class notes: 12 September, 2019

State machine transitions

Documenting a state machine: Omnigraffle (Mac) vs. SmartDraw (Win10) vs. ??? (Linux) vs. whiteboard. One nice thing about mobile phones is that you can now do “save as” on a whiteboard simply by taking a picture.

Finite state machine

A finite state machine (FSM) needs states, transitions, and actions, which are the operations a transition triggers outside of the state machine. We use the word “finite” to indicate that there are only a certain number of states in a machine, and that the machine can only be in a single state at a time.

This is a valid FSM:

automobile_door:  open_unlocked, closed_locked, closed_unlocked

This is an invalid FSM:

automobile_door:  open, closed, locked, unlocked

Ask yourself why one is valid and how the other could be invalid.
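One answer, sketched in code: the valid FSM's states are mutually exclusive, so an enum can represent them directly, and the machine is always in exactly one of them. (The transition function is my own illustration, not from the notes.)

```cpp
#include <cassert>

// The valid FSM combines door position and lock into single exclusive
// states. In the invalid version, "open" and "unlocked" could both hold at
// once, so the machine would be in two states simultaneously.
enum AutomobileDoor { OPEN_UNLOCKED, CLOSED_LOCKED, CLOSED_UNLOCKED };

// One example transition: locking is only possible when the door is closed;
// any other state is a no-op.
AutomobileDoor lockDoor(AutomobileDoor s) {
  return (s == CLOSED_UNLOCKED) ? CLOSED_LOCKED : s;
}
```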

Multiple FSMs

Say you have two state machines for two actors in a game, “Barney” and “The Monster”. Each will have its own set of states, but a change in one (The Monster goes to “is_visible”) sends a signal to Barney to change to the state “on_patrol” so it is looking for The Monster.

Where do Barney and The Monster exist? They could be child state machines of The Encounter Room, which has states of lights_on, lights_off, emergency_alert, and fire_sprinklers_activated. Barney and The Monster can have transitions that also notify The Encounter Room and allow it to make decisions about changing its own state or sending state-change signals to Barney and The Monster.

How can we define an “action” and not a “transition”?

Movie transitions that also add to the plot:

Alerts of state changes that let you know an otherwise undetectable FSM has changed states:

  • microwave ding when it’s finished heating
  • countdown timer to start an event (from waiting -> running)
  • RFID EZPass validation light
  • elevator alerts for current floor and direction

When can visual interaction replace sound or motion?

  • baby monitor that translates sound to video
  • GFCI lights that indicate status of interrupt
  • replace sound warning with video flash, Mac Terminal

What sounds are important in FSM and what sounds are simply decorations?

Turning keypress sounds on and off on phones and keyboards

Dialer tones (DTMF) in response to pressing buttons on a mobile that doesn’t use DTMF

a range of car horns for different listeners

faked car sounds to impress the driver and passengers

Near Future Tech for Accessibility

There are a bunch of interesting applications of sensing with machine learning, but the most interesting to me is at 20 minutes in.

I have a friend with face blindness, and it has at times led to some awkward interactions when she sees people out of context. This would really help her (once they shrink it down a bit):

Of course we’re not really focusing on machine learning applications in this class, but the idea here is using sensing to address a need for accessibility, which can have additional implications for a broader audience as well.