Alarm Bed 2.0

Problem: Waking up is hard. Lights don’t work. Alarms don’t work. Being yelled at doesn’t work. You have to be moved to be woken up. But what if no one is there to shake you awake?

Target Audience: Deaf and hearing-impaired children. Proper sleep is a habit that needs to be taught from a young age to lead to healthier lifestyles later in life. Children with hearing impairments face an extra barrier to learning how and when to sleep because they miss some of the important audio cues that trigger and aid sleep. First, there is a high level of sleep insecurity among children with hearing impairments. Children who can hear are usually comforted by their parents’ voices during a bedtime story or by the normal sounds around the house; children who cannot hear do not have that luxury. They face total silence and darkness alone, which is a scary thing for people of all ages. In addition, some of these children use hearing aids during the day, which makes the contrast even sharper and more disturbing: some noise throughout the day, then total silence at night.

General solution: A vibrating bed that takes in various sources of available data and inputs to get you out of bed. Everyone has different sleep habits and different life demands, so depending on why you are being woken up, the bed will shake in a certain way. How?

  • Continuous data streams
    • Google Calendar: if a child’s parents keep an accurate calendar and program certain morning routines like cleaning up and eating breakfast, the bed could learn when it should wake the child for school
      • Can make this decision depending on traffic patterns/weather/other people’s events (kids, friends, etc.)
      • This process could also teach kids how to plan their mornings and establish a routine.
    • Sleep data: a lot of research has been done on sleep cycles, and various pieces of technology can track biological data like heart rate and REM stage; the bed could learn a sleeper’s particular patterns over time and wake them up at an optimal point within their sleep cycle
  • Situational
    • High Frequency noises: if a fire alarm or security alarm goes off, a child that cannot hear would usually be forced to wait for their parents to grab them and go. This feature could wake them up sooner and help expedite a potential evacuation process
    • “Kitchen wake-up button”: Kids do not always follow directions… so here, a parent can tap a button in a different room to shake the bed without having to go into their child’s room
      • Button system also has a status LED that shows
        • Off = not in bed
        • On = in bed
        • Flashing = in bed, but alarm activated
  • Interacting with User
    • Snooze: if sleeper hits a button next to their bed three times in a row, then the alarm will turn off and not turn back on
    • Insecure Sleep Aid: if the bed senses that a child is tossing and turning for a certain amount of time as they get into bed, then it can lightly rumble to simulate rubbing a child’s back or “physical white noise feedback”
    • Parents: if your kid gets out of bed, you can have your bed shake as well if this is linked throughout the house
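The status-LED rules above collapse into a tiny state function. A minimal sketch in Python (function and state names are my own; the real version would run on the bed's microcontroller):

```python
def led_status(in_bed: bool, alarm_active: bool) -> str:
    """Map bed occupancy and alarm state to the kitchen button's LED."""
    if not in_bed:
        return "off"          # off = not in bed
    if alarm_active:
        return "flashing"     # flashing = in bed, alarm activated
    return "on"               # on = in bed, no alarm
```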

Proof of Concept: I connected the following pieces of hardware to create this demo:

  • Transducers: shakes bed at a given frequency
    • One located by the sleeper’s head
    • One located by the sleeper’s feet
  • Potentiometer: represents an audio source
    • The transducer turns on at different intensities depending on where the potentiometer is set
  • Push button 1: represents the “kitchen wake-up button”, works as an interrupt within the program
  • LED: part of the “kitchen wake-up button” that represents the status of the bed
  • Push button 2: represents the “snooze” feature of the bed, where a sleeper can turn off the rumbling to go back to sleep
  • Flex Resistor: represents a sensor in the bed that determines if someone is in the bed or not
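Putting the pieces above together, the demo's control logic can be sketched as one function. This is a simplified Python illustration, not the actual firmware: the real demo runs on a microcontroller, and the names, value ranges, and snooze threshold here are assumptions.

```python
POT_MAX = 1023        # 10-bit analog reading, as on an Arduino (assumption)
SNOOZE_PRESSES = 3    # presses of push button 2 needed to silence the bed

def transducer_intensity(pot_value: int, in_bed: bool,
                         kitchen_pressed: bool, snooze_count: int) -> int:
    """Return a 0-255 drive level for the transducers."""
    if not in_bed or snooze_count >= SNOOZE_PRESSES:
        return 0                           # nobody in bed, or snoozed 3x
    if kitchen_pressed:
        return 255                         # kitchen wake-up overrides all
    return (pot_value * 255) // POT_MAX    # audio level sets the intensity
```

In the hardware demo the kitchen button is an interrupt rather than a polled input, but the priority ordering is the same: presence check first, then the override, then the graded audio response.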

Summary: A device like this could help children (and adults) who cannot hear to feel more secure in their sleep and encourage healthier sleep patterns. One of the biggest potential challenges with the device would be finding a powerful enough motor/transducer that can produce a variety of vibrations across a heavy bed/frame (and the quieter the better, for the rest of the home’s sake). Even with that challenge, though, everyone should have a good night’s sleep and this could be a way to provide that.

Files: KineticsCrit

Crit 2: Kinetic

Due 11:59pm, 28 October.

Combine kinetic inputs, outputs, data, and state machines to create a physically interactive system that changes interaction based on inputs and logic.

The example I gave early this semester was a “doorbell” for someone who cannot hear.

Inputs: doorbell, physical knock, person detector

Interaction: use inputs to determine output.  Doorbell + no person detected means someone rang the bell and walked away, was this a UPS/FedEx delivery?  Knock and person is there, is someone coming to visit?  To sell a product?  “Secret” knock pattern used by friends and a person is there, one of your friends has come to visit.

Output: Create appropriate output for the results of the interaction process.  UPS/FedEx drop off is lower priority than a friend coming for a visit.
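The input-combination logic in the doorbell example could be sketched as a small decision function. Event names and priority labels below are my own, for illustration only:

```python
def classify_visit(bell_rung: bool, knock: str, person_present: bool):
    """Combine the three inputs into a (who, priority) guess."""
    if knock == "secret" and person_present:
        return ("friend", "high")       # secret knock + person: a friend
    if bell_rung and not person_present:
        return ("delivery", "low")      # rang and walked away: UPS/FedEx?
    if person_present:
        return ("visitor", "medium")    # knock or bell + person: visit? sales?
    return ("unknown", "low")
```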

Class notes: 22 October, 2019

Kinetic Input/Output

“The Demo”, showing off the mouse, chord keyboard, and social media.

Accessibility / Inclusion

Microsoft’s Inclusive Design and a PDF copy of their book.

What is accessible?

Are 30mm arcade buttons accessible? Do they simply interrupt or do they provide constant state?  Are the buttons convex or concave?  How high are the guards around the buttons?  If you want to use Universal Design, how do you decide how big the button should be and where it’s located?

What is wrong with the E-Stop button in A10?

  • Unlit
  • Recessed button “hidden” in a guard
  • No signage on the wall like we have with fire extinguishers

Are controls like buttons the wrong answer?  Is better output the way to go?

tactile maps

presentation of research data on tactile map comparison

tactile graphics using “swell paper”

3d printing reference objects for the blind — what does a snowflake look like?  A butterfly?  A sailboat?

Assignment/schedule

Kinetic crit on 29 Oct.  Thursday office hours + Thursday class is a work day.

Assignment 6: Balance Checker For Visually Impaired

Problem

I started with a few problems related to balance, especially for visually impaired people. First, moving a pot with hot soup inside is really dangerous for them because they cannot check whether it is level. Another situation: when building furniture, especially shelves, it is important to maintain horizontal balance to keep things safe.

Balance also matters for sighted people in many situations, for example, when taking a photo.


General Solution

How might we use tactile feedback to let users feel tilt or imbalance? I thought that vibration at varying intensities would be a great way to do it.


Proof of Concept

I decided to use the iPhone for three reasons. First, it can generate a variety of tactile feedback using different patterns and intensities. Second, I found it useful to take advantage of its embedded sensors. Lastly, I thought that building it as an app would make it accessible to many people.

I grouped the tilt angle into ranges that map to different intensities of tactile feedback. When a user tilts the phone 5–20 degrees, it makes a light vibration; from 21–45 degrees, a medium vibration; from 46–80 degrees, an intense vibration; and from 81–90 degrees it vibrates the most intensely (just like when it receives a call). I also mapped the angle to an RGB code to change the screen color accordingly.
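Those thresholds amount to a simple mapping from tilt angle to feedback level. A standalone sketch (the real version runs as an iPhone app driving its haptics; the level names here are placeholders):

```python
def haptic_level(tilt_degrees: float) -> str:
    """Map an absolute tilt angle (0-90 degrees) to a feedback intensity."""
    tilt = abs(tilt_degrees)
    if tilt < 5:
        return "none"       # below the dead zone: no feedback
    if tilt <= 20:
        return "light"
    if tilt <= 45:
        return "medium"
    if tilt <= 80:
        return "intense"
    return "max"            # 81-90 degrees: like an incoming call
```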


Video & Codes

 

Adapting Morse Code for the Blind

Problem:

Morse code is commonly received through either visual or audible feedback; however, this can be challenging for those who are blind, deaf, or both. Additionally, I had next to no experience using hardware interrupts on Arduino, so I wanted to find a good application of interrupts for this assignment.

Proposed Solution:

I wanted to create a system that allows Morse code senders to quickly adapt their messages into signals that people without sight or hearing can understand. To do this, I created two physical button inputs: the first button directly controls an LED (but could easily be a buzzer) that is used to send the Morse code signal; the second button toggles a vibrating motor to buzz in conjunction with the LED. In this way, one can change the message being sent from purely visual to both visual and tactile at any time.
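The two-button logic is small enough to sketch. The real build is an Arduino with a hardware interrupt on the toggle button; this Python illustration uses made-up names and just captures the coupling between the LED and the motor:

```python
class MorseAdapter:
    """Button 1 keys the signal; button 2 toggles tactile output."""

    def __init__(self):
        self.tactile_enabled = False   # flipped by the interrupt button

    def toggle_tactile(self):
        self.tactile_enabled = not self.tactile_enabled

    def outputs(self, signal_pressed: bool):
        """Return (led, motor): the motor follows the LED only when enabled."""
        led = signal_pressed
        motor = signal_pressed and self.tactile_enabled
        return led, motor
```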

Proof of concept:

 

Arduino Code and Fritzing Sketch

 

Assignment 6: Thumper – turning sounds into touch

This project is inspired by the Ubicoustics project here at CMU in the Future Interfaces Group, and by an assignment for my Machine Learning + Sensing class where we taught a model to differentiate between various appliances using recordings made with our phones. This course is taught by Mayank Goel of Smash Lab, and is a great complement to Making Things Interactive.

With these current capabilities in mind, and combining physical feedback, I created a prototype for a system that provides physical feedback (a tap on your wrist) when it hears specific types of sounds, in this case over a certain threshold in an audio frequency band. This could be developed into a more sophisticated system with more tap options, and a machine learning classifier to determine specific signals. Here’s a quick peek.

On the technical side, things are pretty straightforward, but all of the key elements are there. The servo connection is standard and the code right now just looks for any signal from the computer doing the listening to trigger a toggle. The messaging is simple and short to minimize any potential lag.

On the Python side, audio is captured with pyaudio, transformed into the frequency spectrum with scipy signal processing, and then scaled down to 32 frequency bins using openCV (a trick I learned in the ML+S class). Bins 8 and 9 are then watched for crossing a threshold, which is the equivalent of saying: when there’s a spike somewhere around 5 kHz, toggle the motor.
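The detection step can be sketched without the audio capture. The version below is a simplified stand-in, not the project's actual code: plain numpy averaging replaces the cv2 resize, and the sample rate, chunk size, and threshold are assumptions (at 44.1 kHz with 32 bins, bins 8-9 land roughly in the 5.5-7 kHz range).

```python
import numpy as np

SAMPLE_RATE = 44100   # assumed capture rate; the write-up does not state it
CHUNK = 2048          # samples per analysis window (also an assumption)
N_BINS = 32           # coarse frequency bins, as in the write-up
WATCH_BINS = (8, 9)   # watched bins: ~5.5-6.9 kHz at these settings
THRESHOLD = 0.005     # hand-tuned for this sketch, not the original value

def band_triggered(samples: np.ndarray) -> bool:
    """Return True when the watched coarse bins cross the threshold."""
    # Magnitude spectrum of one windowed chunk; numpy averaging stands in
    # for the openCV downscaling used in the actual pipeline.
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    spectrum = spectrum[:-1]                    # drop the Nyquist sample
    bins = spectrum.reshape(N_BINS, -1).mean(axis=1)
    bins /= len(samples)                        # rough amplitude normalization
    return bool((bins[list(WATCH_BINS)] > THRESHOLD).any())
```

In the running system, each chunk from the pyaudio stream would pass through this check, and a True result sends the short toggle message to the Arduino driving the servo.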

With a bit more time and tinkering, a classifier could be trained in scikit-learn to trigger the tap with high accuracy only for certain sounds, say a microwave beeping that it’s done, or a fire alarm.

The system could also be a part of a larger sensor network aware of both real world and virtual events to trigger unique taps for the triggers the user prefers.

 

thumper.zip