Visualizing Spaces

Problem:

For those who are blind or have impaired vision, perceiving the quality and form of the spaces they inhabit can be quite difficult (inspired by Daniel Kish’s TED Talk that Ghalya posted in Looking Outward). This idea could apply at various scales, both in helping the visually impaired with wayfinding and in letting them experience the different spaces they occupy.

A General Solution:

A device would scan and process a space using sonar, LIDAR, photography, 3D modeling, etc.; the resulting data would then be mapped onto an interactive surface actuated to represent that space. The user could then understand the space they are in at a larger scale or, at a smaller scale, identify potential tripping hazards as they move through an environment. The device would ideally be able to change scales to address different scenarios. Emergency scenarios would also be programmed into the model so that, in case of fire or danger, the user could find their way out of the space.

Proof of Concept:

An Arduino with potentiometers (ideally sonar or other spatial sensors) acts as the input to control some solenoids, which represent a more extensive network of physical actuators. When a sensor senses a closer distance, the corresponding solenoid pops out, and vice versa. The solenoids can only take digital outputs, but ideally the actuation would be analog so that a more accurate representation of the space could be made. There are also two switches: one represents an emergency button that alerts the user that there is an emergency, and the other represents a routing button (which ideally would be connected to a network, but could also be turned on by the user) that leads the solenoids to create a path out of the space to safety.
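The sensor-to-solenoid mapping and the emergency override described above can be sketched as plain logic, separate from the board. This is an illustrative sketch only: the threshold value, enum names, and function names are assumptions, not the actual assignment code; on the Arduino the reading would come from analogRead() and drive digitalWrite().

```cpp
// Map a 10-bit analog reading (0-1023) from a distance sensor to a
// digital solenoid state: a close obstacle (high reading here) pops
// the solenoid out; a far one retracts it. The threshold is arbitrary.
bool solenoidState(int reading, int threshold = 512) {
    return reading >= threshold;  // extended = obstacle is near
}

// When the emergency switch fires (an interrupt on the Arduino), the
// routing mode overrides the normal spatial display.
enum class Mode { Spatial, EmergencyRoute };

Mode nextMode(Mode current, bool emergencyPressed) {
    return emergencyPressed ? Mode::EmergencyRoute : current;
}
```

Because the solenoids are digital, all of the analog nuance of the sensor collapses into one threshold; a denser actuator array (or analog actuators) would preserve more of the reading.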

Fritzing Sketch:

The Fritzing sketch shows how the proof of concept’s solenoid is wired to a separate power source and set up to receive signals from the Arduino, as well as how all of the input devices are connected to the Arduino to send in data. The transducer for emergencies is represented by a microphone, which has a similar wiring diagram. Not pictured: the Arduino and the battery jack would have to be connected to a battery source.

Proof of Concept Sketches:

The spatial sensor scans the space the user is occupying, which is then actuated into a physical representation and arrayed to create more specificity for the user to touch and perceive. This system would be supplemented by an emergency system that both alerts the user that an emergency is occurring and shows them how to make their way to safety.

Proof of Concept Videos:

Files:

Assignment_6_Final

Assignment 6: Physical Smoke Detector

Problem

Rabbit laser cutters have dark UV-protective paneling to protect users from exposure to bright, potentially vision-damaging light. However, laser-cut pieces can begin smoking and even catch fire. This presents a problem: how can users respond to fire and smoke events?

Solution

A visibility detection system paired with a motor would warn users of an incoming smoke or fire issue by detecting drastic increases or decreases in visibility. The visibility detection system would be placed inside the laser cutter, while the motor would be attached to a wearable device, or placed atop the laser cutter to bump into it repeatedly in different patterns, creating different noises based on the situation and vibrations on the user’s person.

Proof of Concept

A series of light sensors would serve as the detection system. It would sense whether vision was obstructed: too bright, signifying a fire, or too dim, signifying smoke. A solenoid would tap in a slow pattern to signify smoke, and tap in a hurried, frantic pattern to signify fire. The solenoid would either be attached to a wearable device or mounted atop the cutter itself, tapping against the machine to make noise and signal the user to press the emergency stop.
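The classification and tap-pattern logic described above can be sketched as follows. This is a minimal illustration, not the submitted code; the threshold values and tap intervals are assumptions chosen for the example.

```cpp
enum class Alert { None, Smoke, Fire };

// Classify a 10-bit light reading against assumed thresholds:
// too dim means smoke is obscuring the sensor, too bright means fire.
Alert classify(int reading, int dimThreshold = 200, int brightThreshold = 900) {
    if (reading < dimThreshold)    return Alert::Smoke;
    if (reading > brightThreshold) return Alert::Fire;
    return Alert::None;
}

// Tap interval for the solenoid: slow taps for smoke, hurried,
// frantic taps for fire. Intervals (in ms) are illustrative.
int tapIntervalMs(Alert a) {
    switch (a) {
        case Alert::Smoke: return 1000; // slow pattern
        case Alert::Fire:  return 150;  // frantic pattern
        default:           return 0;    // no tapping
    }
}
```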

smokeDetectorlow

Assignment_6 2

Feel and Communicate Through Morse Code

PREMISE

How do you create a universal communication method that can work for everyone, whether they are blind, or deaf, or both? Imagine a universal translation machine that can….

Proposal

To tackle this, I decided to use a tactile way to feel and translate Morse code. This is done through a combination of:

  1. haptic feedback → as someone communicates with you, the translator device vibrates the Morse code pattern to you so you can feel it tactilely. This is ideal not only for someone who cannot see or hear, but also if you want to be extra discreet and not make any noise or visual distractions.
  2. visual feedback → adding to that, the visual feedback provided is twofold: through letter translation and through the blinking of an LED. The letter translation is especially ideal for someone who might not necessarily know Morse code.
  3. audio feedback → finally, audio feedback through the buzzer helps you distinguish by sound whether what you are pressing is a dot (.) or a dash (-). When you press long enough for the device to recognize that it is no longer a dot but a dash, the tone changes.

The hope is that by providing these different modes of feedback, the translator can be more accessible.
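The core of the dot/dash recognition and letter translation can be sketched like this. It is an illustration only: the dash threshold and the (partial) Morse table are assumptions, and the press duration would come from timing the button with millis() on the Arduino.

```cpp
#include <map>
#include <string>

// Assumed cutoff: presses shorter than this read as dots, longer as
// dashes. Real Morse conventions make a dash about 3x a dot length.
const unsigned long DASH_THRESHOLD_MS = 250;

// Classify a single button press by how long it was held.
char classifyPress(unsigned long heldMs) {
    return heldMs < DASH_THRESHOLD_MS ? '.' : '-';
}

// Letter translation from a completed dot/dash sequence.
// Only a small subset of the table is shown for illustration.
char decodeMorse(const std::string& code) {
    static const std::map<std::string, char> table = {
        {".-", 'A'}, {"-...", 'B'}, {"-.-.", 'C'},
        {"...", 'S'}, {"---", 'O'},
    };
    auto it = table.find(code);
    return it != table.end() ? it->second : '?';
}
```

The same classified press can then fan out to all three feedback channels: buzz the tone for a dot or a dash, blink the LED with the matching duration, and vibrate the pattern on the haptic motor.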

Proof Of Concept

 

Fritzing Sketch

MorseCode

Look this Way!

Lip reading is difficult, and a large portion of the deaf community choose not to read lips. On the other hand, lip reading is a way for many deaf people to feel connected to a world they often feel removed from. I have linked a powerful video in which one such lip reader talks about the difficulty of lip reading, but also the pay-off she experiences by being able to interact and connect with anyone she wants.

Lip reading relies on being able to see the lips of the person speaking. When you are interacting with one person, this is not an issue, but what if you’re in a group setting? How do you keep track of who is talking and where to look?

Idea

Using four sound detectors or microphones, detect the direction the sound is coming from. Alert the user of this change in sound by using a servo motor to point in the direction of the sound. This allows people who are hard of hearing to tell who is talking in a group setting and focus on the lips of the person currently speaking.

Proof Of Concept

To demonstrate this idea, I decided to use two sound detectors and a servo motor. My interrupt is a switch that can override the process if, for example, there are too many people talking or the user no longer needs the device.
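With two detectors, the pointing logic reduces to interpolating a servo angle between them based on relative loudness. This is a sketch under assumptions: the function name and the convention that 0° points at detector A and 180° at detector B are made up for illustration; on the Arduino the levels would come from analogRead() and the result would go to Servo::write().

```cpp
// Interpolate a servo angle from two microphone levels.
// Equal levels point straight ahead (90 degrees); a louder B
// pulls the angle toward 180, a louder A toward 0.
int pointingAngle(int levelA, int levelB) {
    int total = levelA + levelB;
    if (total == 0) return 90;       // silence: point straight ahead
    return (180 * levelB) / total;   // weighted toward the louder side
}
```

Scaling to the full four-detector version would mean picking the loudest pair (or doing a weighted average around the circle) before writing the angle.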

Below is a breadboard view of my project.

Fritzing

Assignment6

I am having some issues after updating to Catalina, so videos will come soon!

Alarm Bed

State Machine: Sleeping

Problem: Waking up is hard. Lights don’t work. Alarms don’t work. Being yelled at doesn’t work. You have to be moved to be woken up. But what if no one is there to shake you awake?

General solution: A vibrating bed that takes in various sources of available data and inputs to get you out of bed. Everyone has different sleep habits and different life demands, so depending on why you are being woken up, the bed will shake in a certain way. How?

  • Potential continuous data streams
    • Google Calendar: if an accurate calendar is kept and you can program certain morning routines like cleaning up and eating breakfast, your bed could learn over time when it should wake you up for work or school, depending on traffic patterns, weather, and other people’s events (kids, friends, etc.)
    • Sleep data: lots of research has been done on sleep cycles, and various pieces of technology can track biological data like heart rate and REM stage; your bed could learn your particular patterns over time and wake you up at an optimal point within your sleep cycle
  • Situational data streams
    • High Frequency noises: if a baby cries in the room next door or one of your home’s alarms goes off, your bed could shake in a faster/more violent manner to make sure to get your attention
    • “Kitchen wake-up button”: if one of your roommates or family members won’t get out of bed, you can flip a switch in a different room to shake the bed without having to go into their room

Proof of Concept: I connected the following pieces of hardware to create this demo:

  • Servo motor: represents the shaking bed
  • Potentiometer: represents a timer, as well as higher frequency sounds (main sources of input/data)
    • The motor turns on to different intensities/patterns depending on where the potentiometer is set to
  • Push button: represents the “kitchen wake-up button”, works as an interrupt within the program
  • Slide switch: represents the “off button” for the bed, works as an interrupt within the program
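The demo's decision logic, with the two interrupts overriding the potentiometer-driven pattern, can be sketched as a single function. This is illustrative only: the enum names and threshold values are assumptions, and on the Arduino the two override flags would be set inside interrupt service routines attached to the button and switch pins.

```cpp
enum class Shake { Off, Gentle, Vigorous };

// The potentiometer stands in for both the timer and detected noise;
// the thresholds below are made-up values for the demo.
Shake shakePattern(int potValue, bool kitchenButton, bool offSwitch) {
    if (offSwitch)      return Shake::Off;       // off-switch interrupt wins
    if (kitchenButton)  return Shake::Vigorous;  // kitchen wake-up override
    if (potValue > 700) return Shake::Vigorous;  // e.g. crying baby / alarm
    if (potValue > 300) return Shake::Gentle;    // normal wake-up time
    return Shake::Off;                           // not time to wake up yet
}
```

Note the precedence: the off switch beats everything, and the kitchen button beats the sensor data, which matches the two interrupts in the demo.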

Assignment 6

Interactive Theater

Chance and I went to Bricolage Production Company’s latest creation, Project Amelia, tonight. It’s “a next-level immersive theater experience that invites you to the R&D lab of Aura, one of the world’s most innovative tech giants, to participate in the launch of a groundbreaking intelligence product like no other.” Their words, not mine. It’s a cool take on traditional theater with lots of clever uses of simple interactive devices that we could make for this class. You wear an RFID tag bracelet throughout the show to interact with various games and demos that are aimed at teaching guests about the power of artificial intelligence. The show runs for a few more weeks, so check it out!

Project Amelia

Life of a Blind Girl

https://lifeofablindgirl.com/2018/05/16/21-things-i-couldnt-live-without-as-a-blind-person/

This blog post goes through some current technologies and products that are really helpful from the perspective of someone who is blind. It was really interesting to see the impact and potential of current technology in supporting those who are differently-abled in navigating through their daily lives with more ease and comfort. It’s also helpful to know what’s already out there and being done. I would definitely recommend taking a read through!

Assignment 6: Interrupts and motion

Due Mon night, 11:59 pm

Like assignment 5, give data a visual representation, but look at accessibility for someone without hearing or vision. Use interrupts to generate or modify the information being displayed, or to control the information from another source. Experiment with more than one interrupt happening at the same time.

The example we discussed in class is how would you let a person without hearing know that someone was knocking at the door or ringing the doorbell?

Class Notes: 10 October, 2019

Input classification, serial communication, interrupts

Types of Input

monophonic + skill: wind instruments, percussion

polyphonic + tech: keyboards, pianos, organs, strings
anthropomorphic: respond to the human condition (heart rate, blood pressure, galvanic skin response, breath rate, pulse rate) and visual interpretation of secondary movements (eye twitch, touching your face, blinking)

Golan Levin, “Opto-Isolator”

Serial Communication

SPI/I2C and complex communications protocols
How we get complex data from sensors – a lot of this is hidden in libraries
Unique IDs
Simple controls for complex output: neopixel
SparkFun’s version: Qwiic

Interrupts

Show examples of interrupt code in the environment
switches on mobiles
remote controls for the projectors
complex interrupt systems in video game controllers
rotary encoder (we’ll do a demo later in the semester)
for now, we only use digital inputs for interrupts

Code samples, show how an interrupt can be used to toggle a state by one increment compared to holding down a switch and falling through a number of states.
Note that holding down the switch means the interrupt service routine (ISR) only functions once
Compare to using delay() to sample data every so many units of time.

Use an interrupt to stop a task that takes a long time, say a long for() or while() loop, by adjusting the terminating conditions
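The flag-in-the-terminating-condition idiom can be sketched off-board like this. The names and iteration counts are illustrative; on the Arduino the flag would be a volatile bool set by a routine registered with attachInterrupt(), and here the "ISR" is simulated by an ordinary function call partway through the loop.

```cpp
#include <atomic>

// The flag an interrupt service routine would set.
std::atomic<bool> stopRequested{false};

// Stand-in for the ISR that attachInterrupt() would invoke.
void isr() { stopRequested = true; }

// A long task whose for() loop checks the flag in its terminating
// condition, so the "interrupt" ends it early.
long longTask(long iterations) {
    long i = 0;
    for (; i < iterations && !stopRequested; ++i) {
        if (i == 1000) isr(); // simulate the interrupt firing mid-task
    }
    return i; // how far the loop actually got
}
```

The key point is that the ISR itself stays tiny (set a flag and return); the long-running loop notices the flag at its next condition check and exits cleanly.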

Question: What if you were playing mp3 files or video, how would you use interrupts as part of the interface?

zip file with some example interrupts