This blog post goes through some current technologies and products that are genuinely helpful, from the perspective of someone who is blind. It was really interesting to see the impact and potential of current technology in helping those who are differently-abled navigate their daily lives with more ease and comfort. It’s also helpful to know what’s already out there and being done. I would definitely recommend taking a read through!
Like Assignment 5, give data a visual representation, but consider accessibility for someone without hearing or vision. Use interrupts to generate or modify the information being displayed, or to control the information from another source. Experiment with more than one interrupt happening at the same time.
The example we discussed in class: how would you let a person without hearing know that someone was knocking at the door or ringing the doorbell?
SPI/I2C and complex communications protocols
How we get complex data from sensors – a lot of this is hidden in libraries
Unique IDs
Simple controls for complex output: neopixel
SparkFun’s version: Qwiic
Interrupts
Show examples of interrupt code in the environment
switches on mobiles
remote controls for the projectors
complex interrupt systems in video game controllers
rotary encoder (we’ll do a demo later in the semester)
for now, we only use digital inputs for interrupts
Code samples: show how an interrupt can be used to toggle a state by one increment, compared to holding down a switch and falling through a number of states (see the sketches below).
Note that holding down the switch means the interrupt service routine (ISR) only fires once, because it triggers on the signal edge, not on the level.
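A minimal sketch of the edge-triggered version (the switch and LED pins are assumptions):

```cpp
// Minimal sketch (assumed wiring: switch from pin 2 to ground, LED on pin 13).
// The ISR fires once on the falling edge, so holding the switch down
// does not retrigger it -- the state advances by exactly one increment.
const byte SWITCH_PIN = 2;   // must be an interrupt-capable pin on an Uno
const byte LED_PIN = 13;

volatile bool ledState = false;

void toggleLed() {
  ledState = !ledState;      // one increment per falling edge
}

void setup() {
  pinMode(SWITCH_PIN, INPUT_PULLUP);
  pinMode(LED_PIN, OUTPUT);
  attachInterrupt(digitalPinToInterrupt(SWITCH_PIN), toggleLed, FALLING);
}

void loop() {
  digitalWrite(LED_PIN, ledState);  // the main loop just reflects the state
}
```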
Compare to using delay() to sample data every so many units of time.
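For contrast, here is a polling version with the same assumed wiring; only the sampling strategy changes:

```cpp
// Polling version for comparison (same assumed wiring, no interrupt).
// delay() blocks the processor: a press shorter than the sample interval
// is missed entirely, while a held switch toggles on every pass.
const byte SWITCH_PIN = 2;
const byte LED_PIN = 13;

bool ledState = false;

void setup() {
  pinMode(SWITCH_PIN, INPUT_PULLUP);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  if (digitalRead(SWITCH_PIN) == LOW) {  // sampled only once per pass
    ledState = !ledState;
    digitalWrite(LED_PIN, ledState);
  }
  delay(250);  // sample every 250 ms; nothing else can run in the meantime
}
```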
Use an interrupt to stop a task that takes a long time, say a long for() or while() loop, by adjusting the terminating conditions
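One way to sketch that: the ISR does nothing but set a volatile flag, and the loop's terminating condition checks it (pins again assumed):

```cpp
// Sketch of using an interrupt to abort a long-running task.
// The ISR only sets a flag; the loop's terminating condition checks it.
const byte SWITCH_PIN = 2;   // assumed: switch from pin 2 to ground

volatile bool abortRequested = false;

void requestAbort() {
  abortRequested = true;
}

void setup() {
  pinMode(SWITCH_PIN, INPUT_PULLUP);
  Serial.begin(9600);
  attachInterrupt(digitalPinToInterrupt(SWITCH_PIN), requestAbort, FALLING);
}

void loop() {
  abortRequested = false;
  // The "long task": runs a million iterations unless the switch fires.
  for (long i = 0; i < 1000000L && !abortRequested; i++) {
    // ... some slow work per iteration ...
  }
  Serial.println(abortRequested ? "aborted" : "finished");
  delay(1000);  // pause before starting the task again
}
```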
Question: What if you were playing mp3 files or video, how would you use interrupts as part of the interface?
touching a person vs. moving an object – touch is more personal, requires little energy. Touch can be wrapped in a robotic device, ex: Paro (wiki), trade show demo.
Moving things typically requires an external power source; an Arduino pin can only supply 5V at a few tens of milliamps, so motors and solenoids need their own power and a driver.
I was reminded of this today upon seeing Ghalya’s smart flower.
Dr. Lining Yao is right here on campus, where she runs the Morphing Matter Lab. In the video below from Google Design (9 minutes in), she demonstrates a flat print that self-folds into a flower. Later she shows materials that morph when exposed to moisture (16 minutes in), and later still a soft robotic bunny with tendons that actuate to hug you (29 minutes in). I recommend watching the whole video!
What if you could have an emotional support flower that you don’t have to worry about feeding or accidentally killing? For this project I was inspired by Chromotherapy, a type of treatment that uses colors to treat diseases. Learn more about the history and psychology of Chromotherapy here.
Project
With all this information in mind, my project creates an emotional support flower that reacts to your emotions. For example, if you are anxious, the flower shows calming colors at a soothing pace and pattern. When you do something great, it shows happy colors in an “excited” and “happy” pattern. And if you do something bad, it calls you out, but in a way that tells you you can do better next time rather than shaming you.
Inputs: heart rate and blood pressure data
Outputs: changing color, pace of changes, and gradient of color changes (see the sketch below).
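A rough sketch of how that mapping could work, assuming the heart-rate input is reduced to a single number (stood in for here by an analog reading) and the output is an Adafruit NeoPixel ring; the pin, pixel count, colors, and threshold are all assumptions:

```cpp
#include <Adafruit_NeoPixel.h>

// Hypothetical mapping: a heart-rate value (stood in for by an analog
// reading on A0) picks the color family and the pace of a breathing
// animation on a NeoPixel ring. Pin, pixel count, and the 100 bpm
// threshold are all assumptions.
const int PIXEL_PIN = 6;
const int PIXEL_COUNT = 12;
Adafruit_NeoPixel ring(PIXEL_COUNT, PIXEL_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  ring.begin();
  ring.show();
}

// Fade the whole ring up and down in one color; stepDelay sets the pace.
void breathe(uint8_t r, uint8_t g, uint8_t b, int stepDelay) {
  for (int level = 0; level <= 255; level += 5) {
    ring.fill(ring.Color(((long)r * level) / 255,
                         ((long)g * level) / 255,
                         ((long)b * level) / 255));
    ring.show();
    delay(stepDelay);
  }
  for (int level = 255; level >= 0; level -= 5) {
    ring.fill(ring.Color(((long)r * level) / 255,
                         ((long)g * level) / 255,
                         ((long)b * level) / 255));
    ring.show();
    delay(stepDelay);
  }
}

void loop() {
  int bpm = map(analogRead(A0), 0, 1023, 50, 150);  // placeholder "heart rate"

  if (bpm > 100) {
    breathe(0, 60, 120, 40);   // anxious: calming blue, slow soothing fade
  } else {
    breathe(255, 120, 0, 10);  // calm/happy: warm color, livelier pace
  }
}
```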
In future iterations, the flower could also release essential oils from its stamen. Smell is often said to be the sense most strongly tied to emotion and memory, so calming oils could really help soothe an anxious person, for example.
Proof of Concept
I started by mocking the flower up with a 3D pen, but realized that its hardness and stiffness made for a less soothing experience than I was going for. That is when I switched to a softer version made out of dried hot glue. I chose hot glue because it was a quick, low-budget way to get both the translucency and the softness I was looking for.
I often leave my room to go to the kitchen or answer the door, and instead of going back to the work I was doing, I get distracted by other things around my house. I wanted some kind of indicator that would alert me more and more frequently the longer I spent away from my work.
I wanted to make something that would get your attention but not alarm you. Additionally, I thought it was important for my device to have a gradual change from a reminder to a more urgent indicator.
The Solution
Using an Arduino Uno, an IR break beam sensor, and an Adafruit NeoPixel ring, I created a project that pulsates red, very slowly at first, then speeds up as it approaches the two-minute mark, eventually turning into a flashing light. My hope is that as it flashes more frequently, it is more likely to catch the user’s eye, reminding them to go back to work.
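A sketch of that logic, assuming the break beam sits across the doorway and pulls its output LOW when broken; the pins, pixel count, and the linear speed-up curve are assumptions:

```cpp
#include <Adafruit_NeoPixel.h>

// Reminder logic (pins and thresholds are assumptions): an IR break beam
// across the doorway marks when you leave, and the red pulse on the ring
// speeds up the longer you are away, becoming a rapid flash by ~2 minutes.
const int BEAM_PIN = 4;   // break beam output, LOW while the beam is broken
const int PIXEL_PIN = 6;
Adafruit_NeoPixel ring(16, PIXEL_PIN, NEO_GRB + NEO_KHZ800);

bool away = false;
unsigned long leftAt = 0;

void setup() {
  pinMode(BEAM_PIN, INPUT_PULLUP);
  ring.begin();
  ring.show();
}

void loop() {
  // Breaking the beam starts the timer (a second break could reset it).
  if (!away && digitalRead(BEAM_PIN) == LOW) {
    away = true;
    leftAt = millis();
  }
  if (!away) {
    ring.clear();
    ring.show();
    return;
  }

  // Pulse period shrinks from ~2 s toward ~100 ms over two minutes away.
  unsigned long awayMs = millis() - leftAt;
  long period = max(100L, 2000L - (long)(awayMs / 63));

  bool on = (millis() / (period / 2)) % 2;  // simple square-wave pulse
  ring.fill(on ? ring.Color(255, 0, 0) : 0);
  ring.show();
}
```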
How does a vehicle operator gain awareness of objects in their blind spot? The usual answer is to check all the mirrors, but there are very real situations in which that does not work.
Solution
A series of vibrating and tapping motors would let drivers know if there were vehicles or objects in their blind spot. The array of devices would be positioned under the seat and in the seatback, pulsing on the left or right side to indicate the blind-spot object’s position. Additionally, while parked with the car off, the door handle would vibrate if the vehicle sensed incoming objects, such as a cyclist approaching the driver-side door.
Proof of Concept
An ultrasonic sensor serves as the object detector. When it senses an object in the near vicinity, the device signals a solenoid to tap every few seconds, assuming the “vehicle is running”. If the vehicle is not running, the device instead signals the “car handle” dime motors on whichever side the object appears.
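A sketch of that behavior for one side only, assuming an HC-SR04-style sensor and a toggle switch standing in for the ignition; all pins and the distance threshold are assumptions:

```cpp
// Proof-of-concept (pins and threshold are assumptions; one side only):
// an HC-SR04-style ultrasonic sensor watches for nearby objects, and a
// toggle switch stands in for the ignition. Running -> tap the solenoid;
// parked -> buzz the door-handle vibration motor instead. Both loads are
// assumed to be driven through transistors, not directly from the pins.
const int TRIG_PIN = 9;
const int ECHO_PIN = 10;
const int SOLENOID_PIN = 5;
const int HANDLE_MOTOR_PIN = 6;
const int RUNNING_SWITCH = 2;   // LOW = "vehicle running" (switch to ground)

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000L);  // 0 on timeout
  return duration / 58;                             // microseconds -> cm
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(SOLENOID_PIN, OUTPUT);
  pinMode(HANDLE_MOTOR_PIN, OUTPUT);
  pinMode(RUNNING_SWITCH, INPUT_PULLUP);
}

void loop() {
  long cm = readDistanceCm();
  bool objectNear = (cm > 0 && cm < 50);  // assumed 50 cm threshold

  if (objectNear) {
    if (digitalRead(RUNNING_SWITCH) == LOW) {
      digitalWrite(SOLENOID_PIN, HIGH);   // quick tap
      delay(50);
      digitalWrite(SOLENOID_PIN, LOW);
    } else {
      digitalWrite(HANDLE_MOTOR_PIN, HIGH);  // buzz the "door handle"
      delay(300);
      digitalWrite(HANDLE_MOTOR_PIN, LOW);
    }
  }
  delay(2000);  // check, and tap, every few seconds
}
```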
You’re in a hurry on your way out the door for the day. You grab your keys, your wallet, your phone, and head for the door, but then you freeze… is it going to rain today?
Your bag is already stuffed to the max, and you don’t want to have to carry around an umbrella just in case, so you stop, pull out your phone to find the weather app, and look to see what the day has in store for you.
There are 16 new notifications, and one of them distracts you just long enough for a new one to pop up telling you that you’ve missed your bus. Before you ever get the chance to discover that you’ll need that umbrella, you’re standing out in the rain, waiting for the next bus to get you where you’re going 20 minutes late.
What if your stuff knew when you would need it? What if you immediately knew it was going to rain right as you reached for your umbrella, or better yet, it pointed you to grab your sunglasses instead?
There is plenty of information available these days, but we always have to go hunting for it in a haystack of apps and notifications.
In the future we can teach our devices (not just our phones and smart devices) to fetch that information for us, and instead of just notifying us, they can take physical action in the real world. Instead of pestering us every time they have new information, or waiting for us to ask for it, they can present it at just the moment it’s needed.
You don’t care if it’s raining until you’re about to walk outside, and you shouldn’t have to stop to check on your way out the door.
The Prototype
To demonstrate the basic idea of this type of physical indicator, a servo motor points at either your sunglasses or your umbrella based on the forecast from a weather API like Dark Sky. The response is parsed, and the device is sent a simple string saying either “sunny” or “rainy”; based on that, it points to the umbrella or the sunglasses.
Here’s a video:
The wiring and code are very simple in this prototype, and implementing the more futuristic versions described above wouldn’t really require much additional wiring or code.
The only added pieces would be a haptic motor, microcontroller, battery, and transceiver embedded into the device. Inexpensive controllers like that already exist, but they are not yet ubiquitous.
The other half of the equation is a system capable of taking in all of the IoT data in your environment and on your person, and understanding the bigger picture of what combination of triggers should wake up your umbrella just as you’re walking out the door. Smart home devices are getting closer to this every day.
Code + stuff:
The wiring is straightforward. Just a standard servo connection.
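For reference, here is a sketch of the Arduino side, assuming the API call and parsing happen on a host computer that sends the string over serial; the pin and pointing angles are assumptions:

```cpp
#include <Servo.h>

// Arduino side of the prototype (pin and angles are assumptions): a host
// script queries the weather API, parses the response, and sends either
// "sunny" or "rainy" over serial; the servo then points at the right item.
const int SERVO_PIN = 9;
const int SUNGLASSES_ANGLE = 20;  // assumed position of the sunglasses
const int UMBRELLA_ANGLE = 160;   // assumed position of the umbrella

Servo pointer;

void setup() {
  Serial.begin(9600);
  pointer.attach(SERVO_PIN);
}

void loop() {
  if (Serial.available()) {
    String forecast = Serial.readStringUntil('\n');
    forecast.trim();
    if (forecast == "sunny") {
      pointer.write(SUNGLASSES_ANGLE);
    } else if (forecast == "rainy") {
      pointer.write(UMBRELLA_ANGLE);
    }
  }
}
```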
When driving a car, it can be easy to zone out and lose track of one’s speed. Current cars quantify speed purely numerically (through a digital display), an output that doesn’t do a great job of turning one’s speed into a visceral, understandable experience.
Proposed Solution:
My idea is to express driving speed (still visually) through a kinetic object placed as a dashboard ornament. I think this would be a fun and intuitive way to display the data, rather than simply outputting a number.
More specifically, I want to use an accelerometer (represented in this iteration by a potentiometer) as the input device, and an oscillating dashboard ornament (powered by a servo motor) that changes its oscillating speed in response to vehicle speed.
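A minimal sketch of that behavior, with the potentiometer standing in for the accelerometer; the servo pin and the delay range are assumptions:

```cpp
#include <Servo.h>

// Kinetic ornament sketch: a potentiometer on A0 stands in for the
// accelerometer-derived vehicle speed, and the servo sweeps back and
// forth faster as the "speed" rises. Pin and delay range are assumptions.
Servo ornament;

void setup() {
  ornament.attach(9);
}

void loop() {
  // Map "speed" to a per-step delay: higher speed -> shorter delay.
  int stepDelay = map(analogRead(A0), 0, 1023, 30, 2);

  for (int angle = 0; angle <= 180; angle += 2) {
    ornament.write(angle);
    delay(stepDelay);
  }
  for (int angle = 180; angle >= 0; angle -= 2) {
    ornament.write(angle);
    delay(stepDelay);
  }
}
```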
In the future, it would be interesting to make the device smart enough to also process speed-limit information, giving the system a reference to compare the user’s driving speed against.