I wanted to create a smartwatch interface for someone with anxiety and panic disorder. It reads the wearer’s pulse and breathing rate and can detect when they are having a panic attack. I used real-time biometric data from a pulse sensor and a wind sensor connected to an Arduino, then used serial communication to send the data to p5.js, which triggers and displays different calming graphics. I will go through the project’s interface and design first, then show how I built it.
INTERFACE DURING A PANIC ATTACK
Depending on the kind of panic attack you are having, there are three modes/techniques. The good news is that any of these techniques would help during a panic attack. I weighted certain techniques to be triggered based on the wearer’s sensor data, but because doing that well requires more knowledge, time, and research than the scope of this project allowed, I used keyboard keys to demonstrate the different interfaces.
Deep Breathing Technique (for when you’re hyperventilating)
Counting Technique (for when your heart rate cannot be controlled)
Distraction Technique (for when you need to be distracted from your overwhelming thoughts)
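The key-triggered demo described above can be sketched roughly like this; the key-to-technique mapping here is a hypothetical example, not the exact keys used in the project (in the p5.js sketch this logic would live inside keyPressed()):

```javascript
// Hypothetical mapping from demo keys to the three calming techniques.
const TECHNIQUES = {
  b: "deep-breathing", // for hyperventilating
  c: "counting",       // for a racing heart rate
  d: "distraction",    // for overwhelming thoughts
};

function selectTechnique(key) {
  // Fall back to deep breathing, since any technique helps during an attack.
  return TECHNIQUES[key.toLowerCase()] || "deep-breathing";
}
```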
These techniques and methods were based on my research on medical articles here, here, and here.
“Recently I found that trying to focus on something else such as counting numbers… works, because it requires concentration.”
Proof of Concept
Testing the pulse sensor and how it communicates using the p5.js serial port:
Testing the wind sensor and how it communicates with the p5.js graphics:
Going through all the different types of interfaces and how they react to the user’s biometric sensor data and interactions:
How to Implement it
I used the step-by-step guide provided by NYU’s physical computing department here to learn how to use Arduino and p5.js together. If you follow that guide, you will know how to download and run the project. You will need to download the complete p5.js library, p5.serialserver, and the Arduino software.
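On the p5.js side, each line arriving over serial has to be parsed into sensor values before it can drive the graphics. A minimal sketch of that parsing, assuming the Arduino sends lines like "pulse,breath" (e.g. "92,18"); the exact format is an assumption for illustration:

```javascript
// Parse one serial line of the assumed form "pulse,breath".
// In the p5.js sketch this would be called from the serial 'data' event.
function parseSensorLine(line) {
  const parts = line.trim().split(",");
  if (parts.length !== 2) return null; // ignore partial or malformed lines
  const pulse = Number(parts[0]);
  const breath = Number(parts[1]);
  if (Number.isNaN(pulse) || Number.isNaN(breath)) return null;
  return { pulse, breath };
}
```

Ignoring malformed lines matters because serial reads often deliver partial lines mid-transmission.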
Have you ever been home and didn’t want to interact with anyone or have you ever wanted to scare off unwanted guests? What if you had a security system that works by scaring people away at the door?
A system that scares people away using tapping and buzzing: the closer you get, the faster the tapping, and if you get really close, the doorknob vibrates. The idea is that it only scares people who don’t see it coming (i.e., uninvited people who won’t leave you alone). If you know about the system, it won’t scare you, and the only way you would know about it is if you set it up yourself or were told about it as an invited guest.
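The distance-to-behavior logic can be sketched as below; the specific thresholds and the linear mapping are my own illustrative assumptions, not measured values from the project:

```javascript
// Map a distance reading (cm) to the scare behavior:
// far away -> idle, approaching -> taps that speed up, very close -> vibrate.
function scareState(distanceCm) {
  if (distanceCm < 10) return { mode: "vibrate" };
  if (distanceCm > 100) return { mode: "idle" };
  // Linear map: 100 cm -> 1000 ms between taps, 10 cm -> 100 ms.
  const intervalMs = Math.round((distanceCm / 100) * 1000);
  return { mode: "tap", intervalMs };
}
```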
What if every object in your home had a different pitch, and as you walk around and touch different things, you create a melody? Everything in the space around you can become part of the musical instrument!
This instrument takes in different inputs and uses them to trigger notes. Each object can be assigned a note, and whenever you touch that object, that note plays. Once the object is connected, you can assign and reassign any note you want to that object through a dial.
8 inputs to plug in your different objects.
1 speaker to play the different notes.
2 dials & a button to choose which object you want to change to which note.
How does that even work…?
Here, I used an orange, a banana, a compass, and my laptop case. You can use anything you want that conducts electricity! For the laptop case, or other non-conductive objects, you can use something like thin conductive tape to make them conduct. Note that at the end of the video, you can use the dials and the button to change a specific object’s assigned sound.
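The assign-and-play logic described above can be sketched like this; the note table uses standard pitch frequencies, but the default assignments and the API names are assumptions for illustration:

```javascript
// One octave of standard note frequencies (Hz).
const NOTES = { C4: 261.63, D4: 293.66, E4: 329.63, F4: 349.23,
                G4: 392.0, A4: 440.0, B4: 493.88, C5: 523.25 };

// Eight inputs, each holding an assigned note; the two dials pick an
// (input, note) pair and the button commits the assignment.
function createInstrument(numInputs = 8) {
  const assignments = new Array(numInputs).fill("C4");
  return {
    assign(input, noteName) {   // called on button press
      if (noteName in NOTES) assignments[input] = noteName;
    },
    frequencyFor(input) {       // called when that object is touched
      return NOTES[assignments[input]];
    },
  };
}
```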
I have an idea on a final project, please let me know if you have any advice or feedback- I would greatly appreciate it!
I want to create a smart watch interface for someone with anxiety and panic disorder. I plan on using real-time biometric data from sensors and using the data to trigger and display things using p5.js.
The watch has two modes: Normal Mode & Panic Mode. Normal Mode includes a watch interface that displays the time and date, in addition to the sensor data in an artistic, data-visualization way (I am thinking of something similar to a mood visualizer). Panic Mode can be triggered in two ways: a panic button the user presses, or sensor data that indicates the user is having a panic attack. In Panic Mode, the canvas cycles through the following anxiety-relieving techniques:
Deep Breathing Exercise: using calming graphics to guide the user through a deep breathing exercise. I will use online resources to figure out how the breathing exercise needs to be paced in order to work, like WebMD’s techniques for deep breathing.
Body Scan: using the body scan technique found here.
Distraction/Game Technique: using a jigsaw puzzle or some other mind-occupying game that reduces stress but still lets you channel your overactive brain somewhere.
5 Senses Technique: using the 5 senses to ground you, as shown below:
If none of these techniques work, the watch triggers a “call emergency contact” state, which calls someone you designated as a person to reach out to. For example, “calling your mom…”
The biometric sensors I am thinking of using are a heart rate (PPG) sensor, a GSR sensor, and a respiratory rate sensor. I might not need the last one; I am waiting to confirm with a specialist.
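The two-mode behavior could be sketched as a tiny state function like the one below. The numeric thresholds here are placeholders for demonstration only; real triggers would need clinical research and per-user calibration, which is exactly the part I am still confirming with a specialist:

```javascript
// Decide the watch mode from the panic button and (placeholder) sensor
// thresholds. 120 bpm and 25 breaths/min are illustrative assumptions.
function watchMode(pulseBpm, breathsPerMin, panicButtonPressed) {
  if (panicButtonPressed) return "panic";
  if (pulseBpm > 120 || breathsPerMin > 25) return "panic";
  return "normal";
}
```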
If you’re at a dance or somewhere crowded, how do you alert the masses without causing a panic?
This project uses interrupts to stop a melody and play buzzes that slowly increase in pitch to gain people’s attention. In future iterations, the device could sense when the room is quiet, then switch to playing the message that was needed to get everyone’s attention.
Proof of Concept
I used the potentiometer to control the speed of the melody and the buzzer intervals.
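The potentiometer-to-speed mapping works like Arduino’s map() function; a minimal sketch, with the duration range chosen as an illustrative assumption:

```javascript
// Map a 10-bit potentiometer reading (0-1023) to a note duration (ms):
// low reading -> slow melody, high reading -> fast, urgent buzzing.
function noteDurationMs(potValue) {
  const clamped = Math.min(1023, Math.max(0, potValue));
  // 0 -> 600 ms per note (slow), 1023 -> 100 ms per note (fast).
  return Math.round(600 - (clamped / 1023) * 500);
}
```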
My family and I struggle with a progressive nervous system disorder that causes an essential tremor that starts in your hands when you’re younger (i.e. me), and migrates throughout your body as you get older (i.e. my mom).
For this project I wanted to look into ways to help stabilize things you’re holding if you have a tremor. I made this device that uses an accelerometer to detect movement, and offset that movement by using 2 servo motors to control the x and y rotations.
There are three different state options: stabilizer: help for when you need to hold something still; pouring: help for when you need to pour something; normal: device does nothing.
Proof of Concept
Because of the complexity of offsetting movements, and because I am not knowledgeable enough in physics, I found it really difficult to make the stabilizer and pouring states work together. Hence, the demonstration above only shows the stabilizing state.
Adding to that, I struggled with tuning the relationship between the input data and the sensitivity/stability of the device. In other words, I didn’t know how to keep the device from jittering while reading live data. For future iterations, learning how to smooth or normalize the input data should help.
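One common way to reduce that jitter is an exponential moving average over the raw accelerometer reading before it drives the servos; a minimal sketch, where the alpha value is a tuning assumption:

```javascript
// Exponential moving average: each new reading is blended with the
// previous smoothed value. Lower alpha -> smoother but slower to react.
function createSmoother(alpha = 0.2) {
  let value = null;
  return (raw) => {
    value = value === null ? raw : alpha * raw + (1 - alpha) * value;
    return value;
  };
}
```

The trade-off is latency: too much smoothing and the servos lag behind real motion, which matters for a stabilizer.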
How do you create a universal communication method that can work for everyone, whether they are blind, or deaf, or both? Imagine a universal translation machine that can….
To tackle this, I decided to use a tactile way to feel and translate Morse code. This is done through a combination of:
haptic feedback → as someone is communicating with you, the translator device vibrates the Morse code pattern so you can feel it. This is ideal not only for someone who cannot see or hear, but also if you want to be extra discreet and not make any noise or visual distractions.
visual feedback → adding to that, the visual feedback provided is twofold: through letter translation and through the blinking of an LED. The letter translation is especially ideal for someone who might not necessarily know Morse Code.
audio feedback → finally, audio feedback through the buzzer helps you distinguish by sound whether you are pressing a dot (·) or a dash (–). When you press long enough for the device to recognize that it is no longer a dot but a dash, the tone changes.
The hope is that by providing different kinds of feedback, the translator becomes more accessible.
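The dot/dash classification described above can be sketched as below; the 250 ms threshold and the two tone frequencies are hypothetical values for illustration, not the ones used in the device:

```javascript
// Presses shorter than the threshold are dots; longer ones are dashes.
// Each symbol gets its own buzzer tone so they are distinguishable by ear.
const DASH_THRESHOLD_MS = 250;

function classifyPress(pressMs) {
  return pressMs < DASH_THRESHOLD_MS
    ? { symbol: ".", toneHz: 880 }  // higher tone for a dot
    : { symbol: "-", toneHz: 440 }; // lower tone for a dash
}
```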
What if you could have an emotional support flower that you don’t have to worry about feeding or accidentally killing? For this project I was inspired by chromotherapy, a type of treatment that uses colors to treat diseases. Learn more about the history and psychology of chromotherapy here.
With all this information in mind, my project creates an emotional support flower that reacts based on your emotions. For example, if you are anxious, the flower starts to show calming colors at a soothing pace and pattern. When you do something great, it shows happy colors in an “excited” and “happy” pattern. And in the case that you do something bad, it calls you out on it, but in a way that tells you you can do better next time rather than shaming you.
Inputs: heart rate and blood pressure data
Outputs: changing color, pace of changes, and gradient of color changes.
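The emotion-to-output mapping could be sketched as a small lookup like this; the state names, colors, and pulse timings are all illustrative assumptions:

```javascript
// Map a detected emotional state to the flower's LED behavior:
// which colors to cycle through and how quickly to pulse between them.
function flowerBehavior(state) {
  switch (state) {
    case "anxious":
      return { colors: ["blue", "teal"], pulseMs: 4000 };  // slow, soothing
    case "happy":
      return { colors: ["yellow", "pink"], pulseMs: 500 }; // quick, excited
    default:
      return { colors: ["white"], pulseMs: 2000 };         // neutral
  }
}
```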
In future iterations, the flower could also release essential oils from its stamen. Smell has a strong link to emotion and memory, so calming oils could really help an anxious person, for example.
Proof of Concept
I originally started by mocking the flower up with a 3D pen, but realized that the hardness and stiffness of it was not as soothing of an experience as I was going for. That is when I switched to a softer version made out of dried hot glue. I chose hot glue because it was a quick, low-budget way to get both the translucency and the softness I was looking for.
I consider myself a musician, but I cannot read sheet music. I play more than six instruments, but I play by ear, since I never got the training or had the patience to read sheet music. One day I was playing one of my favorite instruments, the piano, and I wondered: could there be an equivalent to “playing by ear” for someone who cannot hear? How do you take the act of hearing music and turn it into a visual interaction that teaches you as you play?
My project takes music and creates an interactive game that simultaneously translates the music into keys. Ideally, the program would have 2 different inputs: 1) audio input ⇒ microphone hears a song and the program recognizes and processes the tones, then translates them into piano keys, 2) sheet music input ⇒ takes digital sheet music and reads it, then teaches you how to play the song.
Play a piano by “ear”
Proof Of Concept
To prove this concept, I decided to work on the interface of the game. Due to limited time to work on this project, the program reads randomly generated digital notes and plays them. If I were to take this further, I would add the ability to press more than one key at once.
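The random note generation used in the proof of concept could be sketched like this; the fixed one-octave key range and function name are assumptions for illustration:

```javascript
// Pick one key at a time from a fixed range to drive the game interface.
const KEYS = ["C", "D", "E", "F", "G", "A", "B"];

// rand is injectable so the generator can be tested deterministically.
function randomNote(rand = Math.random) {
  return KEYS[Math.floor(rand() * KEYS.length)];
}
```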