For my final project, I decided to draw from my work on the kinetic critique and further explore how a radial array of vibrating motors can not only serve as a feedback device for the blind and hearing impaired, but also how different patterns can carry various meanings and emotions. While the original concept remains similar to that of the previous critique, I completely revisited the context, physical construction, and code structure of the device. In addition, I wanted to really flesh out a specific use case for the device, so I created a robust language and alert system revolving specifically around timekeeping.
The previous iteration of the haptic armband consisted of four vibrating motors arranged in a circle around the forearm. One thing I learned from that process is that four manipulation points isn’t quite enough to convey the feeling of rotation; it feels more like four separate actuators (which they are). In constructing the improved version, I decided to use six vibrating motors arranged in a similar manner for a couple of reasons: first, this fixed the problem of the device feeling like separate actuators rather than a radial array; second, having six manipulation points made developing a timekeeping language feel much more natural, since it more closely parallels the indices on an analog clock. Any more points, however, and it would become difficult for users to distinguish sensations between adjacent motors. Finally, I made this version adjustable so as to create a more comfortable fit for users.
A Haptic Language
Since I wanted this device to convey not only emotion but also information, I went into more depth by creating a haptic language that parallels telling time with an analog clock. Two buzzes indicate the hour, one buzz the ten-minute mark, and a series of buzzes the exact minute. While I considered forgoing the exact minute since knowing the approximate time is often sufficient, I did not want to sacrifice content for clarity, and instead chose to communicate time down to the minute. In addition, I prototyped several other patterns such as tap, stroke, and grab to be used in different scenarios requiring varying levels of urgency.
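As a rough illustration, the three-part pattern above could be encoded as a sequence of (motor, buzz count) steps. The mapping details here (collapsing twelve hours onto the six motors, reusing the ten-minute position for the exact minute) are my own assumptions for the sketch, not the device's exact scheme:

```cpp
#include <vector>

// Hypothetical encoding of a time as (motor index, buzz count) steps
// for a six-motor band whose positions parallel an analog clock face.
struct HapticStep { int motor; int buzzes; };

std::vector<HapticStep> encodeTime(int hour, int minute) {
    std::vector<HapticStep> seq;
    seq.push_back({(hour % 12) / 2, 2});       // hour: two buzzes near the hour hand
    seq.push_back({minute / 10, 1});           // ten-minute mark: one buzz
    seq.push_back({minute / 10, minute % 10}); // exact minute: buzz the last digit
    return seq;
}
```

For 7:43, this would give two buzzes at motor 3, one buzz at motor 4, then three more at motor 4.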
As a student of industrial design, I would like to further explore the form and aesthetics of this object through CAD renderings and other physical prototyping means. In addition, while I used a great deal of trial and error to define the haptic language, I would like to refine it further to add clarity and efficiency for the user. Finally, I’d like to think of even more ways this device could embody intelligence and become truly interactive, such as syncing with Google Calendar and responding to changes in users’ patterns.
Current home alarm systems often use motion detectors positioned outside/around the house in order to detect potential threats to a home’s safety. However, these systems rarely take into account parameters such as the detected motion’s speed, sound, and other such patterns. Because of this, small animal movements and other anomalies can cause false alarms—making these systems unreliable.
For this project, I chose to focus on speed as a specific use case for this outdoor alarm system. Depending on the speed of the motion detected by the multiple break-beam sensors, the system emits different sound patterns to embody various levels of urgency. For example, a fast motion would likely be cause for alarm, so it is associated with the least pleasant sound, signifying that users may want to call the authorities. A slow motion, on the other hand, emits a less intense tone telling users that they may just need to check on what’s happening.
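A minimal sketch of the speed logic, assuming two break beams mounted a known distance apart so that speed is the gap divided by the time between beam breaks; the urgency thresholds are illustrative, not tuned values:

```cpp
// Bucket detected motion into urgency levels based on estimated speed.
enum Urgency { CHECK, WARN, ALARM };

// Speed estimate from two break-beam sensors a known distance apart.
double speedFromBeams(double beamGapMeters, double dtSeconds) {
    return beamGapMeters / dtSeconds;
}

Urgency classify(double metersPerSecond) {
    if (metersPerSecond > 3.0) return ALARM;  // fast: least pleasant sound
    if (metersPerSecond > 1.0) return WARN;
    return CHECK;                             // slow: just take a look
}
```

The tone generator would then select its sound pattern from the returned level.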
In today’s world, there are many carefully considered alarm tones designed to be played from mobile phones or other speakers in order to wake users up slowly and gradually. However, some people are such deep sleepers that these alarms have no effect, and something much stronger and more visceral is required.
With this project, I sought to create a visceral sound using kinetic output that is both loud and jarring enough to wake even the deepest of sleepers. To do this, I drew on my own experiences with balloons and how, when one pops, everyone in the room is stunned into silence. Using a servo motor with an attached pin as the actuator, I also integrated a timer and a start button using a simple potentiometer and a push button. With these components, users can set a timer that terminates in the loud popping of a balloon.
Audio announcements are often used to deliver information to a large group of people; airports, restaurants, stores, and museums are all prime examples of places where this is common practice. However, since people differ in their preferences as well as their linguistic and cognitive abilities, these audio announcements would be more accessible if people could control certain aspects of the audio.
My proposed solution is demonstrated rather simply, and takes advantage of the Arduino tone function’s internal timekeeping. This allows a looped audio track (representing an announcement in this case) to be interrupted at any time to change its speed. Using a potentiometer, users can change the speed of the audio by a factor between 0 and 2.
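A sketch of the potentiometer-to-speed mapping, assuming a standard 10-bit analog reading (0–1023); the small clamp away from zero is my addition so a note's duration is never divided by zero:

```cpp
// Map a potentiometer reading (0-1023) to a playback-speed factor
// between 0 and 2, then scale a note's duration by that factor.
double speedFactor(int analogReading) {
    double f = analogReading * 2.0 / 1023.0;
    return f < 0.05 ? 0.05 : f;  // clamp: avoid a zero/near-zero divisor
}

// A factor of 2 halves each note's duration (double speed); a factor
// near 0 stretches it out.
unsigned long scaledDuration(unsigned long baseMs, int analogReading) {
    return (unsigned long)(baseMs / speedFactor(analogReading));
}
```

The loop playing the track would call `scaledDuration` for each note, so turning the knob mid-announcement changes the speed immediately.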
Haptic feedback as a means of delivering information to the visually impaired isn’t a new concept to this class. Both in class assignments and in products that already exist in the real world, haptics have certainly become a proven tool. However, I feel that there has not been much consideration as to the more specific sensations and interactions that haptics can provide.
With this project, I attempted to create a haptic armband that adds another dimension of feedback: spatial. By arranging haptic motors radially around the arm, I was able to control intensity, duration, and surface area in order to create different sensations. Controlling these variables, I recreated the sensations of tap, nudge, stroke (radial), and grab.
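To make those variables concrete, the patterns might be represented as frames of per-motor intensities played back in sequence; the specific levels and timings here are illustrative, not the tuned parameters:

```cpp
#include <vector>

// One frame: an intensity (0-255) per motor on a six-motor band,
// held for a number of milliseconds.
struct Frame { unsigned char level[6]; unsigned int ms; };

// A radial stroke: one motor at a time, walking around the band
// (small surface area, long total duration).
std::vector<Frame> stroke(unsigned char intensity, unsigned int stepMs) {
    std::vector<Frame> frames;
    for (int i = 0; i < 6; ++i) {
        Frame f{{0, 0, 0, 0, 0, 0}, stepMs};
        f.level[i] = intensity;
        frames.push_back(f);
    }
    return frames;
}

// A grab: all six motors at once in a single longer pulse
// (maximum surface area).
std::vector<Frame> grab(unsigned char intensity, unsigned int holdMs) {
    Frame f{{intensity, intensity, intensity, intensity, intensity, intensity}, holdMs};
    return {f};
}
```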
In terms of applications, I think timekeeping is a great illustration of how different sensations can play a role. For example, a gesture such as a tap or nudge would be appropriate for a light reminder at the top of the hour, while a grab would be more suitable for an alarm or when a user is running late for an appointment. Other, more intricate gestures such as a radial stroke could be used to calm users down in stressful situations.
Morse code is commonly received through either visual or audible feedback; however, this can be challenging for those who are blind, deaf, or both. Additionally, I had next to no experience using hardware interrupts on Arduino, so I wanted to find a good application of interrupts for this assignment.
I wanted to create a system that allows Morse code senders to quickly adapt their messages into signals that people without sight or hearing can understand. To do this, I created two physical button inputs: the first button directly controls an LED (which could easily be a buzzer) used to send the Morse code signal; the second button toggles a vibrating motor to buzz in conjunction with the LED. In this way, one can change the message being sent from purely visual to both visual and tactile at any time.
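The toggle logic can be sketched as plain state; on the real device the second button would fire a hardware interrupt, with the toggle happening inside the ISR, but I model it here as an ordinary function for clarity:

```cpp
// Two-button Morse relay: button 1 drives the LED directly; button 2
// toggles whether the vibration motor mirrors the LED.
struct MorseRelay {
    bool tactileEnabled = false;

    // On the real device this body would run in the interrupt handler
    // attached to the toggle button.
    void onTogglePressed() { tactileEnabled = !tactileEnabled; }

    bool ledOn(bool signalButton) const { return signalButton; }
    bool motorOn(bool signalButton) const { return tactileEnabled && signalButton; }
};
```

Before the toggle, a press lights only the LED; after it, the same press drives both the LED and the motor.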
When driving a car, it can be easy to zone out and lose track of one’s speed. Current cars quantify speed purely numerically (through a digital display), an output that doesn’t do a great job of turning one’s speed into a visceral, understandable experience.
My idea is to express driving speed (while still visual) as a kinetic object placed as a dashboard ornament. I think this would be a more fun and intuitive way to display the data than simply outputting a number.
More specifically, I want to use an accelerometer (represented in this iteration instead as a potentiometer) as an input device; and an oscillating dashboard ornament (powered by a servo motor) that changes oscillating speed in response to vehicle speed.
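A minimal sketch of the speed-to-oscillation mapping, using an inverse relation so that faster driving produces quicker wagging; the constants and clamps are illustrative, not measured servo limits:

```cpp
// Map vehicle speed (here simulated by a potentiometer) to the
// ornament's oscillation period: faster car, shorter period.
double oscillationPeriodMs(double speedMph) {
    if (speedMph <= 0.0) return 2000.0;      // stopped/idle: slow sway
    double period = 20000.0 / speedMph;      // inverse mapping
    return period < 200.0 ? 200.0 : period;  // clamp for the servo's limits
}
```

The servo loop would then sweep back and forth, completing one full swing per period.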
In the future, it would be interesting if I could make the device smart enough to process speed limit information as well to give the system a value to compare the user’s driving speed to.
Throughout much of the travel process—especially the airline and flying experience—hearing is critical to getting to your destination. Audio announcements are constantly made over airport intercoms for flight changes, boarding calls, and lost items. On the plane as well, the captain and flight crew use audio announcements to communicate with passengers throughout the flight.
I propose a handheld device that people who are hearing impaired can pick up during the check-in process, replacing all the audio announcements they might hear while traveling with tactile (haptic feedback) and visual (LCD screen) notifications. A dedicated device makes sense because cellular reception and service are often unreliable, especially when traveling outside of the country. Once users reach their destination, they simply return the device before exiting the airport.
This solution would certainly require airports and airlines to change the way they operate in order to create a more inclusive environment; however, I think such a system would be very beneficial for the community, and may even have benefits for helping with the language barrier during international travel.
Proof of Concept:
In order to prototype my concept, I created an Arduino circuit using an LCD screen, a haptic motor, and three input buttons to simulate different scenarios that one may run into while traveling.
When one of the input buttons is pressed (representing an announcement being made), the haptic motor will vibrate in a specific pattern before a textual message is displayed on the screen such as, “Your flight SW815 is now boarding at gate 11!”
Messages are kept short so that users can receive the information they need easily, and they can go online or to a help desk if they need further assistance. My hope is that users who travel frequently will be able to learn the different vibration patterns for different messages in order to create a more seamless notification system.
In the not-so-distant future, when everyone is walking around with wireless headphones of some sort, how will people know whom they can and can’t interact with? In some situations, I’ve tried to get the attention of someone wearing headphones to no avail; other times, I have no difficulty. In addition, sometimes people wear headphones because they don’t want to be bothered, while other times it’s simply a matter of wanting to listen to music or podcasts.
I propose implementing a visual system (using LEDs on the side of the headphones) to let others know whether or not the headphone user can or should be bothered. In the example that I made, a simple green LED indicates a low volume of music, and no light indicates that loud music is playing. I think this simple system has the potential to become a universally adopted method of differentiating the states of people’s headphones.
Proof of Concept:
I created a relatively simple Arduino circuit to prototype this interaction. It consists of a sound sensor as an input and a green LED as an output device. When the sound sensor reads above a certain threshold, the LED turns off—indicating that a user is listening to loud music and doesn’t want to be bothered. In practice, this was actually a bit harder to do, and I needed to implement a smoothing function to account for the variability of music as well as the specific sound sensor I was using.
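A sketch of such a smoothing function, using a moving average over the last few readings so a momentary spike in the music doesn't flip the LED; the window size and threshold here are illustrative, not the values I tuned for my particular sensor:

```cpp
#include <cstddef>
#include <deque>
#include <numeric>

// Moving-average smoothing over the last N sound-sensor readings,
// compared against a loudness threshold.
class SmoothedThreshold {
    std::deque<int> window;
    std::size_t maxSize;
    int threshold;
public:
    SmoothedThreshold(std::size_t n, int thresh) : maxSize(n), threshold(thresh) {}

    // Returns true when the smoothed level exceeds the threshold
    // (loud music: turn the LED off, i.e., "don't bother me").
    bool update(int reading) {
        window.push_back(reading);
        if (window.size() > maxSize) window.pop_front();
        int avg = std::accumulate(window.begin(), window.end(), 0)
                  / (int)window.size();
        return avg > threshold;
    }
};
```

With a window of four readings, a single loud sample gets averaged away, while sustained loud music still pushes the average over the threshold.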
Automatic restroom appliances such as faucets and hand dryers are becoming increasingly popular in public and private applications alike. While this technological innovation has many benefits in terms of convenience, sanitation, and energy efficiency, it is also the source of a great deal of frustration when things don’t function as intended.
All too often, I’ve found myself waving my hands vigorously underneath an automatic faucet to no avail. One of my main complaints about the interaction is that there’s no way to know which part of the state machine isn’t functioning properly. Could it be that the sensor isn’t seeing my hands because I didn’t position them properly? Is there simply a delay before water begins to dispense? Or is the sensor malfunctioning altogether?
To solve this problem, I propose implementing a visual feedback system (using multicolored LEDs) to inform users whether any malfunction is due to their own error, or if it is a fault in the electronic system.
Proof of Concept:
I wired two different colored LEDs (green and red) to serve as a simple, intuitive visual representation of the states of an automatic faucet. The LEDs are directly linked to the infrared distance sensor that serves as the system’s input. If the green LED turns on, the sensor sees the user’s hands. If the red LED is on, the distance sensor is reading a value outside of its range, indicating that the electronic system is broken. If the green LED turns on but water isn’t dispensing after a few seconds, users know that the malfunction is in the hydraulic system.
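The three indicator states can be sketched as a simple function of the distance reading; the range bounds are my assumptions for a typical IR distance sensor, not the calibrated values:

```cpp
// Faucet indicator states driven by an IR distance reading (cm).
enum FaucetIndicator { NONE, GREEN, RED };

FaucetIndicator indicator(int distanceCm) {
    if (distanceCm < 2 || distanceCm > 150) return RED;  // out of sensor range: electronics fault
    if (distanceCm < 20) return GREEN;                   // hands detected under the spout
    return NONE;                                         // sensor healthy, no hands present
}
```

A green light with no water then points the user to the remaining suspect: the hydraulic side.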