Bud: The Desk Friend

Problem:

For people who work at desks, it can be hard to recognize when they are tired or need a break. In these situations, they may push themselves too hard and damage their health for the sake of productivity. In a state of extreme exhaustion, it is very easy to make simple mistakes that could otherwise have been avoided, or to be unproductive in the long run. A cloudy mind also makes it difficult to reach clear, well-thought-out decisions. In short, knowing when to rest and when to keep working can be nearly impossible in certain work situations.

A General Solution:

A device that can detect whether someone is awake, dozing off, asleep at their desk, or away from the desk. If the person is awake, they get a periodic reminder to get up, stretch, and hydrate. If the person refuses to take a break, the alarm escalates in intensity, but eventually stops either at the push of a button or after three ignored reminders. If the person is dozing off, the alarm tries to wake them up and encourages them to take a proper nap instead. If the person is asleep, the device sets an alarm to wake them after a full REM cycle. Nap mode can also be triggered with a button, signaling that the person is about to take a nap and starting a timer that wakes them after a full REM cycle.

Proof of Concept:

An Arduino connected to a speaker as the communicative output, paired with p5.js and ml5.js code that ascertains the state of the user: awake, dozing, asleep, or away from keyboard (AFK). The system uses a machine-learning model watching through the laptop's camera to detect the four states above. The system has two modes: one in which the Arduino watches for whether the person is asleep or planning to go to sleep, and another in which it acts as a timer, counting down to sound an alarm.
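
To make the two modes concrete, below is a minimal sketch of how the Arduino side could be written. It assumes a hypothetical one-character serial protocol from p5.js ('A' awake, 'D' dozing, 'S' asleep, 'F' AFK), a speaker on pin 8 driven with the built-in tone() in place of the project's actual sound code, and a 45-minute reminder interval; the escalation after ignored reminders is left out to keep the sketch short.

```
// Minimal two-mode sketch: monitor the reported state, or count down a nap timer.
// Assumes p5.js sends one character per state change: 'A', 'D', 'S', or 'F'.

const int SPEAKER_PIN = 8;
const unsigned long BREAK_INTERVAL = 45UL * 60UL * 1000UL;  // remind every 45 min (assumed)
const unsigned long REM_CYCLE      = 90UL * 60UL * 1000UL;  // one full REM cycle, ~90 min

enum Mode { MONITORING, NAP_TIMER };
Mode mode = MONITORING;

char currentState = 'A';       // last state reported by p5.js
unsigned long lastBreak = 0;   // when the last break reminder fired
unsigned long napStart  = 0;   // when the nap timer started

void setup() {
  Serial.begin(9600);
  pinMode(SPEAKER_PIN, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    currentState = Serial.read();          // update to whatever p5.js last reported
  }

  if (mode == MONITORING) {
    if (currentState == 'S') {             // fell asleep: start the REM-cycle countdown
      napStart = millis();
      mode = NAP_TIMER;
    } else if (currentState == 'D') {
      tone(SPEAKER_PIN, 1320, 300);        // dozing: nudge the user awake / toward a real nap
    } else if (currentState == 'A' && millis() - lastBreak > BREAK_INTERVAL) {
      tone(SPEAKER_PIN, 880, 500);         // awake too long: stretch-and-hydrate reminder
      lastBreak = millis();
    }
  } else {                                 // NAP_TIMER: count down to the wake-up alarm
    if (millis() - napStart > REM_CYCLE) {
      tone(SPEAKER_PIN, 1760, 1000);       // wake-up alarm after a full REM cycle
      mode = MONITORING;
      lastBreak = millis();
    }
  }
  delay(200);                              // keep the loop from hammering the speaker
}
```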

Fritzing Sketch:

The Fritzing sketch shows how the speaker is hooked up to the Arduino to receive outputs. Not pictured is the connection between the Arduino and the laptop, which houses the p5.js code and webcam. The power source pictured allows speakers of various sizes, with different power requirements, to be used.

Proof of Concept Sketches:

The user's pitch (and thus their perceived state) is sensed using a webcam that feeds live footage to a machine learning model, which informs the device whether the person is awake, dozing off, asleep, or AFK. The computer then sends a signal to the Arduino to play music that cues the user to act accordingly.

Process/Challenges:

This system could absolutely be further improved to take into account one's schedule, sleep data, etc. and begin making other suggestions to the user. Additionally, the system could be built into another tabletop object, for example a plant, a sculptural element, or a lamp, so that it functions as more than an alarm clock.

Throughout this project, I ran into a couple of initially unforeseen issues. In trying to connect to Google Calendar data as another aspect of the project, I quickly hit a wall in terms of the depth of my p5.js coding experience and my skill level in realizing the feature. I also initially intended to use voice recordings to humanize "Bud" and make it more relatable and friendly, but both the p5.js audio and the DFPlayer Mini were extremely challenging to implement. Both only worked over 9600-baud serial on my Arduino, and the DFPlayer Mini was not supported by the SparkFun RedBoard Turbo due to an incompatible library. I scaled back to the Volume library to generate simple tunes and sound effects to represent the messages instead.

Lastly, one of the more difficult problems to address was the connection between the Arduino and p5.js. Because the machine learning model checked the webcam for the user's state quite often (at the default frame rate), all of that data was sent to the Arduino, which became backed up and lagged far behind what was happening in real time. I tried a couple of interventions, making p5.js never send the same status twice in a row and reducing the frame rate as much as possible, but if the machine learning model isn't trained very carefully, the system is still susceptible to failure. Ideally, the system would address this by constantly discarding old data and acting only on the most recent data, but I wasn't able to figure out how to ensure that.
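
One possible fix, sketched below as an assumption rather than what the project currently does, is to drain the serial buffer on every pass through loop() and act only on the last byte read, so stale statuses never pile up.

```
// Keep only the newest status byte from p5.js: read (and discard) everything
// waiting in the serial buffer, remembering only the last byte.
char latestState = 0;

void readLatestState() {
  while (Serial.available() > 0) {
    latestState = Serial.read();   // older bytes are overwritten and effectively dropped
  }
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  readLatestState();
  // ...react to latestState here, as in the main sketch...
}
```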

The logic of the system, however, lives in the Arduino, so it might be possible, and more practical, to use an accelerometer (or other sensors) to send a more direct and controlled flow of data to the Arduino. A later iteration might also feed the Arduino data from a smartwatch or phone.

Proof of Concept Video:

Files:

Final Crit

Desk Alarm Clock

Problem:

For people who work at desks, it can be hard to recognize when they are tired or need a break. In these situations, they may push themselves too hard and damage their health for the sake of productivity. In a state of extreme exhaustion, it is very easy to make simple mistakes that could otherwise have been avoided, or to be unproductive in the long run. A cloudy mind also makes it difficult to reach clear, well-thought-out decisions. In short, knowing when to rest and when to keep working can be nearly impossible in certain work situations.

A General Solution:

A device that can detect whether someone is awake, dozing off, or asleep while working at their desk. If the person is awake, they get a periodic reminder to get up, stretch, and hydrate. If the person is dozing off, the alarm tries to wake them up and encourages them to take a proper nap. Nap mode can be triggered with a button, signaling that the person is about to take a nap and starting a timer that wakes them after a full REM cycle. If the person is asleep, the device sets an alarm to wake them after a full REM cycle.

Proof of Concept:

An Arduino with an accelerometer to represent the state of the user (awake, dozing, or asleep) and a button that lets the user signal when they are intentionally planning to take a nap. While the system can be fully functional in this form, it could also use a machine-learning strategy through a camera to detect the three states above. The system has two modes: one in which the Arduino watches for whether the person is asleep or planning to go to sleep, and another in which it acts as a timer, counting down to sound an alarm.
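
A rough sketch of how this accelerometer-and-button version could work is below, assuming an analog three-axis accelerometer on A0–A2, a nap button on pin 2, a speaker on pin 8, and placeholder tilt thresholds that would need tuning on the real hardware.

```
#include <math.h>

// Classify awake / dozing / asleep from the tilt (pitch) of an analog accelerometer,
// and run a REM-cycle nap timer started either by the button or by falling asleep.

const int X_PIN = A0, Y_PIN = A1, Z_PIN = A2;
const int NAP_BUTTON = 2;
const int SPEAKER_PIN = 8;
const unsigned long REM_CYCLE = 90UL * 60UL * 1000UL;  // ~90 minutes

bool napping = false;
unsigned long napStart = 0;

float readPitchDegrees() {
  // Re-center raw 10-bit readings around 0; the scaling depends on the breakout used.
  float x = analogRead(X_PIN) - 512;
  float y = analogRead(Y_PIN) - 512;
  float z = analogRead(Z_PIN) - 512;
  return atan2(x, sqrt(y * y + z * z)) * 180.0 / PI;
}

void setup() {
  pinMode(NAP_BUTTON, INPUT_PULLUP);  // button wired to ground
  pinMode(SPEAKER_PIN, OUTPUT);
}

void loop() {
  if (!napping) {
    float pitch = readPitchDegrees();
    if (digitalRead(NAP_BUTTON) == LOW || fabs(pitch) > 60) {
      // Button pressed, or tilted far enough to count as asleep:
      // start the REM-cycle countdown.
      napping = true;
      napStart = millis();
    } else if (fabs(pitch) > 30) {
      tone(SPEAKER_PIN, 1320, 300);   // dozing: nudge the user awake
    }
  } else if (millis() - napStart > REM_CYCLE) {
    tone(SPEAKER_PIN, 1760, 1000);    // wake-up alarm after a full REM cycle
    napping = false;
  }
  delay(100);
}
```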

Fritzing Sketch:

The Fritzing sketch shows how the accelerometer and switch are set up to feed information into the Arduino, as well as how the speaker is connected to the Arduino to receive outputs. Not pictured is the connection between the Arduino and the laptop, which houses the p5.js code. In addition, the potentiometer here is included as a volume control (for my ears' sake).

Proof of Concept Sketches:

The user's pitch (their perceived state) is sensed using an accelerometer (or camera), which informs the device whether the person is awake, dozing off, or asleep. The user can also signal that they will be taking a nap, which likewise sets the alarm for a full REM cycle. This system could be further improved to take into account one's schedule, the frequency of naps, etc. and begin making other suggestions to the user.

Proof of Concept Using Accelerometer Video:

ml5.js Proof of Concept Video:

Files:

Sound Crit Final

Faucet Training

Problem:

Both in private and public settings, manually actuated faucets can be confusing when users also encounter automatic, motion-actuated faucets in their daily lives (is this just me?). Remembering to shut off the faucet, especially completely, is often overlooked, and depending on how long it is ignored, it can waste a significant amount of water. Because the sound of running water is easy to tune out, given how often we encounter it, it is important to communicate this information to users, whether for the environment's or the bill payer's sake. In addition, most people (I hope) use soap when they wash their hands, but they don't need the water running while lathering, so ideally the water should shut off briefly during that step to save water as well.

A General Solution:

A device that represents the flow of water using percussion to signify when it is still running. The tempo of the percussion should convey whether, and how strongly, water is running. Ideally, it should be a sound that is distinct from the sound of water running from a faucet. It should also take into account other events, such as soap being dispensed. Potential augmentations of the system include sensing when people are standing in front of the sink versus leaning over to dispense soap, or otherwise tracking the user's position to understand when the user does or doesn't need water.

Proof of Concept:

An Arduino with a potentiometer to represent the faucet being turned on/off, an LED to represent the water (whether it is running or not), and a switch/button to represent soap being dispensed. Turning the potentiometer up causes a servo to sweep faster, hitting straws to create a beat. The button switches the LED on and off regardless of the potentiometer; unless the button is pressed, the LED shows the potentiometer's reading of how much water the user wants.
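
A sketch of how this logic could be coded is below, assuming the potentiometer on A0, the soap button on pin 2 (using the internal pull-up), the LED on PWM pin 9, and the servo on pin 6; here the soap button simply pauses the water/LED while held, and the pins and sweep-speed mapping are placeholders.

```
#include <Servo.h>

// Map the "faucet" potentiometer to the servo's sweep speed (drumming tempo),
// and pause the water/LED while the soap button is held down.

const int POT_PIN = A0, SOAP_BUTTON = 2, LED_PIN = 9, SERVO_PIN = 6;

Servo drummer;
int angle = 0;
int direction = 1;

void setup() {
  pinMode(SOAP_BUTTON, INPUT_PULLUP);
  pinMode(LED_PIN, OUTPUT);
  drummer.attach(SERVO_PIN);
}

void loop() {
  int flow = analogRead(POT_PIN);               // 0..1023: how far the "faucet" is open
  bool soap = (digitalRead(SOAP_BUTTON) == LOW);

  // LED stands in for the water: off while soap is being dispensed,
  // otherwise its brightness tracks the requested flow.
  analogWrite(LED_PIN, soap ? 0 : flow / 4);

  if (flow > 20 && !soap) {
    // Sweep the servo back and forth; a more open faucet sweeps faster,
    // so the straws get hit at a higher tempo.
    angle += direction * map(flow, 0, 1023, 1, 10);
    if (angle >= 180 || angle <= 0) direction = -direction;
    angle = constrain(angle, 0, 180);
    drummer.write(angle);
  }
  delay(15);
}
```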

Fritzing Sketch:

The Fritzing sketch shows how the potentiometer and button are set up to feed information into the Arduino, as well as how the LED and servo are connected to the Arduino to receive outputs. Not pictured is that the Arduino would have to be connected to a power source.

Proof of Concept Sketches:

The user turns on the faucet and is met with a drumming that conveys the flow of the water. If they are sensed to be dispensing soap, the water stops briefly. The system is meant to remind the user to make sure that the faucet is completely closed when they leave. There are, however, many additional features that could be added as the scale of the intervention increases; for example, the drumming could be active only when someone is sensed to be present in the space.

Proof of Concept Video:

Files:

Assignment 8_Final

Apple Alert System

Problem:

iPhones have a built-in system for notifying users when serious events are happening nearby (e.g., AMBER alerts, flash flooding, dust storms, tornadoes). Because the notification uses the same sound and vibration pattern across all of these scenarios, people may become numb to the alert over time and begin to ignore it. In addition, because the alerts are identical, the sound and vibration don't convey scenario-specific information that could better inform users, for example people who are visually impaired, of what is wrong.

A General Solution:

A device that would interrupt whatever the phone is currently doing to communicate the alert to the user, using specific tonalities to convey the urgency of the scenario as well as an indication of what is happening. This would ideally be done through a combination of sound and vibration, so that people can both hear the audio feedback and feel the vibrations to comprehend the scenario.

Proof of Concept:

An Arduino with potentiometers to represent the volume at which the user prefers to listen to music and their proximity to the danger (in the case of a flash flood, tornado, or dust storm), three switches to serve as input for which alert is in effect, and one button to represent when an alert is in effect. For output, the system plays sound through a speaker and would ideally be extended to include vibration.
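
A sketch of how the inputs could map to sound is below, assuming the three alert switches on pins 2–4, the alert button on pin 5, the proximity potentiometer on A0, and the speaker on pin 8; the volume potentiometer is left out since tone() cannot easily vary loudness, and the pin choices and frequencies are placeholders.

```
// Each alert type gets its own base pitch; closer danger means a faster beep rhythm.

const int FLOOD_SW = 2, TORNADO_SW = 3, DUST_SW = 4, ALERT_BTN = 5;
const int PROXIMITY_POT = A0, SPEAKER_PIN = 8;

void setup() {
  pinMode(FLOOD_SW, INPUT_PULLUP);
  pinMode(TORNADO_SW, INPUT_PULLUP);
  pinMode(DUST_SW, INPUT_PULLUP);
  pinMode(ALERT_BTN, INPUT_PULLUP);
  pinMode(SPEAKER_PIN, OUTPUT);
}

void loop() {
  if (digitalRead(ALERT_BTN) == HIGH) return;   // no alert in effect: music keeps playing

  // The tone itself says what is wrong.
  int baseFreq = 0;
  if (digitalRead(FLOOD_SW) == LOW)   baseFreq = 440;   // flash flood
  if (digitalRead(TORNADO_SW) == LOW) baseFreq = 660;   // tornado
  if (digitalRead(DUST_SW) == LOW)    baseFreq = 880;   // dust storm
  if (baseFreq == 0) return;

  // Closer danger -> shorter gap between beeps, so urgency is felt in the rhythm.
  int proximity = analogRead(PROXIMITY_POT);            // 0 = far, 1023 = very close
  int gap = map(proximity, 0, 1023, 800, 100);

  tone(SPEAKER_PIN, baseFreq, 150);
  delay(150 + gap);
}
```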

Fritzing Sketch:

The Fritzing sketch shows how the potentiometers, switches, and button are set up to feed information into the Arduino, as well as how the speaker is connected to the Arduino to receive outputs. Not pictured is that the Arduino would have to be connected to a power source.

Proof of Concept Sketches:

The user's phone receives the alert information automatically, and that information is converted into specific audio and (hopefully) vibration feedback so that the user immediately knows the situation and whether or not they should respond.

Proof of Concept Video:

Files:

Assignment_7_Final

Feeling Color

Problem:

For people who are either color blind or blind, seeing and comprehending color, which is embedded in many aspects of our lives as an encoder of information, can be very challenging, if not impossible. Color also adds another dimension to our experiences and enhances them.

A General Solution:

A device that can detect color and send the information to actuators that display it through vibration. Ideally, the system would vary the specificity and accuracy of the detection (in terms of frame rate, but also in terms of the sample area used for color detection) based on the velocity of the user or other input variables.

Proof of Concept:

An Arduino with a potentiometer to represent the velocity of the user and three vibration motors (tactors) to represent the color data. These physical sensors and actuators are connected to p5.js code, which uses a video camera and the mouse cursor to select the point of a live video feed to extract color from. The color is then sent to the Arduino, where it is processed and sent as output signals to the tactors. A switch is also available for when the user doesn't want vibrations clouding their mind.
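
A sketch of the Arduino side is below, assuming p5.js sends three raw bytes (R, G, B) per color sample, the tactors sit on PWM pins 3, 5, and 6, the velocity potentiometer is on A0, and the mute switch is on pin 2; the serial format and the way velocity simply changes how long each color is held are assumptions for illustration.

```
// Each color channel's intensity is felt as the strength of one tactor.

const int R_TACTOR = 3, G_TACTOR = 5, B_TACTOR = 6;
const int VELOCITY_POT = A0, MUTE_SWITCH = 2;

void setup() {
  Serial.begin(9600);
  pinMode(MUTE_SWITCH, INPUT_PULLUP);
  pinMode(R_TACTOR, OUTPUT);
  pinMode(G_TACTOR, OUTPUT);
  pinMode(B_TACTOR, OUTPUT);
}

void loop() {
  if (Serial.available() >= 3) {
    // One color sample = three bytes: red, green, blue (0-255 each).
    int r = Serial.read();
    int g = Serial.read();
    int b = Serial.read();

    if (digitalRead(MUTE_SWITCH) == LOW) {
      r = g = b = 0;                 // user has muted the vibrations
    }

    analogWrite(R_TACTOR, r);
    analogWrite(G_TACTOR, g);
    analogWrite(B_TACTOR, b);
  }

  // Faster movement (higher pot reading) -> hold each sample for less time,
  // standing in for a higher color-sampling rate.
  delay(map(analogRead(VELOCITY_POT), 0, 1023, 500, 50));
}
```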

Fritzing Sketch:

The Fritzing sketch shows how the potentiometer and switch are set up to feed information into the Arduino, as well as how the tactors are connected to the Arduino to receive outputs. Not pictured is the connection between the Arduino and the laptop, which houses the p5.js code.

Proof of Concept Sketches:

The user's velocity is sensed, which alters the rate at which color is processed and sent back to the user through vibrations. The user's data could also be used to change the range of color sensing applied to the live feed, so the user can feel the colors of a general area.

Proof of Concept Video:

Files:

Physical Crit_Final

Visualizing Spaces

Problem:

For those who have impaired vision or are blind, perceiving the quality and form of the spaces they inhabit can be quite difficult (inspired by Daniel Kish's TED Talk that Ghalya posted in Looking Outward). This could have applications at various scales, both helping the visually impaired with wayfinding and letting them experience the different spaces they occupy.

A General Solution:

A device that would scan and process a space using sonar, LIDAR, photography, a 3D model, etc., and then map the result onto an interactive surface that is actuated to represent that space. The user could then understand the space they are in at a larger scale, or, at a smaller scale, identify potential tripping hazards as they move through an environment. The device would ideally be able to change scales to address different scenarios. Other aspects, such as emergency scenarios, would also be programmed into the model so that in the case of fire or danger, the user could find their way out of the space.

Proof of Concept:

An Arduino with potentiometers (ideally sonar or other spatial sensors) to act as input data controlling a few solenoids, which represent a more extensive network of physical actuators. When the sensors read a closer distance, the solenoids pop out, and vice versa. The solenoids only accept digital outputs, but analog control would be ideal so the space could be represented more accurately. There are also two switches: one representing an emergency button, which alerts the user that there is an emergency, and one representing a routing button (which would ideally be connected to a network as well, but could also be turned on by the user), which leads the solenoids to trace a path out of the space to safety.
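
A sketch of the solenoid logic with just two channels is below, assuming the stand-in potentiometers on A0/A1, solenoid drivers on pins 8/9, and the emergency and routing switches on pins 2/3; the threshold and the "path" pattern are placeholders.

```
// Two-channel stand-in for the actuator array: closer readings pop a solenoid out,
// the emergency switch pulses everything, and the routing switch traces a path.

const int DIST_PINS[2] = {A0, A1};
const int SOLENOID_PINS[2] = {8, 9};
const int EMERGENCY_SWITCH = 2, ROUTING_SWITCH = 3;
const int NEAR_THRESHOLD = 512;   // readings above this count as "close"

void setup() {
  pinMode(EMERGENCY_SWITCH, INPUT_PULLUP);
  pinMode(ROUTING_SWITCH, INPUT_PULLUP);
  for (int i = 0; i < 2; i++) pinMode(SOLENOID_PINS[i], OUTPUT);
}

void loop() {
  if (digitalRead(EMERGENCY_SWITCH) == LOW) {
    // Emergency: pulse every solenoid so the user can't miss the alert.
    for (int i = 0; i < 2; i++) digitalWrite(SOLENOID_PINS[i], HIGH);
    delay(200);
    for (int i = 0; i < 2; i++) digitalWrite(SOLENOID_PINS[i], LOW);
    delay(200);
  } else if (digitalRead(ROUTING_SWITCH) == LOW) {
    // Routing: raise the solenoids one after another to trace a path out.
    for (int i = 0; i < 2; i++) {
      digitalWrite(SOLENOID_PINS[i], HIGH);
      delay(300);
      digitalWrite(SOLENOID_PINS[i], LOW);
    }
  } else {
    // Normal mode: a closer reading pops the corresponding solenoid out.
    for (int i = 0; i < 2; i++) {
      bool near = analogRead(DIST_PINS[i]) > NEAR_THRESHOLD;
      digitalWrite(SOLENOID_PINS[i], near ? HIGH : LOW);
    }
    delay(50);
  }
}
```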

Fritzing Sketch:

The Fritzing sketch shows how the proof of concept's solenoids are wired to a separate power source and set up to receive signals from the Arduino, as well as how all of the input devices are connected to the Arduino to send in data. The transducer for emergencies has been represented by a microphone, which has a similar wiring diagram. Not pictured is that the Arduino and the battery jack would have to be connected to a battery source.

Proof of Concept Sketches:

The spatial sensor scans the space the user occupies, and the scan is actuated into a physical representation, arrayed to give the user more detail to touch and perceive. This system would be supplemented by an emergency system that both alerts the user that an emergency is occurring and shows them how to make their way to safety.

Proof of Concept Videos:

Files:

Assignment_6_Final

Life of a Blind Girl

https://lifeofablindgirl.com/2018/05/16/21-things-i-couldnt-live-without-as-a-blind-person/

This blog post goes through some current technologies and products that are really helpful from the perspective of someone who is blind. It was really interesting to see the impact and potential of current technology in supporting those who are differently-abled in navigating through their daily lives with more ease and comfort. It’s also helpful to know what’s already out there and being done. I would definitely recommend taking a read through!

Anti-Slouch Machine

Problem:

As people interact more and more with technology, sedentary lifestyles take an increasing toll on their bodies. One way to alleviate these problems is to promote good posture while sitting.

A General Solution:

A device that would sense the angle of a user’s back and give feedback based on its interpretation of the user’s posture.

Proof of Concept:

An Arduino with an accelerometer acting as input data to control some transducers, which represent a more extensive network of physical actuators. When the accelerometer senses that the user is sitting up straight, none of the actuators move. When the accelerometer senses that the user is slouching or leaning too far forward, the vibrations move in sequence to guide the user to lean forward or backward in the correct direction. If the user decides not to correct their posture, the device eventually reaches the point where it vibrates constantly until the posture is fixed, and it stays in this 'probation' mode until the user has gone a while without slouching. When the accelerometer senses that the user is asleep, it vibrates gently, fading from nothing up to a soft vibration to wake the user.
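
A simplified sketch of this behavior is below (the escalation and "probation" period are omitted), assuming an analog accelerometer on A0–A2, two vibration transducers on PWM pins 5 and 6, and placeholder angle thresholds.

```
#include <math.h>

// Map the forward/backward lean (pitch) of the torso to vibration feedback:
// quiet when upright, a directional buzz when slouching or leaning, and a
// gentle fade-in buzz when collapsed far enough forward to count as asleep.

const int X_PIN = A0, Y_PIN = A1, Z_PIN = A2;
const int FRONT_MOTOR = 5, BACK_MOTOR = 6;

float readPitchDegrees() {
  float x = analogRead(X_PIN) - 512;
  float y = analogRead(Y_PIN) - 512;
  float z = analogRead(Z_PIN) - 512;
  return atan2(x, sqrt(y * y + z * z)) * 180.0 / PI;  // forward/backward lean
}

void setup() {
  pinMode(FRONT_MOTOR, OUTPUT);
  pinMode(BACK_MOTOR, OUTPUT);
}

void loop() {
  float pitch = readPitchDegrees();

  if (pitch > 60) {
    // Collapsed far forward: treat as asleep and fade a gentle wake-up buzz in.
    for (int level = 0; level <= 120; level += 5) {
      analogWrite(FRONT_MOTOR, level);
      delay(50);
    }
    analogWrite(FRONT_MOTOR, 0);
  } else if (pitch > 20) {
    analogWrite(FRONT_MOTOR, 180);   // leaning too far forward: buzz to push back
    analogWrite(BACK_MOTOR, 0);
  } else if (pitch < -20) {
    analogWrite(BACK_MOTOR, 180);    // slouching backward: buzz to pull forward
    analogWrite(FRONT_MOTOR, 0);
  } else {
    analogWrite(FRONT_MOTOR, 0);     // good posture: stay quiet
    analogWrite(BACK_MOTOR, 0);
  }
  delay(100);
}
```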

Fritzing Sketch:

The Fritzing sketch shows how the accelerometer is set up to send information into the Arduino, as well as how the transducers are connected to the Arduino to receive outputs. The transducers have been represented by microphones, which have similar wiring diagrams. Not pictured is that the Arduino would have to be connected to a battery source.

Proof of Concept Sketches:

The accelerometer senses when the user is asleep, slouching, or sitting with good posture, and a corresponding output is sent to the transducers, which vibrate to inform the user of how they are doing.

Proof of Concept Videos:

Demonstration

LED Demonstration

Files:

Assignment_5_Final

Jogging Partner

Problem:

People who listen to music while jogging may often zone out and put themselves in danger as a result. Light is a great communication tool to warn others, as well as the jogger, to be careful and aware of the people around them. In addition, when someone jogs alone and something happens to them, others may not notice unless something drastic occurs.

A General Solution:

A device that controls a light that reacts differently to the jogger's actions based on various sensors, ensuring that the light is used only when it is needed and in the optimal setting. The device could also address other issues, such as a jogging pace that changes unintentionally, or emergencies and attracting others' attention in critical scenarios.

Proof of Concept:

An Arduino with potentiometers, switches, and a photoresistor acting as input data to control LEDs, which represent a more complex visual interface. When an emergency is detected by the sensors or reported by the user, the visual interface tries to attract attention through the LEDs. On a less serious note, the device detects whether the user is running too slowly or too fast. Lastly, if the user tries to turn on a flashlight to light their way, the device first considers how bright it is outside. If it is still bright out, the device ignores the command, assuming the user pressed the switch by accident. If it is dimmer outside and the user switches the light on, it blinks to attract the attention of vehicles and bikes so they can see the jogger. If it is quite dark out, the light turns on automatically, whether the user flips the switch or not.
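
A sketch of just the flashlight decision is below, assuming the photoresistor divider on A0, the flashlight switch on pin 2, the headlamp LED on pin 9, and placeholder brightness thresholds that would need calibration.

```
// Flashlight logic: ignore the switch in daylight, blink at dusk when asked,
// and turn on automatically in the dark.

const int LIGHT_SENSOR = A0, LIGHT_SWITCH = 2, HEADLAMP = 9;
const int BRIGHT = 700;   // above this it's still daylight
const int DARK   = 300;   // below this it's genuinely dark

void setup() {
  pinMode(LIGHT_SWITCH, INPUT_PULLUP);
  pinMode(HEADLAMP, OUTPUT);
}

void loop() {
  int ambient = analogRead(LIGHT_SENSOR);
  bool wantsLight = (digitalRead(LIGHT_SWITCH) == LOW);

  if (ambient > BRIGHT) {
    digitalWrite(HEADLAMP, LOW);          // still bright out: ignore the switch
  } else if (ambient < DARK) {
    digitalWrite(HEADLAMP, HIGH);         // dark: light up whether or not the switch is on
  } else if (wantsLight) {
    digitalWrite(HEADLAMP, HIGH);         // dusk + switch on: blink to attract attention
    delay(250);
    digitalWrite(HEADLAMP, LOW);
    delay(250);
  } else {
    digitalWrite(HEADLAMP, LOW);
  }
}
```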

Fritzing Sketch:

The Fritzing sketch shows how the various inputs are set up to feed information into the Arduino, as well as how the LEDs are connected to the Arduino to receive outputs. Not pictured is that the Arduino would have to be connected to a battery source.

Proof of Concept Sketches:

The sensors collect data that help inform the user and embed intuition into the outputs, providing a variety of features that use this information to improve the user's jogging experience.

Proof of Concept Videos:

Prototype Full Demonstration:

Prototype Silent Demonstration:

Files:

Crit_Visual_Final