Final Project: Responsive Computer

Problem

“Until now, we have always had to adapt to the limits of technology and conform the way we work with computers to a set of arbitrary conventions and procedures. With NUI (Natural User Interface), computing devices will adapt to our needs and preferences for the first time and humans will begin to use technology in whatever way is most comfortable and natural for us.”

—Bill Gates, co-founder of Microsoft

I think gesture-based interfaces have great potential to help people interact with computing devices naturally, because gestures are inherently natural. Gestures are a huge part of communication and carry a great amount of information, especially the conscious or unconscious intentions of the person making them. They sometimes communicate more, faster, and more strongly than other methods of communication.

General Solution

From this perspective, I want to design a gesture user interface for the computer. When people sit in a chair in front of a computer, their body gestures (including posture and movement) reveal their intentions very well. I did some research and found a few interesting gestures that people commonly use in front of a computer.

When people are interested in something, or when they want to see something more closely, they lean forward to see it in detail. Conversely, when people lean back in a chair with both hands behind their heads, staring off somewhere, it is easy to guess that they are contemplating or thinking about something. When they swivel their chair repeatedly or shake their legs, it usually means they are losing interest and becoming distracted.

The same gesture can have different meanings. For example, leaning forward signals the intention to look more closely when people are viewing images, but it can signal the intention to see the previous frame again when they are watching a video.

I am going to build a gesture interaction system that can be installed on computers, desks, or chairs to recognize a user's gestures and movements. Depending on the person's gestures and the surrounding context (what kind of content they are watching, what time it is, etc.), the computer will interpret the gestures differently and extract implicit intentions from them. This natural gesture user interface could improve the user experience (UX) of computing devices.

I am also considering adding haptic or visual feedback to show whether the computer understood the intention behind the user's gestures as input.

Proof of Concept

The system is composed of two main sensors. A motion sensor is attached under the desk so that it can detect leg movements. An ultrasonic sensor is attached to the monitor of a laptop so that it can detect the user's posture, as in the image below.

The lean-forward gesture can be interpreted differently based on the context and the content a user is viewing at the time. I conducted research and found the following correlations:

  1. When a user is viewing an image or reading a document, they lean forward with the intention of looking at something more closely or in detail.
  2. When a user is watching a video, they lean forward out of surprise or interest, with the intention of seeing the recent scene again.
  3. When a user is working across multiple windows, they lean backward to think for a while or to get an overview of all the windows.

Based on these intentions and contexts, the system I designed responds differently: in the first case it zooms in on the screen; in the second it rewinds the scene; in the last it shows all the windows.
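To make this concrete, here is a minimal sketch of the context-dependent mapping in Python. The context labels, gesture names, and action names are illustrative placeholders, not the actual implementation; the point is that the lookup key is always the (context, gesture) pair, never the gesture alone.

    # Minimal sketch: the same gesture maps to different actions by context.
    ACTIONS = {
        ("image",   "lean_forward"):  "zoom_in",           # case 1
        ("video",   "lean_forward"):  "rewind_scene",      # case 2
        ("windows", "lean_backward"): "show_all_windows",  # case 3
    }

    def interpret(context, gesture):
        """Return the action for a (context, gesture) pair, or None."""
        return ACTIONS.get((context, gesture))

    print(interpret("video", "lean_forward"))  # -> rewind_scene
    print(interpret("image", "lean_forward"))  # -> zoom_in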

The system can also detect a user's level of distraction. Common gestures when people lose interest or become bored are shaking their legs or staring elsewhere for a while. The motion sensor attached under the desk detects this motion, and when the motion keeps being detected for more than a certain amount of time, the computer turns on music that helps the user focus.
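The distraction logic is essentially a timer on continued motion. Below is a sketch with hypothetical motion_detected and play_focus_music hooks standing in for the sensor and the music player; the one-minute threshold is an assumption, not a measured value.

    import time

    DISTRACTION_THRESHOLD_S = 60  # assumed: a minute of continued motion

    def watch_for_distraction(motion_detected, play_focus_music):
        """Start focus music when leg motion persists past the threshold."""
        motion_started = None
        while True:
            if motion_detected():
                if motion_started is None:
                    motion_started = time.monotonic()
                elif time.monotonic() - motion_started > DISTRACTION_THRESHOLD_S:
                    play_focus_music()
                    motion_started = None  # reset after reacting
            else:
                motion_started = None      # motion stopped; reset the timer
            time.sleep(0.1)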

Video & Codes

(PW: cmu123)

codes

Crit 03: Playing piano through gestures (body movements)

Problem

I started with a problem area related to sound. For example, when a conductor is conducting an orchestra, how might they control the volume, tempo, or pitch through gestures? However, this problem was too complex for me to tackle in a week, so I narrowed it down to a smaller, more specific one – how might we control music through gestures? I came up with the idea of making different sounds based on distance (proxemics), so that we could play the piano through gestures.

General Solution

I came up with the idea of playing the piano through gestures or body movements. Rather than playing the piano with our fingers, I thought there might be a way to let people play it using their bodies. While thinking about it, I had the idea of mapping piano notes to distances: by moving our arms and hands, we could play with sounds.

Proof of Concept

In order to track distance, I used an ultrasonic sensor and an Arduino. The sensor wasn't accurate enough at longer ranges, but I found it quite precise at close range. I tried using a potentiometer to change the distance value manually, but that didn't work very well, so I focused on the ultrasonic sensor.

To control the computer, I used a terminal to run Python scripts. Through Python, I could control the keyboard and mouse, which allowed me to play an online piano via keystrokes.
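As a rough sketch of that pipeline: the script below reads distances streamed from the Arduino over serial and presses one key per distance band. The port name, baud rate, band width, and key mapping are illustrative assumptions, not the exact values I used.

    import serial      # pyserial: reads the Arduino's distance stream
    import pyautogui   # presses the keys the online piano listens for

    NOTE_KEYS = ["a", "s", "d", "f", "g", "h", "j", "k"]  # one key per 5 cm band

    with serial.Serial("/dev/tty.usbmodem14101", 9600, timeout=1) as port:
        while True:
            line = port.readline().decode(errors="ignore").strip()
            try:
                distance_cm = float(line)
            except ValueError:
                continue                      # skip empty or garbled readings
            band = int(distance_cm // 5)      # 0-5 cm -> key 0, 5-10 cm -> key 1, ...
            if 0 <= band < len(NOTE_KEYS):
                pyautogui.press(NOTE_KEYS[band])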

Codes and Video

Jay_Crit_03_sound

Assignment 8: Recognizing gestures for traffic safety

Problem

I kept working with the Leap Motion sensor this week, too. The problem I focused on was recognizing hand signals that direct traffic. When road construction is going on or a traffic accident happens, ordinary people or police officers often have to direct traffic temporarily. However, this is dangerous and can cause additional accidents, because the signals are sometimes difficult to notice from a distance.

General Solution

I came up with multi-sensory feedback (for this assignment, auditory feedback) for hand signals on the road. A device attached to the front of a car reads the gestures of people on the road and makes different sounds according to the signals, so a driver can notice a gesture more precisely, easily, and quickly. I believe this system could help prevent accidents. It could also be attached to an autonomous car, which could then read and react to the signals automatically.

Proof of Concept

Because of the limited options in the SDK, I could only implement pinch and fist gestures to make sounds. For the fist gesture, I used two different sounds depending on order – the first and the second fist gesture sound different. Hand signals can be a series of movements, and this feature lets the system read them as a sequence and make different sounds.
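Here is a sketch of the listener, based on the Leap Motion SDK v2 Python bindings, which expose grab_strength and pinch_strength per hand. The 0.9 thresholds are assumptions, and the print calls stand in for actual sound playback.

    import Leap  # Leap Motion SDK v2 Python bindings (Python 2)

    class SignalListener(Leap.Listener):
        """Sound for pinch vs. fist; alternating sounds for repeated fists."""

        fist_count = 0
        was_fist = False

        def on_frame(self, controller):
            hands = controller.frame().hands
            if hands.is_empty:
                self.was_fist = False
                return
            hand = hands[0]                         # signals here are one-handed
            if hand.grab_strength > 0.9:            # closed fist
                if not self.was_fist:               # fire once per fist, not per frame
                    self.fist_count += 1
                    sound = 1 if self.fist_count % 2 == 1 else 2
                    print("fist sound %d" % sound)  # stand-in for audio playback
                self.was_fist = True
            else:
                self.was_fist = False
                if hand.pinch_strength > 0.9:       # thumb-to-finger pinch
                    print("pinch sound")            # stand-in for audio playback

    listener = SignalListener()
    controller = Leap.Controller()
    controller.add_listener(listener)
    raw_input("Press Enter to quit...")  # the v2 bindings target Python 2
    controller.remove_listener(listener)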

Video & Codes

Video-Jay_Assignment_08_gesture_sound

Codes-Jay_Assignment_08_gesture_sound


Assignment 07: Control (play/pause) music through gestures

Problem

For this assignment, I focused on gesture interaction – my current interest – because it has powerful strengths compared to other input methods. It is fast and quiet. It also overcomes physical distance, so we can control things without actually touching them.

I thought of the situation where we are listening to music through a speaker, or watching TV, and it becomes difficult to hear other sounds. When someone calls us, or in other situations where we urgently have to stop the music and focus on another sound, it is sometimes really difficult to do so quickly. If we are using a laptop to listen to music, we have to find the mute or pause button and then push it by hand – which takes a lot of attention (visual and physical) and time. When we are watching TV or listening to music through a smartphone, the situation is similar.


General Solution

Thus, I decided to use gesture control to stop music quickly and urgently. By raising a hand and making a fist – which I find an intuitive gesture for stopping something – users can pause the music. Once it is paused, the same gesture plays the music again.


Proof of Concept

In order to track hand gestures, I used the Leap Motion sensor. It makes it really easy to pause and play music with simple gestures. I wanted to design new types of gestures to make the interaction more natural and intuitive; however, going beyond the sensor's built-in database of hand gestures seems to require building my own gesture database, which was challenging for me.
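A minimal sketch of the toggle, again assuming the Leap Motion v2 Python bindings: the was_fist flag edge-detects the gesture so a held fist toggles only once, and the space key is an assumption about the music player's pause shortcut.

    import Leap       # Leap Motion SDK v2 Python bindings
    import pyautogui  # sends the keystroke to the music player

    class PlayPauseListener(Leap.Listener):
        was_fist = False  # previous-frame state, so a held fist fires once

        def on_frame(self, controller):
            hands = controller.frame().hands
            is_fist = any(h.grab_strength > 0.9 for h in hands)
            if is_fist and not self.was_fist:
                pyautogui.press("space")  # assumed play/pause shortcut
            self.was_fist = is_fist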


Codes & Video

JAY_Assignment_07_gesture_sound

Critique 02: Assisting Individual Body Training through Haptics

Problem

When training our bodies and building muscle at a gym, it is really important to maintain accurate, balanced poses and movements – not only to avoid injury but also to maximize the effect and keep our muscles balanced. However, when we go to the gym by ourselves, it is sometimes really difficult to tell whether we are using the equipment correctly.

Solution

I thought about a device – a smartphone with an armband, a smartwatch, or some other device attached to the arm – that provides haptic feedback so that we can keep the right pose.

For example, when we are doing push-ups, the device checks the angle of our arms and the timing as we go down. When we go up and come down again, the device provides haptic feedback (vibrations of various intensities) to signal that we have to go down a bit further to reach the right angle. When we reach it, it vibrates briefly to let us know we did well.
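The prototype itself was written in Swift (see Proof of Concept below), but the threshold logic reads like this Python sketch; the target angle, tolerance, and feedback names are illustrative assumptions.

    TARGET_ANGLE = 90  # assumed arm angle at the bottom of a push-up
    TOLERANCE = 10     # degrees of slack before feedback gets stronger

    def feedback_for(angle):
        """Map how far a rep is from the target angle to a haptic intensity."""
        shortfall = TARGET_ANGLE - angle
        if shortfall <= 0:
            return "short_success_pulse"   # reached the right depth
        if shortfall <= TOLERANCE:
            return "light_vibration"       # almost there: go a bit further
        if shortfall <= 2 * TOLERANCE:
            return "medium_vibration"
        return "strong_vibration"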

Proof of Concept

I tried coding it in Swift, using the gyro sensors in the iPhone. I believe I could develop this idea much further – it could check various body angles while we exercise, even during stretching to increase flexibility. Also, if it kept collecting data over time, it would learn our capability at a certain exercise or muscle group, so it could guide us to increase that capability at an appropriate tempo without harming our bodies.

Videos and Codes

degreeChecker – code


Assignment 6: Balance Checker for the Visually Impaired

Problem

I started with a few balance-related problems, especially for visually impaired people. First, when they are moving a pot with hot soup inside, it is really dangerous because they cannot check whether it is level. In another situation, when they are building furniture, especially shelves, it is important to maintain horizontal balance to keep things safe.

Also, for sighted people, there are many situations where balance is important – for example, when taking a photo.


General Solution

How might we use tactile feedback to let them feel tilt or imbalance? I thought that vibration of varying intensity would be a great way to do it.


Proof of Concept

I decided to use the iPhone for a few reasons. First, it can generate a wide variety of tactile feedback using different patterns and intensities. Second, I found it useful to take advantage of its embedded sensors. Lastly, I thought a vibration-based application would be accessible to many people.

I grouped the tilt angle into four ranges that trigger tactile feedback of different intensities. When a user tilts the phone 5~20 degrees, it makes a light vibration; from 21~45 degrees, a medium vibration; from 46~80 degrees, an intense vibration; and from 81~90 degrees, it vibrates the most intensely (just like when receiving a call). I also mapped the angle to an RGB code to change the screen color accordingly.
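The app was built in Swift, but the banding logic is easy to sketch in Python. The green-to-red color ramp is my illustrative guess, since the exact RGB assignment isn't spelled out above.

    def vibration_for_tilt(degrees):
        """Return the vibration intensity band for a tilt angle."""
        if 5 <= degrees <= 20:
            return "light"
        if 21 <= degrees <= 45:
            return "medium"
        if 46 <= degrees <= 80:
            return "intense"
        if 81 <= degrees <= 90:
            return "max"   # like the incoming-call vibration
        return None        # below 5 degrees is treated as level

    def color_for_tilt(degrees):
        """Map 0-90 degrees onto a green-to-red RGB value (illustrative)."""
        t = max(0.0, min(degrees, 90)) / 90.0
        return (int(255 * t), int(255 * (1 - t)), 0)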


Video & Codes


Assignment #05 – Feeling the Memories/Photos with Space and People

Problem

Photographs are closely tied to our emotions and memories. They remind us of places, people, and stories. Technology has made taking and managing photos much easier than before, but there are still some gaps.

“One of the most precious resources of the modern household is time, and the effort to take care of all those wonderful photographs defeats their value. (…) Digital cameras change the emphasis, but not the principle. (…) Thus, although we like to look at photographs, we do not like to take the time to do the work required to maintain them and keep them accessible.”

—Donald Norman, Emotional Design (2003)

Thanks to smartphones, we always have access to our photos, not only on the device but also in the cloud. However, that does not mean we feel free from those efforts. In fact, we rarely look back at the photos we have taken. I tried to think about how I could use tactile feedback for the emotions related to photos.

General Solution

Since photographs are closely associated with places and people, I thought that I could use these data sets.

Scenario #01 – An accidental encounter with my memories here

When I pass by a location that I have visited before and where I took photos, the phone alerts me with vibrations in certain patterns and pops up related photos. The patterns change according to the number of photos and/or the emotions related to them (happy, sad, nostalgic, etc.).

Scenario #02 – My emotional connections to places

When I am planning to visit somewhere and searching for a place in a map application, I can turn on a heatmap layer that shows the connections between locations and my photos, memories, and frequency of visits. Also, when I touch a specific place on the map, I can feel vibration patterns based on the number of photos, memories, and visits and/or the related emotions.

Scenario #03 – Memory reminder with people

Suppose I am planning to meet my friends. While I am texting them to arrange a meeting (or based on data extracted from my scheduler), my phone automatically surfaces the photos I have taken with them, or that are related to them in some way, to remind me of our shared memories and stories.

Proof of Concept

To design the tactile signals for these features, I tried to design vibration patterns. I used Swift and Xcode to access the haptic feedback features of the iPhone X: plain vibration, notification feedback (success, warning, error), and impact feedback (light, medium, heavy). I tried to design patterns based on these primitives.

Video/Image

Critique 01: Visualizing Coming In/Out of Roommates (+ Status of Door)

Problem

When we are wearing headphones and listening to loud music, we become temporarily deaf, unable to hear the sounds of our environment. Sometimes we cannot even notice other people's presence, especially our roommates'. We are startled by their sudden appearance or disappearance.

Solution

I tried to visualize surrounding sounds and give feedback to the user. For this critique, I focused on the movement of a home's entrance door. The state machine visualizes whether the door is open or not, so that the user can notice a roommate going out or coming in. Also, if the door is left open for a long time, it gives feedback prompting the user to close and lock the door for security.

Proof of Concept

Using a potentiometer, I can detect the angle of the door as it opens and closes. Past a certain angle, the state machine decides the door is open and gives feedback to the user. If the door stays open, it also lets the user know the door has been open for a while so they can close it. When the door is opened and then closed, the state machine decides that someone came in or went out, and gives feedback so the user notices.
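Here is the state machine sketched in Python, with hypothetical read_angle and notify hooks standing in for the potentiometer reading and the on-screen feedback; the threshold angle and reminder delay are assumptions.

    import time

    OPEN_THRESHOLD = 15     # degrees; past this the door counts as open (assumed)
    LEFT_OPEN_AFTER_S = 30  # seconds before the "close the door" nudge (assumed)

    def door_state_machine(read_angle, notify):
        """Track CLOSED -> OPEN -> CLOSED transitions from the door angle."""
        state = "CLOSED"
        opened_at = None
        while True:
            angle = read_angle()
            if state == "CLOSED" and angle > OPEN_THRESHOLD:
                state, opened_at = "OPEN", time.monotonic()
                notify("door opened")
            elif state == "OPEN":
                if angle <= OPEN_THRESHOLD:
                    state = "CLOSED"
                    notify("someone came in or went out")
                elif time.monotonic() - opened_at > LEFT_OPEN_AFTER_S:
                    notify("door left open - please close it")
                    opened_at = time.monotonic()  # repeat the reminder later
            time.sleep(0.1)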

Fritzing

I also used P5.js to change the laptop screen.


Video/Codes

jay_door_status


Assignment 04: Visualizing the Doorbell While Taking a Shower

Problem: 

Even though I do not have any problems with my hearing, I become temporarily deaf in some situations. For example, when I am listening to music through headphones (especially noise-canceling ones), I cannot even hear a friend talking right beside me or my phone ringing. When I am taking a shower, I cannot hear someone pressing the doorbell. I decided to focus on these problems.

General Solutions: 

I sketched some similar situations and decided to design a system that could visualize sound in various ways. For example, I came up with using motors to change shapes, or using vibration to create water ripples (which is also a kind of visualization). Beyond lights, there are various ways to visualize sound. I decided to focus on the situation where I cannot hear the doorbell ringing while taking a shower, and for this situation I realized that using light would be best, because there is little else a user can attend to while showering.

Proof of Concept:

The concept is pretty simple: when the bell is pushed, the intensity of the bathroom light changes slightly, so that the user can notice the change easily. I tried not to use additional LEDs or lights inside the bathroom because that would be burdensome and raise water-related problems. I designed the light to change slightly in a specific pattern so that the user won't misperceive it as something being wrong with the bathroom lighting.
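A small sketch of the pattern logic, with set_brightness as a hypothetical hook for the dimming hardware; the pattern and timing values are illustrative, chosen to be slow and regular enough to read as a signal rather than a flicker.

    import time

    PATTERN = [1.0, 0.7, 1.0, 0.7, 1.0, 0.7, 1.0]  # fractions of normal brightness
    STEP_S = 0.4                                   # seconds per step (assumed)

    def on_bell_pressed(set_brightness):
        """Play the doorbell pattern on the bathroom light."""
        for level in PATTERN:
            set_brightness(level)
            time.sleep(STEP_S)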


Arduino Code & Demo Video:

Jay_Assignment_04

I tried to use P5.js Serial Control to visualize on my computer, but it didn't work well. 🙁 I need some help solving the problem! (And I hope to learn JS as well.)

Mini Assignment: Interesting State Machines

MacBook Charger Light

The first state machine I found is the MacBook charger light. It shows the state of the battery – whether it is charging or fully charged – using green and orange (red) colors.

Then, I tried to find non-electric state machines.


Restroom Door Indicator

I thought this state machine was interesting because it does not use any electrical energy; instead, it uses a mechanical structure to change the visualization of its state.

Humidifier Remaining Water

In terms of showing a machine's state, the water in a transparent humidifier could also be a state machine. The water itself shows how much is left and how long the humidifier can keep working.

Parking Balloons

It is a great example of a state machine achieved without any mechanical or electrical engineering. Using a very simple structure – a floating balloon and a string – this system lets drivers know where an available parking space is.