Final Project: Emotionally Intelligent Home Assistant

Abstract: Existing home assistants improve interaction in the home through speech recognition and intelligent natural language processing. However, they lack the emotional intelligence needed to listen to, motivate, and support the user in maintaining a healthy mind. My project proposes a custom Google Assistant, built on the AIY Voice Kit, that detects emotion, provides social support, and generates empathy in the user.

The project builds on existing technology and could potentially replace the current Google Home assistant.

Hardware:

AIY Voice Kit and a mobile browser


Systems Diagram:

DEMO:


Smart Desk

The goal of this project is to add interactive functionality to an adjustable standing desk. Countless times my environment interrupts my workflow when I'm at home. I use my room as a home office, and it is important that my housemates and friends know when I am on an important call and cannot be interrupted. At the same time, I often find myself getting carried away and spending hours in front of my task without moving at all. The small systems on this desk aim both to aid the user in time management and to remind them that it is healthy to stop and take breaks.

 

Three Main Systems

1. An ultrasonic sensor, visible under the table, detects whether a person is standing at the desk. If someone is there, it triggers a NeoPixel light to turn on. If someone has been standing there for more than 52 minutes, the NeoPixel turns red. This is meant to remind them to take a short break from the given task before continuing.

Note: the timer for the standing person is set to 5 seconds for demo purposes. If used for real tasks, the timer would be set for 52 minutes.
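
Here is a minimal sketch of this logic. The HC-SR04-style sensor pins, the NeoPixel pin and count, the Adafruit_NeoPixel library, and the 60 cm "someone is standing here" threshold are all assumptions for illustration, not necessarily the wiring on the desk:

#include <Adafruit_NeoPixel.h>

const int TRIG_PIN   = 7;   // assumed ultrasonic trigger pin
const int ECHO_PIN   = 8;   // assumed ultrasonic echo pin
const int PIXEL_PIN  = 6;   // assumed NeoPixel data pin
const int NUM_PIXELS = 8;

// 5 s for the demo; 52UL * 60UL * 1000UL for the real 52 minutes.
const unsigned long BREAK_AFTER_MS = 5000UL;
const long PRESENCE_CM = 60;   // closer than this counts as "standing"

Adafruit_NeoPixel strip(NUM_PIXELS, PIXEL_PIN, NEO_GRB + NEO_KHZ800);
unsigned long standingSince = 0;
bool present = false;

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long us = pulseIn(ECHO_PIN, HIGH, 30000);  // give up after 30 ms
  return us / 58;                            // rough microseconds-to-cm conversion
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  strip.begin();
  strip.show();
}

void loop() {
  long cm = readDistanceCm();
  if (cm > 0 && cm < PRESENCE_CM) {
    if (!present) { present = true; standingSince = millis(); }
    bool overtime = millis() - standingSince > BREAK_AFTER_MS;
    // White while working; red once it's time for a break.
    strip.fill(overtime ? strip.Color(255, 0, 0) : strip.Color(255, 255, 255));
  } else {
    present = false;
    strip.clear();
  }
  strip.show();
  delay(100);
}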

2. A task management system uses an RFID reader and RFID credit cards to set timers on the user's tasks. The user swipes a box across the reader and a timer is set. The timer is visualized by the LEDs inside the box turning on. When the time for the given task is about to end, the LEDs blink before turning off. The user is only permitted to light up one task at a time, so they have a visual reminder of which task they should be working on. I wanted to use color-blind-friendly colors, but our lab ran out of LED colors.

Note: the timer for each task is set to 5 seconds for demo purposes. If used for real tasks, the timer can be set to the time the user desires to spend on each task.
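
A sketch of the swipe-to-start-timer behavior. The MFRC522 library, the SPI pin choices, and the card UIDs are placeholders, and the demo's 5-second timer with a 1-second blink warning stands in for the real task length:

#include <SPI.h>
#include <MFRC522.h>

MFRC522 rfid(10, 9);   // assumed SS on 10, RST on 9

const int  TASK_LEDS[3] = {2, 3, 4};          // one LED pin per task box
const byte TASK_UIDS[3][4] = {                // placeholder card UIDs
  {0xDE, 0xAD, 0xBE, 0xEF},
  {0x12, 0x34, 0x56, 0x78},
  {0xAB, 0xCD, 0xEF, 0x01},
};
const unsigned long TASK_MS = 5000UL;         // demo length; set as desired

int activeTask = -1;
unsigned long taskStart = 0;

void setup() {
  SPI.begin();
  rfid.PCD_Init();
  for (int i = 0; i < 3; i++) pinMode(TASK_LEDS[i], OUTPUT);
}

void loop() {
  // A swipe switches the active task: only one box may be lit at a time.
  if (rfid.PICC_IsNewCardPresent() && rfid.PICC_ReadCardSerial()) {
    for (int i = 0; i < 3; i++) {
      if (memcmp(rfid.uid.uidByte, TASK_UIDS[i], 4) == 0) {
        for (int j = 0; j < 3; j++) digitalWrite(TASK_LEDS[j], LOW);
        activeTask = i;
        taskStart  = millis();
        digitalWrite(TASK_LEDS[i], HIGH);
      }
    }
    rfid.PICC_HaltA();
  }

  if (activeTask >= 0) {
    unsigned long elapsed = millis() - taskStart;
    if (elapsed > TASK_MS) {                   // time's up: lights out
      digitalWrite(TASK_LEDS[activeTask], LOW);
      activeTask = -1;
    } else if (elapsed > TASK_MS - 1000) {     // blink during the last second
      digitalWrite(TASK_LEDS[activeTask], (millis() / 125) % 2);
    }
  }
}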

3. An LED, ideally placed outside of your office door, tells your housemates whether it is okay for them to come in and interrupt you. The switch sits on the desk, and the user sets it to green or red.
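
This piece is simple enough that the whole sketch fits in a few lines; the switch and LED pins below are assumptions:

const int SWITCH_PIN = 2;   // toggle switch between pin 2 and GND
const int RED_PIN    = 3;   // "busy" LED
const int GREEN_PIN  = 4;   // "come on in" LED

void setup() {
  pinMode(SWITCH_PIN, INPUT_PULLUP);   // closed switch reads LOW = busy
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
}

void loop() {
  bool busy = digitalRead(SWITCH_PIN) == LOW;
  digitalWrite(RED_PIN, busy);
  digitalWrite(GREEN_PIN, !busy);
}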

Actual Project:

Ultrasonic sensor

RFID task organizer

RFID Reader Wiring

Inside card box

Busy/free switch 

 

Smart Desk 

Future Work

While my desk was a simple prototype of a potential product, a few ideas for future work came out of the creation process.

  • Using the task box light system in an open office space to keep your team organized. Oftentimes, teams are working on multiple projects in parallel and this system could be used to visually communicate which project each individual is currently working on.
  • Add a temperature controlling functionality (a fan that automatically blows when the room reaches a certain temperature)
  • Expand the neopixel light system so it can be used as the main source of light while working. This will make the “take a break” alert more prominent and harder to avoid. I also want to make the alert blink as a warning before fully turning red.

Vinyl & controller setups are large and expensive.

Two-audio-track crossfade without beat/tempo matching.

Interactive textile interface – acrylic and silver-nano inks on a polyester substrate.

Bluno Nano and the CapSense Arduino library.

Bluno can easily connect Arduino sensors to Android and iOS. Although this project is not serially connected to Unity on Android, the example code connects to an app created in Android Studio, and the Bluno connects to Unity on iOS.

This program uses an initialization stage to calculate a baseline for each screen-printed touch sensor, then uses a multiplier to calculate a touch threshold. To increase the correctness of the sensor data, I will implement a touch calibration step into the setup sequence.

// (The original #include's header name was stripped by the blog export.
// touchRead() below is built into Teensyduino, so no extra library include
// is needed when this sketch runs on a Teensy.)

// infinEight Driver
// Ty Van de Zande 2018

/*
 * CapitiveSense Library Demo Sketch
 * Paul Badger 2008
 * Uses a high value resistor e.g. 10M between send pin and receive pin
 * Resistor effects sensitivity, experiment with values, 50K - 50M. Larger resistor values yield larger sensor values.
 * Receive pin is the sensor pin - try different amounts of foil/metal on this pin
 */


// Architecture
// TBD

static int IN1 = 18;
static int IN2 = 17;
static int IN3 = 16;

static int ledGROUND = 23;
static int ledONE    = 21;
static int ledTWO    = 20;
static int ledTHREE  = 19;


int SENSE1;
int SENSE2;
int SENSE3;

long THRESH1;
long THRESH2;
long THRESH3;

float mult = 1.7;



void setup()                    
{
   pinMode(ledONE, OUTPUT);
   pinMode(ledTWO, OUTPUT);
   pinMode(ledTHREE, OUTPUT);
   pinMode(ledGROUND, OUTPUT);
   digitalWrite(ledGROUND, LOW);  // this pin appears to serve as the LEDs' ground return, so drive it low
   
   
   Serial.begin(9600);
   Serial.println("Prepping");
   initializeSensors();
}

void loop()                    
{
  updateSensors();
  //printSensors();
  digitalWrite(ledONE, LOW);
  digitalWrite(ledTWO, LOW);
  digitalWrite(ledTHREE, LOW);
  areWeTouched();     
  delay(10);                    
}

void  areWeTouched()
{
  if(SENSE1 > THRESH1 || SENSE1 == -2){
//    printSensors();
    digitalWrite(ledONE, HIGH);
      Serial.println("3");
  };
  if(SENSE2 > THRESH2 || SENSE2 == -2){
//    printSensors();
      digitalWrite(ledTWO, HIGH);
      Serial.println("2");
  };
  if(SENSE3 > THRESH3  || SENSE3 == -2){
//    printSensors();
      digitalWrite(ledTHREE, HIGH);
      Serial.println("1");
  };
}



void printThresh(long one, long two, long three)  // long, to match the THRESH values
{
  Serial.print(one);
  Serial.print(" . ");
  Serial.print(two);
  Serial.print(" . ");
  Serial.print(three);
  Serial.println(" ");
  
}



void updateSensors()
{
    SENSE1 = touchRead(IN1);
    SENSE2 = touchRead(IN2);
    SENSE3 = touchRead(IN3);
    // Array not working??? -- sizeof() returns bytes, not element count,
    // and SENSE4 was never declared. This version would work:
//     int SENSESTATES[] = {SENSE1, SENSE2, SENSE3};
//     int lisLEN = sizeof(SENSESTATES) / sizeof(SENSESTATES[0]);
//     for(int i = 0; i < lisLEN; i++){
//        Serial.print(i);
//        Serial.print(":  ");
//        Serial.print(SENSESTATES[i]);
//     }
}

void printSensors()
{
    Serial.print(SENSE1);
    Serial.print(" . ");
    Serial.print(SENSE2);
    Serial.print(" . ");
    Serial.print(SENSE3);
    Serial.println(" ");
}


void initializeSensors()
{
  int cts = 104;
  //int mult = 20;
  
  long temp1 = 0;
  long temp2 = 0;
  long temp3 = 0;

  for(int i = 0; i < 20; i++){
    updateSensors();
    //printSensors();
  }
  
  Serial.println("Collecting Sensor Readings");
  for(int i = 0; i < cts; i++){
    if (i % 4 == 0) { Serial.print("|"); }
    updateSensors();
    temp1 += SENSE1;
    temp2 += SENSE2;
    temp3 += SENSE3;
  }

  Serial.println(" ");
  Serial.println("Averaging thresholds");
  THRESH1 =  mult * (temp1 / cts);
  THRESH2 =  mult * (temp2 / cts);
  THRESH3 =  mult * (temp3 / cts);
  printThresh(THRESH1, THRESH2, THRESH3);
  printThresh(THRESH1/mult, THRESH2/mult, THRESH3/mult);
  Serial.println(" ");
  digitalWrite(ledONE, HIGH);
  delay(80);
  digitalWrite(ledTWO, HIGH);
  delay(80);
  digitalWrite(ledTHREE, HIGH);
  delay(80);
  Serial.println("Ready!");
  
}


Thank you to the people who helped, and others!!!!
Golan Levin
Claire Hentschker
Zachary Rapaport
Gray Crawford
Daiki Itoh
Lucas Ochoa
Lucy Yu
Jake Scherlis
Imin Yeh
Jesse Klein
Dan Lockton
FRFAF
URO-SURF

Final Project: Running Companion!

This project uses blob-detection code to animate a character that runs with you. By wearing bands of a certain color, you tell the computer what pace goal you want to set and how fast you're running. Using the difference between these two paces, it determines the location of the animal companion on the screen. If your pace is slower than the pace goal, the animal moves to the left of the screen, as if it were running faster than you. If your pace is faster than the pace goal, the animal moves to the right of the screen, as if it were running slower than you.

The number of bands you wear determines which animal companion you get and what your pace goal is. The animal runs at the goal pace.

This sort of pace-keeper lets runners be more conscious of their cadence and makes fixing their cadence easier.
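
The core of the screen logic is a mapping from the pace difference to a horizontal offset from center. Here is that mapping sketched in plain C++; the screen width and pixels-per-step scale are made-up constants, not the values in my Processing sketch:

// Slower than the goal -> the companion drifts left (it seems faster than you);
// faster than the goal -> it drifts right (it seems slower than you).
float companionX(float goalPace, float currentPace) {
  const float screenWidth  = 1280.0f;   // assumed display width in pixels
  const float pixelsPerSpm = 20.0f;     // assumed scale: pixels per step/min of difference
  float x = screenWidth / 2.0f + (currentPace - goalPace) * pixelsPerSpm;
  if (x < 0)           x = 0;           // clamp to the screen edges
  if (x > screenWidth) x = screenWidth;
  return x;
}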

 

The project in use:

At the final presentation:

Here's my final code; you have to open it in Processing:

BlobsforanimalsforShow

Proposal

Title: InfinEight

Summary: An interactive roll-up phone case that can be used to mix and mash up music.

Descriptive paragraph

The goal of this project is to design a fabric-based user interface that controls media on a phone. A phone has an incredibly fast processor and access to millions of songs, but the screen is small and not usable for actions that incorporate a person's whole body. I will design an interactive electrical sensor system that uses boolean capacitive sensing to receive input. The interactions will be sent to my phone over a Bluetooth chip to control a Unity application.

YOUR PLAN


  1. Write a program for the Bluno Nano to send boolean capacitive values to my phone (a sketch of this step follows the list).
  2. Finalize interface design, and screen print conductive ink
  3. Unity visual of capacitive buttons being touched
  4. Use Unity to control music playback.
  5. If Unity does not work, I will use p5.js to write a program.
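
For step 1, here is a sketch of what the Bluno program could look like. It assumes two pads read through Paul Badger's CapacitiveSensor library, and it relies on the Bluno bridging its hardware serial over BLE (115200 baud by default); the pins and threshold are placeholders to be tuned on the printed ink:

#include <CapacitiveSensor.h>

CapacitiveSensor pad1(4, 2);        // 10M resistor between send pin 4 and sense pin 2
CapacitiveSensor pad2(4, 3);        // second pad shares the send pin
const long TOUCH_THRESHOLD = 200;   // tune after watching raw values

void setup() {
  Serial.begin(115200);             // the Bluno forwards this serial stream over BLE
}

void loop() {
  bool t1 = pad1.capacitiveSensor(30) > TOUCH_THRESHOLD;
  bool t2 = pad2.capacitiveSensor(30) > TOUCH_THRESHOLD;
  Serial.print(t1 ? '1' : '0');     // one boolean character per pad, e.g. "10"
  Serial.println(t2 ? '1' : '0');
  delay(50);
}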

Materials Needed

Bluno Nano is the main additional component. I will additionally be screen printing the design.

Deliverables for Show

I will need the product, my computer, and my phone. Additional speakers would be nice, but are not necessary.

MEDIA for Show

Poster describing interaction, so people can play with the design!

Proposal – Contextual Living Room Table

Title:

Contextual Living Room Table

Summary:

Generating interactive 3D content on a living room table using AR and a haptic puck.

Description:

Living room tables have rich stories behind them – conversations with family, studying for homework, eating dinner, kids playing with LEGO, and so on. What if these tables could provide us with interactive and engaging content such as weather forecasts and entertainment for kids? This project allows the user to engage with AR information and artifacts that are coupled to a physical desk through a haptic interface consisting of an array of small vibrators and a heat pad. The goal is to design text-less, immersive AR interaction techniques that enrich the everyday living room table experience.

Process:

  1. Test BLE on Unity and iOS
  2. Test IR tracking using a webcam (or depth camera)
  3. Battery check (drive 4 vibrators and BLE board; see the motor sketch after this list)
  4. Get all tracking, wireless, and actuation working
  5. AR contents programming
  6. 3D print a package for haptics
  7. Prepare a table and a webcam installation
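
For the battery check in step 3, something like this could exercise the four vibrators; the PWM pins, the duty cycle, and a transistor per motor are assumptions about the puck's wiring:

const int MOTOR_PINS[4] = {3, 5, 6, 9};   // PWM-capable pins on an Uno-class board

void setup() {
  for (int i = 0; i < 4; i++) pinMode(MOTOR_PINS[i], OUTPUT);
}

void loop() {
  // Sweep a pulse across the array so the test exercises every motor.
  for (int i = 0; i < 4; i++) {
    analogWrite(MOTOR_PINS[i], 180);      // roughly 70% duty
    delay(250);
    analogWrite(MOTOR_PINS[i], 0);
  }
  delay(500);
}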

Important tools:

  1. BLE board
  2. heating pad
  3. IR emitter
  4. LiPo battery (x5 for backups)
  5. webcam
  6. webcam setup tools

Deliverables for show:

Space for a table, PC, and a webcam installed on top of table

Media:

Demo video

An interactive music box

A wireless music player with physical interaction to minimize distraction.

DESCRIPTIVE PARAGRAPH

Why a music box?

I like to put music on while I'm working; however, in this digital age, the “recommendations” and switching between screens can be quite a distraction. A music box with physical interaction should be able to minimize that as much as possible while still keeping the interaction entertaining.

How it works:

The music box comes with figurines that carry RFID tags. When different figurines are put on top of the music box, different sets of music play. Ideally, if it can be achieved, the user will be able to load their own music album and make their own figurine.
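
A sketch of the figurine-reading step, assuming an MFRC522-style RFID reader; the pins and UIDs are placeholders, and the p5.js interface would pick up the album index from the serial port:

#include <SPI.h>
#include <MFRC522.h>

MFRC522 reader(10, 9);           // assumed SS/RST pins

const byte FIGURINES[2][4] = {   // placeholder tag UIDs, one per figurine
  {0x04, 0xA1, 0xB2, 0xC3},      // figurine 0 -> album 0
  {0x04, 0xD4, 0xE5, 0xF6},      // figurine 1 -> album 1
};

void setup() {
  Serial.begin(9600);
  SPI.begin();
  reader.PCD_Init();
}

void loop() {
  if (!reader.PICC_IsNewCardPresent() || !reader.PICC_ReadCardSerial()) return;
  for (int i = 0; i < 2; i++) {
    if (memcmp(reader.uid.uidByte, FIGURINES[i], 4) == 0) {
      Serial.println(i);         // tell the interface which album to play
    }
  }
  reader.PICC_HaltA();
}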

 

YOUR PLAN

The final project builds on the previous assignment, where I used a touch sensor and an Arduino Uno with a simple p5.js interface. Here is the basic outline of the project:

  1. Incorporate/test RFID onto the project
  2. Connect speaker to the box instead of using laptop
  3. Refine the music library
  4. Refine the p5js interface
  5. Add the feature where users are able to add their own input
  6. Switch Arduino Uno to a Particle Photon
  7. Add lipo battery and battery shield/battery baby sitter
  8. Make a shell with laser-cut plywood

MATERIALS NEEDED

  1. Arduino Uno/Particle Photon
  2. RFID Reader chip and RFID tags
  3. Wifi base (I might be able to just use my laptop)
  4. Plywood for a laser-cut bounding box
  5. Music Pieces (20s-30s per sample)
  6. Lipo battery
  7. Battery shield/baby sitter
  8. Some LEDs for indication
  9. Some buttons for play buttons etc.
  10. A cheap speaker should work

DELIVERABLES FOR SHOW

On the day of the show, I would need an outlet for charging the battery and a wifi base, which could potentially be my laptop. The speaker won't be too loud, so it should be okay to be somewhat close to other projects.

MEDIA FOR SHOW

I would probably put up a slideshow on my laptop or some printed handouts/flyers.

Proposal

Anxious Animation
Do you ever find yourself tapping your fingers on your arm or legs out of boredom, anxiety, or anger? This project aims to convert that anxious behavior, tapping your fingers, into a more tangible output. While it may not reduce the stresses that cause us to fidget, perhaps a slightly more positive outlook could be achieved. This project doesn't exist to fix a problem, but rather to reduce or reframe it. It would be considered a success if people were able to view their anxious habit as rewarding, via sound or animation, rather than as a source of negative outward effects. The form may depend on how the user taps: perhaps they tap, but maybe they swirl their finger in circles? The final animation and sounds will be influenced by meditative imagery or sound, symmetrical and calming.
Plan:

create a wearable patch that can live on clothes and upload to a computer
• flora, lilypad, or particle
• capacitive touch board
• EEPROM or some type of data storage for taps (see the sketch after this outline)
• fabric
• battery

p5js or processing sketch to output the data collected
animation
• controls that influence color
• shape
• size?
sound
• tones to match taps
• controls for pitch, tone, etc.

manufacturing
• people need to be able to tap
• plug into computer
• select sound or animation
• “clear data”?
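
Here is a sketch of the tap-logging core referenced in the outline above: a capacitive pad increments a counter that survives power-off in EEPROM. The pins, threshold, and single stored counter are all assumptions about the final patch:

#include <CapacitiveSensor.h>
#include <EEPROM.h>

CapacitiveSensor pad(4, 2);          // 10M resistor between pins 4 and 2
const long TOUCH_THRESHOLD = 200;    // tune per fabric/thread
const int  COUNT_ADDR = 0;           // EEPROM address of the tap counter

unsigned int taps = 0;
bool wasTouched = false;

void setup() {
  Serial.begin(9600);
  EEPROM.get(COUNT_ADDR, taps);      // restore the count from the last session
                                     // (a fresh board needs this reset to 0 once)
}

void loop() {
  bool touched = pad.capacitiveSensor(30) > TOUCH_THRESHOLD;
  if (touched && !wasTouched) {      // count rising edges only
    taps++;
    EEPROM.put(COUNT_ADDR, taps);    // persist; mind EEPROM write-cycle limits
    Serial.println(taps);            // the p5.js/Processing sketch reads this
  }
  wasTouched = touched;
  delay(20);
}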

Materials:

• Flora, lily pad, particle
• capacitive touch capabilities
• eeprom or data storage of some kind
• fabric
• batteries
• conductive fabric or thread

Time:

• May need to order some things such as extra storage and conductive fabric/ink
• Leave time to program with data in p5 or processing

Deliverables:

• Need a desk for laptop and potentially patch
• Perhaps I have a jacket or lap pad that people can actually wear to tap
• Power

Media for Show:
I may want an additional flyer or poster

Monster Breath
Anxiety can consume me, making it difficult to breathe. While I have meditation and breathing exercises, I often feel I do them ‘wrong’ or am too stressed to even remember them. This project is a desktop friend that lives on your workspace, detects when you need to calm down, and starts to breathe. You can see the monster's breath rising and falling, prompting you to follow along with the subtly breathing creature. You can set, perhaps based on heart rate, how long the breathing lasts. Perhaps the silly form of a friendly monster will help reduce anxiety in addition to the meditative breathing. Research may include different types of breathing techniques and may influence the final form: for example, 5 seconds in, hold, 7 seconds out, or breath counting. Lights may be involved to help guide the user through the exercise. And depending on the breathing exercise, maybe the monster has one eye, or three; form decisions like that will come after more research.

Plan:

create a monster of sorts, friendly of course
• find fabrics that allow for movement and fur like look
• figure out heart rate detection? perhaps through Fitbit or other existing wearables?
• catch attention of stressed user (light? sound? haptic buzz?)
• test, bit by bit each step of the interaction
• detection of HR
• breathing
• turning the muscle wire on and off (see the breathing sketch after this outline)
• completion phase
• maybe monster smiles
• eyes that light based on touch?
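
A sketch of the breathing cycle mentioned in the plan, driving the muscle wire through a MOSFET on one pin; the 2-second hold and the simple on/off drive (rather than proportional heating) are assumptions:

const int WIRE_PIN = 9;                // gate of a MOSFET switching the muscle wire

const unsigned long INHALE_MS = 5000;  // wire contracts: the chest rises
const unsigned long HOLD_MS   = 2000;  // assumed hold between breaths
const unsigned long EXHALE_MS = 7000;  // wire cools and relaxes: the chest falls

void setup() {
  pinMode(WIRE_PIN, OUTPUT);
}

void loop() {
  digitalWrite(WIRE_PIN, HIGH);        // inhale: heat the wire
  delay(INHALE_MS);
  delay(HOLD_MS);                      // hold while contracted
  digitalWrite(WIRE_PIN, LOW);         // exhale: let the wire cool
  delay(EXHALE_MS);
}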

Manufacturing:
• design monster
• figure out power relay

 

Materials:

• Arduino or particle or two
• HR api, sensors
• fabric
• muscle wire
• power relay
• lights

Time:

• May need to order some things
• muscle wire, HR sensor?, power relay
• time to..
• build/construct monster
• set muscle wires and insulate
• solder

Deliverables:
  • Need a desk for laptop
  • depending on monster’s form, somewhere to put him
Media for Show:
  • I may want an additional flyer or poster

Proposal: Pace Yourself

Title: Pace Yourself

Summary: A projected animal companion assists you on your runs and logs your pace.

What is it

This projection is meant to be an entertaining way to help runners keep their pace. As an ex-cross-country runner, I often find myself trying to run at a pace I can't maintain anymore. Last year, I got shin splints halfway through training for a half marathon because I was pushing myself too hard. In cases like mine, or for runners who are training alone without others to help set the pace, this projection helps the runner set goals and gives them a visual pace-keeper.

This allows runners to increase their pace in a constructive manner by analyzing their steps-per-minute count for every minute of their run and comparing it to their previous runs and their long-term target pace. The mechanism can then suggest a target pace for their next run. The visual projection of a running animal lets the runner keep their short-term target pace in a simple way: by keeping up with their animal friend.

PLAN

  1. draw a couple of running animal animations, for different paces
  2. use geolocation tracking to determine pace (on run)
  3. code pace graph viewer (3-line graph) (off run)
  4. code short-term pace goal calculator (off run; see the sketch after this list)
  5. animal selector based on short-term goal pace (on run)
  6. sync animal run pace and short-term goal pace (on run)
  7. code animal noise for excessive pace deviation (on run)
  8. combine on-run and off-run modes — based on device motion
  9. adjust for phone use and display
  10. find a portable projector
  11. animal projection
  12. create armband including projector and phone holder
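
For step 4, the goal calculator could be as simple as the function below, in plain C++. The recent-run average, the "close a quarter of the remaining gap" rule, and the 5 spm cap are placeholder rules for illustration, not a final design:

// Suggest the next run's short-term goal pace (steps per minute)
// from recent runs and the long-term target pace.
float nextGoalPace(const float recentSpm[], int nRuns, float longTermSpm) {
  if (nRuns == 0) return longTermSpm;
  float avg = 0;
  for (int i = 0; i < nRuns; i++) avg += recentSpm[i];
  avg /= nRuns;

  float step = 0.25f * (longTermSpm - avg);   // close a quarter of the gap
  if (step >  5.0f) step =  5.0f;             // never suggest a big jump
  if (step < -5.0f) step = -5.0f;
  return avg + step;
}

// Example: runs at 160, 162, and 161 spm with a 172 spm long-term target
// suggest a goal of about 163.75 spm for the next run.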

MATERIALS NEEDED

This project is meant to be lightweight and usable for high-intensity physical activity. I would need a portable projector and materials to create an armband that could carry the projector and a phone while still allowing the animal image to be projected and the phone screen to be visible (fabric, needle, thread).

DELIVERABLES FOR SHOW

What I need: Place for laptop, dark hallway or area to walk around in (so people can see the projection)

MEDIA FOR SHOW

Poster or storyboard to explain functionality of the equipment and/or wifi connection instructions so that people can work the basic non-projection code on their phones.