Back-up Alarm by Team Fornax: Final Documentation

For this project, we worked in teams of three to design a device that would be useful for a person living with a disability. Each team worked alongside a client from Community Living and Support Services (CLASS) to create something that would be relevant and useful in their lives. Our client was Jeff Owens, an individual with a mobility disability. Over the course of this project we conducted an interview with Jeff and incorporated his feedback into the final product. To read more about our interview with Jeff, click here.

What We Built

Our end product for Jeff was a device that he could strap onto the bottom of his wheelchair using Velcro strips. The device helps ensure that he does not back up into the objects behind him: when Jeff gets within range of an object behind him, the device beeps until he is a reasonable distance away from it. In addition, the device includes lights that flash on and off in time with the beeps. Jeff has total control of this device: he can turn off the device entirely or just the beeping. Similarly, Jeff can adjust the range at which the device starts beeping.

This is the main photo of the device with all its parts. All the Velcro is unstrapped in this photo, and the loose LED strip in the middle is connected to the control panel.

This is a close-up of the control panel. It contains one switch that turns off the sound (labeled "sound" underneath) and another that turns off both the lights and the sound (labeled "lights" underneath). We arranged the switches this way because we wanted Jeff to be able to stop the lights and sounds at the control panel, where he can reach. The blue knob is a potentiometer for adjusting the distance the sensor scans for.

This is one of our ultrasonic sensors, which is supposed to be strapped to the lower rear of the wheelchair along with its counterpart. As you can see, you mount it by strapping the Velcro tightly around the wheelchair rods.

This picture shows what the back of everything looks like, and it is meant to show how you would strap the Velcro if it were on the wheelchair.

This is the device on Jeff's wheelchair at the critique. We didn't get much time with it, but we found it mostly worked, except for attaching the underneath box holding the Arduino. We must have measured incorrectly on prototype day, because the Velcro straps were not quite long enough. Otherwise it was largely successful!

 

Narrative Sketch

After recording another album review for his YouTube channel, Jeff decides to take a quick nap in his wheelchair. Unbeknownst to Jeff, as he begins to drift off, his two-year-old nephew takes out all his toys and begins to play with them all around Jeff. Very quickly Jeff's nephew gets bored of playing inside and goes outdoors, leaving his toys on the ground around Jeff.

After Jeff wakes up from his nap, he realizes that he is surrounded by his nephew's toys. Jeff turns on the back-up alarm, waits, and does not hear a beep. He now knows there is a safe path behind him that he can take out of the living room. Jeff has saved himself from getting hurt by accidentally tipping over, and has spared his nephew's feelings by not accidentally breaking one of his toys.

How We Got Here

Prototype

The prototype we created was designed to answer the question: How can we help Jeff move safely from Point A to Point B?

Our prototype was on the simpler side, as we wanted to gauge Jeff's opinion on our approach. The prototype took the shape of a rectangular cardboard box with optical proximity sensors poking out of the front side. In addition, we mocked up a control panel on paper for Jeff to look at. When we showed our prototype, we focused on explaining how the device would respond to certain actions.

This is us testing which sensors to use for our project. We ended up going with the ultrasonic sensor because it detected over a larger range, which we suspected would help minimize problems later. We brought this setup to the prototype critique, but the design and Wizard-of-Oz prototypes ended up helping us communicate better with Jeff.

This was the underneath-box prototype. We talked about how the sensors might end up on the sides of this box and be mounted on the lower half of the wheelchair; however, after looking at Jeff's wheelchair, we decided against it.

These are the paper control panel mockups. We had Jeff put his finger on the imaginary buttons to see which size was big enough. We also held each one next to his armrest and asked which was most legible. After these two tests we found the biggest one was the only viable option.

Wizard-of-Oz testing of the LED strip

This is what the serial monitor looked like when we were testing the distance sensors before the prototype session.

Here you can see us ideating after Jeff's interview, trying to land on an idea for the prototype critique. We struggled with this because Jeff didn't have any complaints about his daily life. So we fleshed out and expanded all three of our ideas and talked to Zach about them. We ended up telling Jeff the problem space we picked, but let him decide whether he liked this route or not.

Other than a solid direction for our project, the most helpful information we got from the prototype session was a lot of measurements. This is the main page of notes that documented our measurements in a picture and in writing. We ended up referencing this image a lot during the process.

In our minds, our device would mount underneath Jeff's wheelchair and help detect objects in front of him. From the Prototype Critique, we gained valuable feedback that patched up holes from our initial interview with Jeff. Moreover, the Prototype Critique redefined the purpose of our device. Instead of detecting objects in front of Jeff, which he can already see, we learned that it would be much more helpful if the device could detect objects behind Jeff as he backs up. This led us to separate the distance sensors from the main device enclosure. We decided to move these sensors to the two metal rods already extending from the back of Jeff's wheelchair, because those rods were already at a useful angle.

At the end of the Prototype Critique, we decided to acknowledge and include all the feedback we received from Jeff and the other CLASS clients. As people who do not have much experience with wheelchairs, we believed it was in our best interest to consider all the information we were given, as our client knows himself best. Furthermore, we were making this device for Jeff, so his opinion helped us finalize our ideas for the final iteration. The main problem we encountered while prototyping was that getting an accurate distance reading was difficult, because of the sensor's angle and its tendency to move around, which is why we Wizard-of-Oz'd the prototype instead.

Process

This is the original code flow chart. This was used to help organize the code initially and figure out how we wanted the information to flow through the device.

This was all our wiring before soldering. It was definitely a lot to tackle and difficult to read when we began soldering.

Results from calibration. This is what we documented when testing the potentiometer; we needed to figure out how high the calibration range should go.

This was the first prototype of the control panel. We decided the Velcro should only strap underneath, because this arrangement was extremely unstable and wouldn't stay up. We also decided against a second potentiometer, and enlarged the hole into a square so the potentiometer was easier to pinch.

This was us testing the ultrasonic sensor holder. We found it a little wide, and we decided we wanted a hole to embed the sensor in, instead of having it sit on top of the wood.

This was us realizing two things: we had made a mistake wiring the switches, and the breadboard did not fit neatly in the control panel. This led to a lot of re-soldering.

Through this project, we ran into several roadblocks. First, after we had decided on the project, we struggled to decide exactly how we wanted to handle the user feedback; we had to make some executive decisions to use lights and a buzzer and forgo vibration. We then had some issues getting our buttons wired, but we figured out how to make them work once we discovered we had accidentally soldered to the wrong holes in the protoboard. Afterwards, we ran into issues with exactly how to mount the ultrasonic distance sensors on the wheelchair: we realized the sensors could easily be bumped, throwing off the distances. We then had to add a potentiometer, even though we already had a working setup and working code. Finally, we ran into several soldering issues, with the solder joints on the LEDs easily coming loose. This caused several problems, but at the end of the day, we ended up with a working product.

A lot of these issues were found the night before, because we diverged a bit from our projected Gantt chart. We spent the class period before the project's due date still working, since we had no sense of urgency, and put in little time outside of class. Instead of soldering and assembling the product, which is where we ran into the majority of our problems, we kept trying to refine our code and hardware. If we had dedicated that class period to assembly, we would have had more time to debug and more time to spend on details.

Planned Gantt Chart

Conclusions and Lessons Learned

Our group had a great time with this project, and we were happy with the product we presented on the final critique day. That said, there is always room for improvement, and we received a lot of helpful feedback that, had we implemented it, could have improved our project.

For example, we received multiple critiques on our non-shrink-wrapped wires, such as "exposed wiring needs shrink wrap" and "The wires […] would benefit from more protective covering as well". We thought this was a good critique, as it was one way our project could have been elevated to the next level. During the final build stage, we considered adding shrink wrap, but we ultimately ran out of time. Another critique was that a stable mounting system would have helped a lot, and I am very inclined to agree. The problem was that we didn't have access to Jeff's wheelchair, and a more complicated mounting system would have taken a lot of time and been very difficult to create within the scope of our class. We ended up using Velcro for mounting, since it was the simplest option, but I do agree that with a more stable mounting system the product would have been a lot more effective.

There was also feedback complimenting how our device fit the wheelchair and considered Jeff's particular situation: "[T]he group did a nice job figuring out how to make the device fit on his chair best". The idea of personalization vs. generalization was also discussed a lot in person. We tried to make it very clear that the goal of this project was not to make a manufacturable product, but to make a project for Jeff himself. That said, I think some of our guests were excited about the possibilities our product would have if we generalized it a bit; they appreciated how many people it might be able to help, which is a critique that is easy to accept. Another positive critique was that our user-feedback options were good choices. We got multiple compliments on the visibility and clarity of the lights, and one guest commented that, for the visually impaired, the buzzer for sound feedback was also a good option. We appreciated this because it took us a while to decide what the most effective method of user feedback might be.

Working with a client with a disability was a good experience for us. We had to be careful about how we worded interview questions, and it was a bit hard to communicate with Jeff, but we managed to find aspects of his life that we could improve and build our project on. In order to do that, we had to dive fairly deeply into what a day in his life looked like. I don’t think there was anything we would have done differently. We tried to be as open as we could when it came to communication with our client, and I think we did a fairly good job, even though our client was not as responsive as we might have hoped. 

I think all of our group members had a good time with this project. The diversity of backgrounds and skills that we brought into this project helped it run smoothly. We learned how to make things, not just for ourselves or for this class, but for other people. Something that really stuck with me was the impact that our work had on so many people. There were multiple clients talking about how important and life-changing the work we were doing was. Though we’ve only been through one semester of this class, we were able to see the applications of our knowledge in a way that was very fulfilling and meaningful.

Technical Details

Electronic Schematic and Block Diagram

Electronic Schematic

 

Block Diagram

Code

/**
 * @title Back-up Alarm
 * @brief A useful device designed for Jeff
 * 
 * 60-223: Introduction to Physical Computing
 *
 * The following code initializes two ultrasonic distance sensors that serve
 * as the eyes on the back of Jeff's wheelchair. If Jeff backs within
 * a certain distance of an object, the LED strip lights up and
 * the buzzer buzzes. The code also gives Jeff the freedom to
 * adjust the upper-bound distance using a potentiometer.
 *
 * @authors Ethan Lu <ethanl2@andrew.cmu.edu>
 *          Frances Adiwijaya <fda@andrew.cmu.edu>
 *          Gia Marino <gnmarino@andrew.cmu.edu>
 *
 * @mapping
 *  Arduino Pin |   Role   |   Description   
 *  ------------|----------|-----------------
 *      A0         INPUT    Potentiometer
 *      3          INPUT    Buzzer Control Button
 *      4          INPUT    Device Control Button
 *      5          OUTPUT   Buzzer
 *      9          INPUT    ECHO Pin for Right Sensor
 *      10         OUTPUT   TRIGGER Pin for Right Sensor
 *      11         INPUT    ECHO Pin for Left Sensor
 *      12         OUTPUT   TRIGGER Pin for Left Sensor
 *      13         OUTPUT   LED Strip
 */

/** @brief Import libraries */
#include <NewPing.h>
#include <PololuLedStrip.h>
#include <assert.h>

/** @brief Declare constants */
#define POTENTIOMETER_PIN         A0

#define BUZZER_CONTROL_PIN        3
#define CONTROL_BUTTON_PIN        4
#define BUZZER_PIN                5
#define RIGHT_ECHO_PIN            9
#define RIGHT_TRIGGER_PIN         10
#define LEFT_ECHO_PIN             11
#define LEFT_TRIGGER_PIN          12

#define LED_COUNT     60
#define MAX_DISTANCE 200
#define MAX_BUZZ       8

/** @brief Debugging macros */
#define requires(expr) assert(expr)
#define ensures(expr)  assert(expr)

PololuLedStrip<13> led_strip;
NewPing sonar_left(LEFT_TRIGGER_PIN, LEFT_ECHO_PIN, MAX_DISTANCE);
NewPing sonar_right(RIGHT_TRIGGER_PIN, RIGHT_ECHO_PIN, MAX_DISTANCE);
rgb_color colors[LED_COUNT];

unsigned lowerbound = 10;
unsigned upperbound =  0;

void fill(uint8_t r, uint8_t g, uint8_t b);

/**
 * @brief Declare pin modes
 */
void setup() {
  pinMode(RIGHT_ECHO_PIN, INPUT);
  pinMode(RIGHT_TRIGGER_PIN, OUTPUT); // trigger pins drive the sensors, so they are outputs
  pinMode(LEFT_ECHO_PIN, INPUT);      // (NewPing also manages these pin modes internally)
  pinMode(LEFT_TRIGGER_PIN, OUTPUT);

  pinMode(CONTROL_BUTTON_PIN, INPUT);
  pinMode(BUZZER_CONTROL_PIN, INPUT);
  pinMode(POTENTIOMETER_PIN, INPUT);

  pinMode(BUZZER_PIN, OUTPUT);
}

/**
 * @brief Main routine
 */
void loop() {
  delay(100);
  upperbound = map(analogRead(POTENTIOMETER_PIN), 0, 1023, 60, 150);

  cleanup();
  if (digitalRead(CONTROL_BUTTON_PIN) == HIGH) {
    unsigned int left_distance = (sonar_left.ping() / US_ROUNDTRIP_CM);
    unsigned int right_distance = (sonar_right.ping() / US_ROUNDTRIP_CM);

    /** Too close to an object: stay in this loop (lights on, buzzing) until
        both sensors read clear. Note a ping timeout returns 0, which the
        lowerbound check excludes from triggering the alarm. */
    while ((lowerbound < left_distance && left_distance < upperbound) || (lowerbound < right_distance && right_distance < upperbound)) {
      fill(255, 0, 0);
      if (digitalRead(BUZZER_CONTROL_PIN) == HIGH) {
        // beep more when closer (caveat: if one sensor times out and reads 0,
        // min() passes 0 and the buzzer sounds the maximum count)
        buzz(min(left_distance, right_distance));
      }
      }
      left_distance  = (sonar_left.ping() / US_ROUNDTRIP_CM);
      right_distance = (sonar_right.ping() / US_ROUNDTRIP_CM);
    }
  }
}

/**
 * @brief     Assigns a new rgb value to every element in the color array
 * @param[in] r Amount of red
 * @param[in] g Amount of green
 * @param[in] b Amount of blue
 */
void fill(uint8_t r, uint8_t g, uint8_t b) {
  for (uint16_t i = 0; i < LED_COUNT; i++) {
     colors[i] = rgb_color(r, g, b);
  }
  led_strip.write(colors, LED_COUNT);
}

/**
 * @brief     Activate the buzzer
 * @param[in] distance
 * @pre       `distance` is non-negative
 * @pre       `distance` is less than `MAX_DISTANCE`
 */
void buzz(unsigned long distance) {
  requires(distance < MAX_DISTANCE);
  int x = MAX_BUZZ - int_log2(distance);

  warn(x);
}

/**
 * @brief     Calculate the integer (floor) log base 2 of an integer
 * @param[in] x
 * @return    floor(log2(`x`)); returns 1 for `x` == 0 as a safe default
 * @pre       `x` is non-negative
 */
int int_log2(int x) {
  requires(-1 < x);

  int c = 0;
  if (x == 0) return 1; 
  while ((x >>= 1)) { c++; }
  return c;
}

/**
 * @brief     Run the buzzer a number of times
 * @param[in] x The number of times
 * @pre       `x` is non-negative
 * @pre       `x` is less than or equal to 7
 */
void warn(int x) {
  requires(-1 < x);

  for (int i = 0; i < x; i++) {
    digitalWrite(BUZZER_PIN, HIGH);
    delay(50);
    digitalWrite(BUZZER_PIN, LOW);
    delay(50);
  }
  delay(500);
}

/**
 * @brief Turns the buzzer and led strip off
 */
void cleanup() {
  digitalWrite(BUZZER_PIN, LOW);
  fill(0, 0, 0);
}

Design File

Rhino file that was used to laser cut all the pieces of this device.

Foot Controlled MIDI Instrument by Team Andromeda: Final Documentation

For our final project, our group was given the opportunity to work with a client with a disability from CLASS (Community Living And Support Services) to develop a personalized assistive device that would improve their life in some respect. Over the course of seven weeks, we brainstormed, designed, prototyped, and assembled a device tailored particularly to the wants and needs of our client, Teri Owens. Specifically, drawing from her love of music, which we discovered during our interview, we wanted to add to the existing list of assistive musical instruments by creating a MIDI controller driven by foot movements that produces sound electronically. For more details and our interview documentation with Teri, please visit this page.

What We Built

Our product is essentially a MIDI (Musical Instrument Digital Interface) controller operated by different foot movements, purposely designed to be placed on the foot plate of our client's wheelchair. In our case, there is a roller, which is used to change the pitch of a note or the percussion instrument, and two foot pedals: one pedal plays a note while the other plays a percussion sound. The device connects to a computer via USB, where the user can open a free web-based MIDI synthesizer and select different tracks to change the sound that comes out of the speakers.

 

Front

Side

Back

Angle

Dollar as Size Reference

Testing Our Final Design

Narrative Sketch

It is finally Friday morning, and Teri opens her eyes with excitement as she realizes that today is her Spring Concert. To avoid waking the rest of the house, she leaves the MIDI foot controller disconnected from her tablet and practices all the movements, including tapping and rolling, from memory, each one with its own particular level of precision following a specific beat. Moving her whole body to the rhythm, she smiles, knowing she has perfected her part of the percussion and notes, just as she has practiced a million times before.

Before she knows it, she is on stage in front of an audience where she can spot familiar faces. With a million thoughts running through her mind, all about her part of the concert, she plugs the USB cord into her tablet to start the MIDI controller and selects the track: House 05. When she hears the other percussion instruments start, followed by a five-second pause, she knows it's her turn to begin playing alongside the other percussionist. By using her feet to move the roller away from her, she changes to a higher pitch. Focusing, she steps on the right pedal to play the drums to the perfect beat, and steps on the left pedal to play the notes exactly from memory. After all her steps and rolls, she waits patiently for her next part to come up: a solo.

How We Got Here

Prototype

This prototype was designed to help answer the design question: how can we create another assistive musical instrument for Teri to play? Our prototype was a cardboard structure fitted to our client's foot plate, with five different components: a foot pedal, a distance sensor, a pressure sensor, and two differently shaped rollers of varied height and width. The cardboard structure was held up by six metal beams for stability.

Final Prototype

Electronics/Interior of Prototype

Prototype Fitted onto Teri’s Chair

Testing the Prototype

From our prototyping process, we had to keep in mind that this was simply that: a prototype. It was difficult to shift our thinking away from creating a finished product on our first attempt, as we were motivated to see it come together quickly and to present it to our client during the prototype critique. However, as we worked, we realized the importance of the formative critique: its purpose was to simulate the future product so we could test it and receive additional feedback from our client before completing the final version.

As a result, we were able to experiment with different types of foot-controlled movement, including a foot pedal, a distance sensor, a pressure sensor, and two differently structured rollers, in the interest of figuring out what our client would prefer on the final product. Although the components were not yet strategically placed on the prototype, the purpose was to have enough space for our client to test each one and see what would work best for her and what she was capable of doing. In addition, we came up with a base for our product that would lie exactly on her foot plate, measured as precisely as we could, and tried out different ways to make it structurally sound.

From the critique, we were able to listen to our client and incorporate her suggestions and preferences into our final design. Specifically, we determined which foot movement options Teri preferred and which worked best for her. We decided on two foot pedals and one roller (thinner in width and greater in height). We could visually see how much space we needed for each one, optimize the limited space on her foot plate, and determine the placement of these three items based on which side Teri had more control of and was more comfortable with. We decided on having the foot pedals on the rightmost and leftmost sides, with the roller in the middle between them. Lastly, we tested the stability of our structure and settled on using more metal screw rods in the final product, since the metal beams were a bit unstable. We achieved all the goals we set for the prototype critique and answered all the questions that arose during the prototyping process, as well as making clarifications and affirming our process with our client. We encountered no surprises, nor was there any feedback that we did not implement.

Organizing our Ideas Before Work Mode Was Engaged

Roller from Prototype Connected to Potentiometer (Eventually Switched to Rotary Encoder)

Final Prototype Assembly

Process

From our formative critique, where we got Teri's feedback on the prototype as it was, we realized we would need to ditch the pressure and distance sensors in favor of a simpler set of mechanisms: one large roller, which would allow her to comfortably move it back and forth with her foot, and two footpads. In an effort to make the design sturdier than our cardboard prototype (which did not hold up), we created a fully fleshed-out version of our design in Fusion 360 to ensure that it came out of the laser cutter just as we imagined. And though we made some last-minute changes when putting the design together, by changing the material and testing it, we knew it was going to be a lot sturdier than it was when Teri last tried it.

However, our design for the roller went through a ton of changes, illustrated by our seemingly never-ending whiteboard diagrams. To hold it up, we created tabbed pieces, and made sure everything fit with the rotary encoder. Once it was all put together inside the box, which was protected on all sides by the interior box we made, we continued to perfect our code and work on the top of the instrument.

The footpads required some extra design thinking to ensure they were situated comfortably enough for Teri to actually interact with, connected to the Arduino below, and sturdy enough that the movement of her foot would not break the mechanism. To solve this, we created tabbed pieces that would protect the electronics from being crushed, and used foam to ensure the footpads did not come down too harshly. The design of the physical instrument was a back-and-forth process that required us to put ourselves in Teri's place, asking ourselves whether each part would be sturdy enough, which expanded our perspective and helped our design.

Development of the Roller Mechanism (using whiteboard tables)

Fitting the Roller Into Place

Using a Saw to Cut Hinges for Footpads

The most involved part of the process was certainly the code. Thankfully, we found an online source with detailed instructions on how to use the Arduino Pro Micro to send out MIDI signals (linked here), which helped us greatly. However, as we implemented our code on our prototype, a few unexpected errors came up. It turned out that the instructions from the link were somewhat different from what we needed for our instrument. So instead of selecting the Pro Micro board from the SparkFun AVR family, we selected the original Arduino Micro from the Arduino AVR Boards list, and we were able to communicate with and push commands to our board.

However, after a few changes to our code, an unexpected issue occurred: our board completely stopped communicating with our computers, even after resetting. Thus, we made a rushed decision to switch to the Arduino Uno, which posed a new set of challenges. Unlike the Pro Micro, the Uno only has a serial output, which makes it incapable of transmitting MIDI signals natively. Therefore, beyond a few code changes, the user must install a MIDI-to-serial bridge and virtual loopback MIDI port software for it to function properly. The idea of asking our client to download additional software for our product to function was suboptimal at best, so we came back to the Arduino Pro Micro once more. While we were trying to find an alternative solution using the Arduino Uno, our professor managed to reset the Pro Micro we were using, and this time, with more simplified code and careful inspection of all the electronics we had connected, our board finally started to create MIDI signals! With a bit of modification, our board was able to communicate with the online MIDI synthesizer we had picked out for its accessibility to our client, Midi City.

All of the issues that arose with the physical design and code disrupted the original Gantt chart we had developed to keep ourselves on schedule, but ultimately, with some changes to our plan, we found enough time to complete our work before the final critique, where we shared our final MIDI instrument design with Teri.
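For reference, the serial fallback we describe above works by writing raw MIDI bytes out of the UART and letting bridge software on the computer turn them into a virtual MIDI port. The sketch below is only an illustration of that idea, not our final code; the baud rate and the serialNoteOn helper are our own assumptions and must match whatever the bridge expects.

/* Illustrative sketch (not our final code) of the Uno fallback described
 * above: raw MIDI bytes are written to the serial port, and a serial-to-MIDI
 * bridge on the computer converts them into a virtual MIDI port. The baud
 * rate is an assumption; it must match the bridge's setting. */
void setup() {
  Serial.begin(115200);
}

// Send a standard 3-byte MIDI note-on message over serial
void serialNoteOn(byte channel, byte pitch, byte velocity) {
  Serial.write(0x90 | channel); // status byte: note-on for the given channel
  Serial.write(pitch);
  Serial.write(velocity);
}

void loop() {
  serialNoteOn(0, 60, 127); // middle C at full velocity
  delay(500);
  serialNoteOn(0, 60, 0);   // velocity 0 acts as a note-off
  delay(500);
}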

One of the Coding Errors we Encountered

Original Gantt Chart/Work Plan

Final Design Rendering in Fusion360

Conclusions and Lessons Learned

With the final critique behind us, our team is very proud of the progress we made, from first meeting our client with no ideas in mind to delivering a tangible product specifically tailored to her. We presented our final product to our client, Teri, and had her finally try it while also receiving feedback from others. The constructive criticism we took in and reflected upon let us hear fresh perspectives from individuals, perspectives that would make great additions and improvements to our product in the future.

Specifically, one commenter suggested "[versatility in] fastening onto [a] wheel chair, [or an] application for infants," which we believe could be a significant addition to our product in the future. Since this project was tailored to our client specifically, we were focused on making sure the device fit Teri's wheelchair, as opposed to making a universal MIDI foot controller for all users. However, we agree that a more universal design would be very beneficial, as it would expand the device to a greater range of users; it would no longer be constrained to those in wheelchairs. In addition, we could broaden the target audience to children by making it more accessible, slightly changing the format to be easier to use, or making it more customizable. Another comment we received concerning the design and structure of our product was that "bolts are sticking up," which was something we didn't fixate on during this process. If possible, we would have liked to make something flatter, with no hardware protruding from the product, so we definitely agree with this observation.

In addition, some other comments included "consider changing the controls—percussion, tone, etc." and "I think with that the computer display was better integrated with the roller – like if it looked like a mouse was hovering over the pitch/instrument that you haven't played yet". We agree with these comments and could definitely make this assistive instrument more useful with help from someone with a musical background who could suggest a better combination of controls. Furthermore, since we were limited to free web-based MIDI software, a big game-changer for our product in the future would be creating software from scratch that has all these capabilities while being more user-friendly and compatible with our product.

With this feedback, we could see which aspects we excelled in and which parts could be improved to make this product better. Throughout the project, we were excited and grateful to be working with Teri, getting to know her not only as a client but also as a person, and learning more about what she loves to do. Having her be a part of our process was crucial to the development of our product, as we were trying to make something she would love and find useful in the future. From our discussions with Teri, we learned so much from her and about her that we were able to brainstorm products she would truly enjoy. Aside from the project, we connected over hobbies we shared, and in turn this really helped us personalize the project for her.

Overall, we greatly enjoyed this project and would have loved more time allocated to it, to iterate beyond one final product and adjust further to our client's needs and feedback. We connected with Teri, and it was a meaningful experience we wouldn't have had if not for this class. As for steps we would take differently, there isn't much we can say, as this was a memorable experience that exceeded our initial expectations; the only thing we would have liked is more time with Teri and more time to work on the project with her and make it even better. It has definitely motivated us to pay more attention to our products and how they can affect the people we are making them for. We will carry this project with us and keep what we learned in mind when working on future projects.

Technical Details

Schematic and Block Diagram

Code

/* Foot controlled MIDI Instrument
 * 
 * The following code is written to interpret the input that the Arduino Pro Micro takes and turn it into a MIDI signal,
 * which will be sent through the USB port.
 * 
 * Basic function: The system currently consists of two switches and one rotary encoder.
 * When a switch is being actuated, a play signal is sent through the USB port as a MIDI signal
 * When the rotary encoder receives inputs on rotation, the pitch is changed accordingly.
 * 
 * Pin Mapping:
 * Arduino Pin  /  Role   /   Description
 * 2              Input     receive data from the rotary encoder
 * 3              Input     receive data from the rotary encoder
 * 5              Input     left foot pad/switches
 * 7              Input     right foot pad/switches
 * 
 * Ethan Hu, Sharon Li, Francesca Menendez, 2022
 * 
 * Referenced from Gustavo Silveira and Dolce Wang
 */
#include "MIDIUSB.h"  
#include <Encoder.h>

// SWITCHES
const int NSwitches = 2; // total number of switches
const int switchPin[NSwitches] = {5, 7}; //pins for the switches
int switchCState[NSwitches] = {}; // stores the switch current value
int switchPState[NSwitches] = {}; // stores the switch previous value

// MIDI Assignments 
byte midiCh[NSwitches] = {0, 9}; // MIDI channels to use; can be changed as needed (9 is the General MIDI percussion channel)
byte note = 36; // default pitch
                                            
// debounce
unsigned long lastDebounceTime[NSwitches] = {0};  // the last time the output pin was toggled
unsigned long debounceDelay = 100;    //the debounce time in ms

// Rotary Encoder
Encoder knob(2, 3);

void setup() {
  // Initialize buttons with pull up resistors
  for (int i = 0; i < NSwitches; i++) {
    pinMode(switchPin[i], INPUT_PULLUP);
  }
}

void loop() {
  for (int i = 0; i < NSwitches; i++) {
    switchCState[i] = digitalRead(switchPin[i]); 
    // checking debounce to avoid accidental double tap
    if ((millis() - lastDebounceTime[i]) > debounceDelay) {
      if (switchPState[i] != switchCState[i]) {
        lastDebounceTime[i] = millis();
        // changing pitch based on data from the rotary encoder
        note = int(knob.read() / 2) + 36;
        if (switchCState[i] == LOW) {
          // Sends the MIDI note ON
          noteOn(midiCh[i], note, 127);  // channel, note, velocity
          MidiUSB.flush();
        }
        else {
          // Sends the MIDI note OFF by 0 velocity
          noteOn(midiCh[i], note, 0);  // channel, note, velocity
          MidiUSB.flush();

        }
        switchPState[i] = switchCState[i];
      }
    }
  }
}

// Arduino MIDI functions MIDIUSB Library
void noteOn(byte channel, byte pitch, byte velocity) {
  midiEventPacket_t noteOn = {0x09, 0x90 | channel, pitch, velocity};
  MidiUSB.sendMIDI(noteOn);
}

void noteOff(byte channel, byte pitch, byte velocity) {
  midiEventPacket_t noteOff = {0x08, 0x80 | channel, pitch, velocity};
  MidiUSB.sendMIDI(noteOff);
}
Color Sensor by Team Lacerta: Final Documentation

Overview:

For our final project, each group was paired with a client from CLASS (Community Living and Support Services), a “nonprofit organization that offers different services to individuals of varying abilities”. Our client is Bill, who has achromatopsia, a lack of cones in the eyes, so he is legally blind and 80% colorblind. For information on our interview with him, click here.

What We Built:

During our initial interview, we found that Bill has trouble picking out matching outfits in the morning because of his colorblindness. To make this process more convenient for Bill, we proposed the following product: a gadget that senses the red, green, and blue (RGB) values of a fabric or piece of clothing when a button is pressed, and prints out the color name and the values for the user. Note: RGB values are between 0 and 255, and they represent how much of each color is present; for example, pure red would be (255, 0, 0).

Final Overall Image

Final Overall Image of Our Project.

 

Detail Photos

Detail #1: Button for Scanning, A Light Switch, and Power.

 

Detail #2: Laser-cut high-contrast labels so our client can easily identify what each button does.

 

Detail #3: 9V Battery Pack so the device can be used unplugged.

 

Detail #4: RGB Sensor cased in a 3D printed housing. White paper placed on the inside so it reflects the light and gets a better reading.

 

Detail #5: Adafruit E-Ink display, providing customizability and high contrast so our client can easily read it despite his vision issues.

 

Final Working Video

Note: our display does not update until the button is pressed, so it continues to show its previous scan; this is why it initially displays "teal". Furthermore, due to the limitations of the E-Ink display, it does take some time to update.

Narrative Sketch of How it Would Be Used

Bill wants to decide his outfit for the day, but he isn't sure which colors he has in his wardrobe. He places a shirt on a flat surface, puts the device on top of it, and presses the button. The screen updates, saying the shirt is a "dark green". He then grabs a pair of pants and does the same; after scanning the pants, the display reads "light orange". Through his work with spreadsheets, Bill has learned which colors go well together and how to decipher RGB values, so he knows light orange and dark green don't match. He then grabs another pair of pants, and after scanning, the display reads "tan". He thinks to himself "perfect!" and leaves for his day knowing that his outfit goes well together.

How We Got Here – Prototype & Process

Prototype

Our prototype was designed to help answer the design question: what are our boundaries for the colors? For example, at what point is green different from teal and teal different from blue? We also wanted to make sure Bill was happy with the new display that we have (E-ink) because he can’t see the traditional LCD screen colors (blue/white or green/black) very well.

Through our prototype, we wanted to establish that the basic part of the device would function: using the TCS34725 color sensor to read colors. We converted the RGB (red, green, blue) reading from the sensor to HSV (hue, saturation, value) to make it more convenient for us to interpret and to set the color boundaries. For the prototype, we focused on naming the hue (main color) correctly, since the saturation and value just add adjectives to that (light/dark). The boundaries were determined visually using hue charts found online. We ended up with nine main colors: pink, red, orange, yellow, green, teal, blue, purple, magenta.
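As a reference for this conversion step, here is a minimal sketch of a standard RGB-to-HSV conversion (the function name and exact scaling are ours, not pulled from our project code); it maps hue to [0, 360) and saturation/value to [0, 100], the ranges we use throughout this writeup.

// Hypothetical sketch of the RGB-to-HSV conversion step described above.
// Inputs are 0-255 channel values; outputs are hue in [0, 360) and
// saturation/value in [0, 100].
void rgbToHsv(byte r8, byte g8, byte b8, float &h, float &s, float &v) {
  float r = r8 / 255.0, g = g8 / 255.0, b = b8 / 255.0;
  float cmax = max(r, max(g, b));
  float cmin = min(r, min(g, b));
  float delta = cmax - cmin;

  if (delta == 0)     h = 0;                             // gray: hue undefined
  else if (cmax == r) h = 60 * fmod((g - b) / delta + 6, 6);
  else if (cmax == g) h = 60 * ((b - r) / delta + 2);
  else                h = 60 * ((r - g) / delta + 4);

  s = (cmax == 0) ? 0 : (delta / cmax) * 100;  // saturation as a percentage
  v = cmax * 100;                              // value as a percentage
}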

Prototype Images:

This is the final overall image of our prototype.

Zoomed out photo of the prototype being tested with three different colored papers (Green, Pink, Orange).

For our prototype the color sensor was loose, so we had to hold it up to articles of clothing or sheets of colored paper.

Our Prototype Working:

Findings, Feedback, and Surprises from our Prototype

As for the answers to our questions: the prototype was successful in finding the right boundaries for each color, so we moved on to incorporating the saturation and value into the code as well. Bill was happy with the higher contrast of the E-Ink display, but preferred a black background with white text for maximum contrast, the opposite of the prototype.

Successfully converting the background color to black and text color to white on the screen

During our prototype critiques we had only one scan button for the sensor, but after getting feedback we added another switch for the sensor light, so the client can turn the light on or off depending on the situation. Additionally, we were advised to place the sensor inside the box, where it would be less affected by outside light sources, so we 3D printed a case for it. We also added a power switch, because we planned to use batteries to make the device more portable. Bill picked the placement of the switches, display, and sensor. He also picked the color and material of the box.

First draft of the laser-cut box after the prototype meeting

During our prototype critique session, we asked Bill whether he would like an audio feature that reads the color out loud, a list of colors that go well with the detected color, or specific names for the grayscale colors white, gray, and black. However, he wanted the design kept simple, with just the color sensing working. He could also read RGB values, so he could tell the shade of a grayscale color if we simply labeled it "grayscale."

Our biggest surprise was that Bill understands how to interpret RGB, which we think is less intuitive than HSV. He says it's because he has made presentations before and picks colors for them by inputting RGB values directly, so he has figured out how it works. This means we can add RGB to the display to give him more detailed information to work with, since a whole group of HSV inputs maps to the same color description.

A close up photo of the screen displaying both the color name and its RGB values.

Process

Let's start with our small takeaways. We were able to increase the font size and invert the display colors by reading the documentation for Adafruit GFX, since it is shared with the E-Ink display library. Similarly for the sensor sensitivity: once we put the sensor in the box, the environment became darker (even with the light on), so it was reading dark colors for too many things. By increasing the sensitivity and the reading time allotted, we were able to make it more accurate.
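As a rough illustration of those two knobs (the specific gain and integration-time values below are placeholders, not the exact settings we shipped), the Adafruit_TCS34725 library exposes both in its constructor:

// Illustrative sketch of the sensitivity fix described above, using the
// Adafruit_TCS34725 library's gain and integration-time settings. A longer
// integration time gathers more light per reading (slower but brighter values
// inside the dark box); a higher gain amplifies the reading.
#include <Wire.h>
#include "Adafruit_TCS34725.h"

Adafruit_TCS34725 tcs(TCS34725_INTEGRATIONTIME_154MS, TCS34725_GAIN_16X);

void setup() {
  Serial.begin(9600);
  if (!tcs.begin()) {
    Serial.println("TCS34725 not found");
    while (1);
  }
}

void loop() {
  uint16_t r, g, b, c;
  tcs.getRawData(&r, &g, &b, &c); // blocks for the integration time
  Serial.print(r); Serial.print(" ");
  Serial.print(g); Serial.print(" ");
  Serial.println(b);
}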

Software

For any hue, we can vary the saturation and value to get a large range of other colors; specifically, how light or dark the color is, as you can see in the picture below.

Example of HSV with hue = 0, and saturation and value from 0 to 100, which become the x- and y-axes of a coordinate plane.

Thus we have a nice coordinate system (x-axis is saturation, y-axis is value) to which we can apply standard geometry. The first thing we noticed was the grayscale along the left edge and bottom: what we can perceive as actual color roughly fits within this boundary.

Boundary drawn to separate the colors vs grayscale (looks like a quarter circle).

This boundary is a quarter of a circle, with the center at (100,100), and a radius of about 98 (after testing). So our first step is to figure out if a point is within the circle or not. If it is in the circle, then it has a discernible color that we should categorize further. Otherwise, it’s grayscale and we can just display that.

A point (x, y) is within the circle if this inequality holds: (x − 100)² + (y − 100)² ≤ 98²

(This is just the standard circle equation with the center and radius named above.) So we plug in our given saturation (x) and value (y). If the point is within the circle, we need to categorize it further into light/dark. A "medium" category would be redundant (since we usually think of a medium-level color as the standard color), so medium gets no adjective.
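A minimal sketch of that test, assuming saturation and value are already in [0, 100] (the function name is ours):

// Minimal sketch of the grayscale test described above. A (saturation, value)
// point is "colorful" if it falls inside the quarter circle centered at
// (100, 100) with radius 98; otherwise it is grayscale.
bool hasDiscernibleColor(float sat, float val) {
  float x = sat - 100;
  float y = val - 100;
  return (x * x + y * y) <= 98.0 * 98.0; // standard circle equation
}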

Boundaries drawn to give light/medium/dark options (look like pie slices).

The boundary cutoffs for light/dark look like pie slices. Each pie slice takes up some amount of angle space in the circle. Recall that a circle has 360° or 2π radians (equivalent forms of measurement) in it.

Circle in degrees and radians.

So we can tell which “pie slice” a point is in by looking at the angle the point makes. For example, if the point’s angle was 200° , we’d know it would belong to the pie slice on the left, just below the horizontal line (all angles between 180° and 225° belong to that pie slice).

How do we determine the angle? We use the function atan2(y, x), which takes the x and y coordinates and returns the angle in radians, in [-π, π]. Positive angles are measured counterclockwise and negative angles clockwise, so for example 7π/4 is the same angle as -π/4, 3π/2 is the same as -π/2, and π is the same as -π. This is basically the top half of the circle reflected over the x-axis, with a negative sign added.

Note: the x, y passed to atan2 aren't saturation and value directly, but x = sat − 100 and y = val − 100, because atan2 assumes the coordinates are relative to a circle centered at (0, 0); since our circle is centered at (100, 100), we offset them so the math is correct. Finally, we used boundary angles of [-π, -7π/8] for light and [-5π/8, -π/2] for dark, found through testing.

One final caveat for angles: if (x, y) happens to be the center of the circle, it doesn't really have an angle; we know it's a pure color, neither light nor dark, so we don't bother calculating one.
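Putting the angle logic together, a minimal sketch might look like the following (the function name and return strings are ours; the cutoffs are the ones quoted above):

// Minimal sketch of the pie-slice classification described above. atan2()
// needs coordinates relative to the circle's center, so we shift by
// (100, 100) first. Cutoffs: [-pi, -7pi/8] is light, [-5pi/8, -pi/2] is
// dark, and everything else inside the circle is a medium (plain) color.
const char *lightDarkAdjective(float sat, float val) {
  float x = sat - 100;
  float y = val - 100;
  if (x == 0 && y == 0) return "";   // center of the circle: pure color
  float angle = atan2(y, x);         // radians in [-pi, pi]
  if (angle <= -7 * PI / 8) return "light";
  if (angle >= -5 * PI / 8 && angle <= -PI / 2) return "dark";
  return "";                         // medium: no adjective
}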

 

Brown/tan time! We want to include these colors because they are in Bill's wardrobe. However, brown/tan aren't among the main colors, because you can't tell whether something is brown/tan just by looking at the hue. Instead, brown is a specific combination of a hue in the red-through-yellow range with more medium saturation/value ranges. Tan is a small chunk of the brown range.

The pattern is that the browns appear in this "pie crust" section.

Concentric circles with different radii: the in between section is “brown” (looks like pie crust).

We have two concentric circles (same center point) but with different radii; the band between them is the pie crust. We can determine whether a point is in the pie crust by calculating its distance r from the center of the circles: r = √(x² + y²), where again x = sat − 100 and y = val − 100 to adjust for the center of the circle not being at (0, 0). We came up with 60 and 75 as the bounds for the smaller and larger radii, determined visually.

Tan is a specific shade of brown, with HSV = (334, 33, 82), which we found online. But those exact coordinates are unlikely to be measured, and coordinates that are "close enough" can still be visually interpreted as tan. We decided "close enough" in this case is ±10 in each of the HSV parameters. So we use the radius equation to check whether we're within the radius of the tan circle (10), where x = sat − 33 and y = val − 82 to adjust for the circle's center.

Tan location is the dot. Sufficiently close HSV can still be called “tan”, so the circle around it is the region where everything inside is “tan”.
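A minimal sketch of the brown and tan checks, under the same coordinate conventions as above (function names are ours; the hue checks, red-through-yellow for brown and ±10 of the tan hue for tan, are noted in comments but not shown):

// Minimal sketch of the brown/tan checks described above.
// Brown: hue in the red-through-yellow range (check not shown) AND the
// (sat, val) point lands in the "pie crust" band between radii 60 and 75.
bool inPieCrust(float sat, float val) {
  float x = sat - 100;
  float y = val - 100;
  float r = sqrt(x * x + y * y);
  return r >= 60 && r <= 75;
}

// Tan: within distance 10 of the tan point (sat 33, val 82) quoted above
// (the hue must also be within +/-10 of the tan hue; check not shown).
bool isTan(float sat, float val) {
  float x = sat - 33;
  float y = val - 82;
  return sqrt(x * x + y * y) <= 10;
}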

The overall steps:

  1. Check light/dark/grayscale
  2. If not grayscale, check tan/brown, then hue
  3. Print all relevant information

Reducing Memory & Transitioning to Arduino Micro

We chose the Arduino Micro because it would be smaller and more compact for the box. However, that also means less memory for the code: the original code took up about 106% of the total program space on the Micro, even though there had been plenty of space left over on the Arduino Uno. The main culprits were data types and Strings.

For instance, variables were int or double, which can represent a very large range of positive and negative values as well as decimals. However, we know saturation and value are in [0, 100], so we can reduce their data types to byte (which represents [0, 255]) and take up a lot less space than int. The same goes for some of the x, y, and radius variables used earlier.

The other strategy was to shorten strings, or delete Serial.print statements (since those aren't on the screen anyway). For example, the checkSatVal function that names whether a color is light/dark (or medium or grayscale) used to return "grayscale", and the main function checked for that return value to decide whether to print the hue and other information. Since "grayscale" was just being used as a flag, we could reduce the return string to "g" and check for that instead, saving character space. We did a similar thing with "medium": by rearranging the if-statement logic, the function can return an empty string (we weren't planning to print "medium" either way), and the empty string actually makes it easier to concatenate the display output later. With these modifications, we now use 95% of the program space.
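As a small before-and-after illustration of these memory tricks (variable names are ours; checkSatVal is the only name from our actual code):

// Illustrative before/after of the data-type and string changes described
// above. Saturation and value fit in [0, 100], so a byte replaces an int,
// and the "grayscale" flag string shrinks to a one-character flag.

// Before:
// int sat, val;
// String category = "grayscale";
// if (category == "grayscale") { ... }

// After:
byte sat, val;               // [0, 100] fits comfortably in a byte
const char *category = "g";  // one-character flag instead of "grayscale"
bool isGrayscale() { return category[0] == 'g'; }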

Hardware:

For hardware changes, we transitioned the project from breadboarded to 100% soldered. We largely did this with screw-terminal headers, which let us keep moving wires around and troubleshooting issues even with everything soldered.

Fully soldered circuit for our final project

Soldered backboard for the E-Ink display so it can be held in place and properly wired, while still allowing us to remove the screen for measurements

One of the main issues we ran into with soldering was getting the screen to work properly. Troubleshooting the E-Ink display was difficult, as it was hard to tell whether the problems were with the wiring, the code, or the screen itself. The screen only updates when it's given a signal, and at the start even the demo code (the example code included with the library) wasn't working over the soldered connections. After many hours of troubleshooting and re-soldering joints to make them more solid, the screen finally worked and ran the demo code, shown in the video below.

We also managed to get the device battery powered. This was difficult, as the screen didn't seem to want to boot properly off battery power alone, which led to our slightly wacky solution: to boot the device, it needs to be plugged into a computer, but if you then flip the power switch and unplug it, it works perfectly until it is powered off again. We are still slightly unsure why this is, but in our research on E-Ink displays we noticed that they seem to need AC power or a computer to boot properly, while the battery simply provides DC power; this would explain why the battery couldn't boot the screen but could keep it running after it was already booted.

Design Process:

The box design was finalized after our prototype meeting, where we got to ask Bill where he would like the buttons located, which types of buttons he preferred, the material of the box, and other features he might want, like a handle or a hook. As for the buttons, we ran some trials to see which worked better. We decided to label the buttons because there were three on the top side, all for different purposes. He did not mind much about the material of the box, but something black could work. He planned to keep the device on a shelf near his closet, so he did not need a hook or a handle. The back side of the device is also removable via screws, so if there are any hardware issues, or a part breaks, we can easily remove the face and make changes.

The sensor holder was 3D printed instead of laser cut because it was so small. We initially chose black plastic for the material, but after testing the sensor we found that, due to the darkness of the holder walls, the color values read darker than their actual values. Therefore, we lined the inside of the holder with white paper so that the light from the small LED that is part of the sensor would have a surface to bounce off.

Front View of our 3D printed color sensor housing.

Back view of our 3D printed sensor housing

A preliminary model of our housing to test if all the electrical components fit properly.

Schedule

As expected, we got the basic color coding done early and kept making minor adjustments as we put the device together and started adding components like the buttons and battery. The box design was also done on time, but adjusting the measurements took longer, because there were a lot of parts cased inside the box that needed to fit without breaking. For the hardware, soldering started on time but took longer than expected, because adding the external power source caused the screen issues we described in the hardware section.

Conclusion

 

We learned a lot from interacting with Bill and the other clients from CLASS. Overall we gained a much bigger appreciation for people living with disabilities and how influential and life changing technology is in their lives.

If we were to redo this project we would definitely choose a different screen. During our feedback session, we received a lot of comments on the "time delay for color identification," and although we checked with Bill during our prototype meeting and he said he didn't mind the delay, it was a component we wanted to fix in future designs. The E-Ink display provides the high contrast we needed; however, a screen with similar contrast that doesn't take as long to update would speed up scanning and let us display more feedback to the user.

We think most of the problems we faced with the nature of the device, such as the "flashing lights" and requiring "AC power," could be fixed simply by finding a different screen to work with. However, because these problems were discovered during the process of creating the device, it was difficult to start from scratch with a new screen when we didn't have all the other components ready. If we were to take this project further, we would like to get a variety of screens and, by plugging each into our now well-made system, test which screen performs the task most effectively.

Something else we didn't consider was turning on the light only when the button is pressed. We had a light switch, but in hindsight that switch was largely unnecessary, since the casing completely closes the sensor off from external light; with the light off, it simply reads black. During our in-person feedback, one suggested modification was to add a "photocell or contact switch to know if it's on a surface to trigger the light" of the sensor.

A simple change we implemented immediately after receiving feedback was in response to an observation about the "sharp edges". It was the last finishing detail of the box that we completely blanked on, and we agreed it is definitely ideal to sand down the acrylic edges so the box is more comfortable to hold.

There are definitely other factors that could be modified such as adding a “sound” or a “handle”, but we really wanted this project to be about Bill and what he wanted on the device. These ideas did come up in our prototype meeting, but Bill wanted to keep the device simple and just be able to read color so that is what we created. Our main goal of the project was not only making a successful output, but creating something that might actually be helpful for Bill and something he will make use of on a daily basis. We are so thankful that we could work with Bill for this project and we learned a lot from him, so we really wanted to make a positive contribution to Bill’s life and just make him happy. 🙂

 

Block Diagram & Schematic

 

Electrical Schematic of our Color Sensor

 

Block Diagram for our Project

Code

/*
Color Sensor by Team Lacerta
by Jonathan Lindstrom, Sarah Yun, Freda Su

Reads the color as RGB using the TCS34725 sensor, then converts it to HSV for easier interpretation.

Hue can easily be divided into sections because a certain range of hue goes with a certain color.
However, the sat/val (decides light/dark/grayscale) doesn't have straight cutoffs, so we use circle
math to divide a hue into pie slices (for light/medium/dark adjective), and the sat and val coordinates
will determine which pie slice it lands on. If it lands outside the pie, then it must be grayscale. We
also manually check for brown/tan since that's a color in Bill's wardrobe that doesn't naturally occur
in the hue spectrum (hue is red to yellow, with a sat and val that ends up on the crust of the pie).
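For a concrete example of the pie-slice check (see checkSatVal below): a reading of
sat=10, val=90 gives x = 10-100 = -90 and y = 90-100 = -10, so
r = sqrt(8100 + 100) ~ 90.6, which is inside the radius-98 pie. The angle
atan2(-10, -90) ~ -3.03 rad is less than -7*pi/8 ~ -2.75, so the point lands in
the "light" slice -- a pale, washed-out color, as you'd expect from low
saturation and high value.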

Finally, we send all of this info to the eink display, which is only updated when the button is pressed
to prevent it from constantly refreshing, which is bad for the lifespan. We also have a light switch to
help control the power use if he wants to leave the power on for a while.

pinouts reference: https://learn.adafruit.com/adafruit-2-13-eink-display-breakouts-and-featherwings/pinouts

    Name     | Arduino pin |       Sensor Pin      | description
 ------------|-------------|-----------------------|----------------------------------------------------
  EPD_CS     |      9      |  ECS on EINK          | E-Ink Chip Select, required for controlling the display
  EPD_DC     |      10     |  EDC on EINK          | Data/Command pin, required for controlling the display
  SRAM_CS    |      6      |  SRCS on EINK         | SRAM Chip Select, required for communicating with the onboard RAM chip.
  EPD_RESET  |      8      |  RST on EINK          | This is the E-Ink Reset pin, can set to -1 and share with microcontroller Reset
  EPD_BUSY   |      7      |  BUSY on EINK         | this is the e-Ink busy detect pin, and is optional if you don't want to connect the pin
  BUT        |      4      |  Push Button          | Used to tell the screen when to display a color
  LIGHT      |      A0     |  LIGHT on RGB Sensor  | Used to be able to switch this light on/off to conserve power
  SWITCH     |      A2     |  Slide Switch         | Used to tell the light when to turn on/off
  SCK        |      15     |  SCK on EINK          | SPI Clock Pin required for EINK and SRAM
  MOSI       |      14     |  MOSI on EINK         | SPI Microcontroller Out Serial In pin, it is used to send data to SRAM and e-Ink display
  MISO       |      16     |  MISO on EINK         | SPI Microcontroller In Serial Out pin, it's used for the SRAM
  SDA        |      2      |  SDA on RGB Sensor    | Used to provide SDA connection to Adafruit Color Sensor
  SCL        |      3      |  SCL on RGB Sensor    | Used to provide SCL connection to Adafruit Color Sensor


documentation/resources referenced:
https://www.geeksforgeeks.org/program-change-rgb-color-model-hsv-color-model/?ref=gcse
https://learn.adafruit.com/adafruit-gfx-graphics-library
https://learn.adafruit.com/adafruit-eink-display-breakouts/overview
https://learn.adafruit.com/adafruit-color-sensors/library-reference
*/

#include <Wire.h>
#include "Adafruit_ThinkInk.h"
#include "Adafruit_TCS34725.h"
#include "Adafruit_GFX.h"
#include "Fonts/FreeSans18pt7b.h"

#define EPD_CS 9
#define EPD_DC 10
#define SRAM_CS 6
#define EPD_RESET 8
#define EPD_BUSY 7
#define BUT 4
#define LIGHT A0
#define SWITCH A2

//Makes it so the screen only tries to update once every time the button is pressed
bool buttonOnce = false; 

ThinkInk_213_Mono_BN display(EPD_DC, EPD_RESET, EPD_CS, SRAM_CS, EPD_BUSY);
Adafruit_TCS34725 tcs = Adafruit_TCS34725(0x00, TCS34725_GAIN_16X);
//0x00 is the longest integration time (~700 ms, max precision); 16x gain
//sets the sensor's sensitivity -- adjust both for the environment


void setup() {

  pinMode(BUT, INPUT);
  pinMode(LIGHT, OUTPUT);
  Serial.begin(115200);
  while (!Serial) {
    delay(10);  // note: on boards with native USB this waits for a serial
                // monitor to open, so it should be removed for battery use
  }
  //Set up EINK display
  display.begin(THINKINK_MONO);
  if (tcs.begin()) {
    Serial.println("Found");
  } else {
    Serial.println("No TCS34725 found ... check your connections");
    while (true)
      ;
  }
 
  display.setTextColor(EPD_WHITE);
  display.setFont(&FreeSans18pt7b);
  digitalWrite(LIGHT, HIGH);
}

void loop() {
  display.clearBuffer();
  
  //Read the color sensor (colorTemp and lux are computed but currently unused)
  uint16_t r, g, b, c, colorTemp, lux;
  tcs.getRawData(&r, &g, &b, &c);
  colorTemp = tcs.calculateColorTemperature_dn40(r, g, b, c);
  lux = tcs.calculateLux(r, g, b);

  r = r >> 8; //keep the top 8 bits of each 16-bit channel
  g = g >> 8;
  b = b >> 8;
  rgb_to_hsv(r, g, b);
}

//convert rgb to hsv, calls helper fxn to compute color and adjectives to display, along w rgb
void rgb_to_hsv(uint16_t r2, uint16_t g2, uint16_t b2) {

  double r = r2 / 255.0;
  double g = g2 / 255.0;
  double b = b2 / 255.0;

  // h, s, v = hue, saturation, value
  double cmax = max(r, max(g, b));  // maximum of r, g, b
  double cmin = min(r, min(g, b));  // minimum of r, g, b
  double diff = cmax - cmin;        // diff of cmax and cmin.
  double h = -1, s = -1;

  // if cmax and cmin are equal then h = 0
  if (cmax == cmin) {
    h = 0;
  }

  // if cmax equal r then compute h
  else if (cmax == r) {
    h = fmod(60 * ((g - b) / diff) + 360, 360);
  }

  // if cmax equal g then compute h
  else if (cmax == g) {
    h = fmod(60 * ((b - r) / diff) + 120, 360);
  }

  // if cmax equal b then compute h
  else if (cmax == b) {
    h = fmod(60 * ((r - g) / diff) + 240, 360);
  }

  // if cmax equal zero
  if (cmax == 0) {
    s = 0;
  } else {
    s = (diff / cmax) * 100;
  }
  // compute v
  double v = cmax * 100;

  if (digitalRead(SWITCH))
  {
    digitalWrite(LIGHT, HIGH);
  }
  else
  {
    digitalWrite(LIGHT, LOW);
  }

  // wait out the press (blocks while held) so one click registers once
  while (digitalRead(BUT)) {
    buttonOnce = true;
  }
  if (buttonOnce) {
    display.fillScreen(EPD_BLACK);
    String sn = checkSatVal(s + .5, v + .5);  //crude rounding function
    display.setCursor(0, display.height()/2 - 10);  //have more space default 1 line
    if (sn == "g")
    {
      display.println("grayscale");
    }
    else
    {
      String brown = checkBrown(h + .5, s + .5, v + .5);
      String color = pickColorHue(h + .5);
      
      if (brown.length() + sn.length() + color.length() <= 16)  //max num char per line
      {
        display.print(sn);
        display.print(brown);
      }
      else  //reformat to be 2 lines (dont overflow characters)
      {
        display.setCursor(0, display.height()/3 - 10);  //top left corner ish
        display.print(sn);
        display.println(brown);
        display.setCursor(0, display.getCursorY() - 5);
      }
      display.println(color);
    }

    //rgb display under the color name
    display.setCursor(0, (display.height() + 100) / 2);
    display.print("(");
    display.print(r2);
    display.print(", ");
    display.print(g2);
    display.print(", ");
    display.print(b2);
    display.print(")");
    display.display();
    buttonOnce = false;
    delay(5000);  //prevent fast button presses to protect screen life
  }
}

//determine "base" color
String pickColorHue(int hue) {
  if (hue >= 0 && hue <= 9) {
    return ("pink");
  } else if (hue <= 30) {
    return ("orange");
  } else if (hue <= 70) {
    return ("yellow");
  } else if (hue <= 160) {
    return ("green");
  } else if (hue <= 200) {
    return ("teal");
  } else if (hue <= 255) {
    return ("blue");
  } else if (hue <= 300) {
    return ("purple");
  } else if (hue <= 340) {
    return ("pink");
  } else if (hue <= 350) {
    return ("magenta");
  } else if (hue <= 360) {
    return ("red");
  } else {
    return ("");  //wont happen bc hue range 0 to 360
  }
}

//check for brown specifically bc not in main colors of hue spectrum 
String checkBrown(int hue, byte sat, byte v) {
  if ((0 <= hue) && (hue <= 50)) {
    //brownish area, also check for tan area
    byte tanRad = 10;  //determine radius visually online
    //core tan color: (34, 33, 82)
    if ((24 <= hue) && (hue <= 44))  //hue is within +/-10 of tan hue
    {
      //circle around core color
      int x = sat - 33;  //saturation = x
      int y = v - 82;    //val = y
      byte r = sqrt(pow(x, 2) + pow(y, 2));
      if (r <= tanRad) {
        return "tan ";
      }  //otherwise, fall thru to normal brown case
    }

    byte lowRad = 60; //boundaries for the crust of pie
    byte highRad = 75;
    int x = sat - 100;  //saturation = x
    int y = v - 100;    //val = y
    byte r = sqrt(pow(x, 2) + pow(y, 2));
    if ((lowRad <= r) && (r <= highRad))  // in "crust" of pie
    {
      return "brown ";
    } else {
      return "";
    }
  } else {
    return "";
  }
}

//determine light/dark/grayscale for color
String checkSatVal(byte sat, byte v) {
  int circleEq = pow((sat - 100), 2) + pow((v - 100), 2);

  byte rad = 98;  //inside pie radius

  if (circleEq <= (pow(rad, 2))) {
    int x = sat - 100;  //saturation = x
    int y = v - 100;    //val = y
    double angle = atan2(y, x);
    byte r = sqrt(pow(x, 2) + pow(y, 2));
    //edge case: not in a pi slice if in the corner
    if ((r == 0) || ((-7 * M_PI / 8 <= angle) && (angle <= -5 * M_PI / 8))) return "";  //just pure color
    if (angle < -7 * M_PI / 8)                                                          // between -pi and -7pi/8
    {
      return "light ";
    } else  // between -5pi/8 and -pi/2
    {
      return "dark ";
    }
  } else {  //outside of pie: grayscale

    return "g";  //flag to remove for display: dont put adjectives for white/gray/black
  }
}

 

Link for Files:

DXF files for the box and 3D model for the sensor housing: Files

Emotional Display by Team Vela: Final Documentation https://courses.ideate.cmu.edu/60-223/f2022/work/emotional-display-by-team-vela-final-documentation/ Mon, 12 Dec 2022 09:05:35 +0000 https://courses.ideate.cmu.edu/60-223/f2022/work/?p=16856 In this project, we worked together as a team to create a device to improve the life of a physically disabled person living in Pittsburgh. Crucially, the device is tailor-made to our client: useful and relevant for him in particular, driven by his wants and needs and nobody else’s. Designing over the course of seven weeks with our client Dennis, we conducted a needfinding interview and distilled it into a concept for his device. (The notes from that interview can be found here: https://courses.ideate.cmu.edu/60-223/f2022/work/team-vela/)

 

What We Built

We created a light-up emotional display, allowing Dennis to show 5 different emotions ranging from happy to sad. These emotions correspond to five distinct light colors, from green (happy) to red (sad). These emotions and lights are controlled via a dial on an accessible control panel. Additionally, if Dennis has a question, he can press a button on his control panel, turning the lights purple. Finally, if Dennis has an emergency and needs to get someone’s attention, he can flip a two-part switch to turn the lights red and play a noise. 

Final prototype with face display and control box.

Side profile of face plate, showing the three acrylic layers, screw attachments, LEDs, and wiring.

Heat shrink-wrapped wires and exit hole from the back of the face plate.

Control box. From top to bottom: red emergency switch and dial for noise volume, blue button (toggling the question mode) and on/off switch, yellow button (toggling emotions mode) and dial for emotional status.

Emergency switch with the cover pushed back, primed to be flipped.

Textural “breadboard” feature, as requested by Dennis.

Close-up of emotions mode toggle button and emotional status dial.

Narrative Sketch

Dennis is out to lunch in Shadyside with some of his friends and staff from CLASS. His company is good, the food tastes great, the weather is nice, and he is having an excellent time. So, Dennis flips the on switch to his emotions display and turns the dial to show to others that he is happy being out to lunch with everyone. In addition to informing the people with him, the bright green lights and smiling face clearly display positive emotion, encouraging the strangers at the restaurant to approach Dennis to say hi—an opportunity to make new friends! After a great conversation, Dennis goes back to his lunch. 

Later, Dennis decides that his sandwich is a little bland and he would like to put some salt on it. Unfortunately, he is unable to reach across the long table to the salt shaker. Dennis presses the button on his control panel to indicate that he has a question. The display lights up purple and displays a neutral face. A light on his control panel displays this purple as well, informing the people at the table that Dennis would like some help. They fortunately notice it quite quickly. Dennis is able to ask for the salt for his sandwich, which is passed to him. 

By using the emotional face display, Dennis has been able to make himself both more approachable and better understood. Furthermore, it eases some of his social frustrations by letting the people around him know he has an issue quickly and efficiently. 

 

How We Got Here (Prototype and Process)

This prototype was designed to help answer the design question: How can we help Dennis communicate with the world better?

Prototype

Our prototype was a light-up emotional display, capable of showing five different “emotions” through shades of color across three faces. The display also has two modes beyond the emotions mode: question, which adds one more color, and emergency, which plays a noise. 

Our single prototype came together gradually as we prototyped its individual parts, eventually converging into the final construction.

Sketch of control panel and other accessories.

Three of our cardboard cupholder prototypes.

First sketch of face plate model with potential placement of LEDs. These were later significantly reduced.

First laser cut of the control panel, with potential buttons and switches inserted.

Second laser cut of control panel, this time out of wood, with new buttons and switches inserted.

 

Measurements taken to plan integration and development of the cup holder.

 

Development process and testing of the wiring/circuitry.

 

Testing of soldered circuitry and integration with PCB board.

 

As we worked through our prototyping process, we wanted to be mindful of creating a device that Dennis could use easily, taking into account the limited hand stability and mobility caused by his arthritis. This was particularly true of the control panel for the display. Although it is the less flashy part of the design, the control panel is ultimately how Dennis experiences the device: if the panel couldn’t be used, neither could the device as a whole. 

In a subsequent meeting with Dennis to review our prototype, we came away with a lot of helpful feedback. Dennis was happy with how the device was coming along and with the progress we had made thus far, but it was not perfect. Talking with Dennis, we found that it’s easier for him to have larger buttons with slightly more space between them, and, tied to this, that a slightly larger control panel would serve him better overall. Originally we were trying to create the smallest possible panel, in hopes that it would impede Dennis and his wheelchair as little as possible, but we decided that Dennis’ comfort in using the panel was well worth a slightly larger design. Importantly, Dennis was also able to quickly pick up on the interface we had designed. In terms of adjacent feedback, we asked Dennis how he would like his device to look. He was happy with our physical models, but did specifically request that his device be orange and potentially include some Steelers decals, inspired by another wheelchair user he had met who had pink accessories to match her pink wheelchair. Finally, we asked Bill, who helps to care for Dennis at CLASS, whether the noise in the “emergency” mode would become disturbing. He didn’t think so, so the noise made it into our final design. 

Overall, we incorporated all of the feedback we received from Dennis and Bill into our final prototype. The critique—both negative and positive—truly helped to inform our total design process and steer us in the right direction for our final implementation. Seeing the delight created by the device as a whole was a great driver for the next parts of the process. 

Process

A very stress-free process photo, including first iteration cardboard control board and Remy testing code.

Final 3D print of Dennis’ cup holder.

Our process, as many processes are, was fraught with things not quite working out the way we had planned. We started by following the Gantt chart closely, but that quickly fell by the wayside, to our detriment. We were on track for the CAD designs and electronics programming, but fell behind beginning with the circuit build. 

Testing the LEDs in series before soldering them together.

Laser cutting the final control box. 

Attempting to fit all of the wiring into the final control box.

Dennis with his final emotional display (photo requested by Dennis).

To our surprise, the circuit build was one of the most challenging parts of the project. In particular, wiring together the individual LEDs for the smiles proved far more intensive than we had originally thought. We had to plan four individual circuits (the smile, neutral mouth, frown, and connecting middle section) because we weren’t exactly sure how to light up the individual LEDs, even though they were addressable. Creating the four circuits was challenging: the first two attempts at wiring were faulty, so both had to be restarted. Then, once the circuits were soldered together, they had to go into the face display, which presented a new set of problems. The LED series were incredibly finicky to work with; often the wrong series would light up, or the lights would blink in multiple colors rather than holding the single color assigned to them in code. We figured out that a data-in line had been wired into a ground line group, which caused our initial round of issues, and that some of the LED pins were touching each other behind the middle faceplate, causing further disturbances. These problems were never fully solved; the lights remained finicky all the way through the presentation. The kind of isolation test sketched below would likely have caught the crossed wiring much earlier. 
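A minimal debugging sketch (pin numbers and chain lengths copied from our final code; everything else is hypothetical) that lights one chain solid red at a time, making it obvious which physical chain answers to which data line:

#include <Adafruit_NeoPixel.h>

Adafruit_NeoPixel chainA(3, 8, NEO_GRB + NEO_KHZ800);    // inner middle mouth
Adafruit_NeoPixel chainB(4, 9, NEO_GRB + NEO_KHZ800);    // outer middle mouth
Adafruit_NeoPixel chainC(10, 10, NEO_GRB + NEO_KHZ800);  // smile
Adafruit_NeoPixel chainD(11, 11, NEO_GRB + NEO_KHZ800);  // frown
Adafruit_NeoPixel *chains[] = { &chainA, &chainB, &chainC, &chainD };

void setup() {
  for (int i = 0; i < 4; i++) chains[i]->begin();
}

void loop() {
  for (int active = 0; active < 4; active++) {
    for (int c = 0; c < 4; c++) {
      // red on the chain under test, off everywhere else
      uint32_t color = (c == active) ? chains[c]->Color(255, 0, 0) : 0;
      for (uint16_t i = 0; i < chains[c]->numPixels(); i++) {
        chains[c]->setPixelColor(i, color);
      }
      chains[c]->show();
    }
    delay(2000);  // hold each pattern for two seconds
  }
}

If the “wrong” chain turns red, the data lines are swapped; if a chain flickers through colors instead of holding red, its data or ground wiring is suspect.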

With respect to the Gantt chart, the setback caused by the LED problems pushed back other plans, throwing off the rest of the schedule. The laser cutting, which was intended to finish six days before the final prototype presentation, was only finished two days before. Similarly, the final assembly of the device happened the night before the final presentation rather than the intended two days prior. A large part of our lack of speed as a group was due to unbalanced experience: for many parts of the process we became reliant on Remy, who has far more experience with the physical production of electronics, to confirm details and tasks. This, in turn, interrupted Remy’s own work, creating a self-reinforcing cycle of delay. 

Then, in a final blow before the presentation, we could not find any D or AAA batteries to power our device externally. This means we are still not sure whether the device works purely off battery power, or for how long it can run on it. The process as a whole had many moments like this, where we really found ourselves at a loss. Yet despite our challenges and the final close call, the final product is one that we are proud of. 

 

Conclusions and Lessons Learned

In reflection of our journey through this project, a few things really stood out to us. 

Firstly, our final critique and feedback session had a lot of important thoughts to offer. While we were focused on developing a product that would communicate how Dennis was feeling to everyone around him, our product was limited to giving a bold display to those behind him. We had a small LED on the control panel that people talking with him could see, but there was no larger display for those speaking or interacting with our client. One of our critics summarized the general feedback for improvement quite well, suggesting the system could be improved with “more animation, add a time-out feature, reduce size and use as necklace? Use LED strip instead of separate LEDs, brightness knob.” These features would greatly expand the functionality and usability of our product and would be our main additions in future iterations. Furthermore, a point was made that this device could be useful to any individual with a disability that inhibits their ability to communicate emotions, demonstrating the device’s importance as a product on a grander scale. Beyond this, our attention to detail in preserving aesthetics and cleanliness didn’t go unnoticed. The remaining feedback fell into one of the groups above. The advice to animate the eyes is a simple change that would not only add emotional complexity to the display but also reduce power usage, since fewer LEDs would need to be lit at once. The suggested time-out feature is simple enough that we sketch it below.
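A hedged sketch of that time-out (TIMEOUT_MS is a hypothetical value that would need tuning with Dennis; the reset would hook into our existing switchControls() wherever a press or dial change is detected):

const unsigned long TIMEOUT_MS = 300000UL;  // e.g. five minutes
unsigned long lastInteraction = 0;

// call whenever a button press or dial movement is detected
void noteInteraction() {
  lastInteraction = millis();
}

bool displayTimedOut() {
  return (millis() - lastInteraction) > TIMEOUT_MS;
}

In loop(), checking displayTimedOut() before lightEmotion() and blanking the ring with colorWipe(strip.Color(0, 0, 0), 0) would implement the suggestion; the emergency mode would presumably be exempt from the timeout.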

Secondly, we learned a lot about creating bespoke technology, particularly for disabled people. In our initial meetings, we didn’t yet know how to be attentive to unspoken issues. For many disabled people, the core problems in their lives are beyond what a couple of students can solve in a semester project. Moreover, people, disabled or not, often don’t think of their difficulties as problems, or are unwilling to speak about them as such; even knowing one’s own issues, voicing them carries a heavy social weight. We all went into the project assuming our client would simply tell us about one of these problems, not considering that Dennis might not frame anything as a problem or want to talk about one. We were so focused on trying to get Dennis to describe a problem in his life that we were barely open to offering the potential problems we ourselves saw in it. We are incredibly grateful to have worked with Dennis on this project, and for his patience with us. 

In conclusion, although our final product was far from perfect, we are proud of what we made. It was an interesting concept, decently executed given our collective constraints, and, most importantly to us, it was a product that Dennis was excited about. At the end of our final presentation, he asked if we could strap it on for him, and there is no better affirmation than that. 

 

Technical

Schematic and Block Diagram

Circuit schematic for system.

 

Block Diagram of the system.

Code

/* 
 *  The code below is used to operate the emotional display board
 *  developed for 60-223: Intro. to Physical Computing. The wiring
 *  inputs/outputs are as shown below.
 *  
 *   Pin Map:
   Pin   |  role  |   Description
   ----------------------------------------
   A0    | input  | Input readings from pot 1. For light color
   A1    | input  | Input readings from pot 2. For speaker volume
   2     | output | Controls the sound sent to the speaker
   3     | output | Writes to the neopixel ring LEDs
   5     | input  | Detects digital signal from emergency switch
   6     | input  | Detects signal to activate emotion display, button 1
   7     | input  | Detect signal to activate question display, button 2
   8     | output | Writes to inner middle mouth LEDs
   9     | output | Writes to outer middle mouth LEDs
   10    | output | Writes to LEDs showing smile
   11    | output | Writes to LEDs showing frown
   12    | output | Writes to LED displaying emotion on control panel
   5V    | output | Power supply for LEDs and buttons
   3.3V  | output | Power for remaining components
   GND   | input  | Ground for all components
 */

#include <Adafruit_NeoPixel.h>

// Define all variables and constants
const int pot1 = A0;
const int pot2 = A1;

const int switch1 = 5;
const int but1 = 6;
const int but2 = 7;
const int neoRing1 = 3;
const int three_leds = 8;
const int four_leds = 9;
const int smile = 10;
const int frown = 11;
const int emots = 12;

const int speaker = 2;

int sensOne;
int sensTwo;

int pressOne;
int pressTwo;
int pressThree;

int level = 0;
int emotion = 0;

unsigned long globTime;
unsigned long beepTime;

bool beepOn = false;

// Define LED strips
Adafruit_NeoPixel strip = Adafruit_NeoPixel(32, neoRing1, NEO_GRB + NEO_KHZ800);
Adafruit_NeoPixel pixels(3, three_leds, NEO_GRB + NEO_KHZ800);
Adafruit_NeoPixel pixels2(4, four_leds, NEO_GRB + NEO_KHZ800);
Adafruit_NeoPixel pixels3(10, smile, NEO_GRB + NEO_KHZ800);
Adafruit_NeoPixel pixels4(11, frown, NEO_GRB + NEO_KHZ800);
Adafruit_NeoPixel pixels5(1, emots, NEO_GRB + NEO_KHZ800);

void setup() {
  Serial.begin(9600);

  // Initialize LEDs and define pin modes
  strip.begin();
  strip.setBrightness(30); //adjust brightness here
  strip.show(); // Initialize all pixels to 'off'

  pinMode(but1, INPUT_PULLUP);
  pinMode(but2, INPUT_PULLUP);
  pinMode(switch1, INPUT_PULLUP);

  pinMode(pot1, INPUT);

  pixels.begin();
  pixels2.begin();
  pixels3.begin();
  pixels4.begin();
  pixels5.begin();
}

// Below function used to write to the LEDs for question face
void lightQuestion(){
  for(int i=0; i<3; i++) { // For each pixel...

      pixels.setPixelColor(i, pixels.Color(0, 255, 255));
      pixels.show();   // Send the updated pixel colors to the hardware.
  }

  for(int i=0; i<4; i++) { // For each pixel...

      pixels2.setPixelColor(i, pixels2.Color(0, 255, 255));
      pixels2.show();   // Send the updated pixel colors to the hardware.
  }

  for(int i=0; i<6; i++) { // For each pixel...

      pixels3.setPixelColor(i, pixels3.Color(0, 0, 0));
      pixels3.show();   // Send the updated pixel colors to the hardware.
  }

  for(int i=0; i<6; i++) { // For each pixel...

      pixels4.setPixelColor(i, pixels4.Color(0, 0, 0));
      pixels4.show();   // Send the updated pixel colors to the hardware.
  }
  
  colorWipe(strip.Color(255, 0, 255), 50); // Purple
}

void colorWipe(uint32_t c, uint8_t wait) {
  for(uint16_t i=0; i<strip.numPixels(); i++) {
      strip.setPixelColor(i, c);
      strip.show();
      //delay(wait);
  }
}

// Below function used to write to the LEDs for emergency face
void lightExclam(){
  for(int i=0; i<3; i++) { // For each pixel...

      pixels.setPixelColor(i, pixels.Color(0, 255, 0));
      pixels.show();   // Send the updated pixel colors to the hardware.
  }

  for(int i=0; i<4; i++) { // For each pixel...

      pixels2.setPixelColor(i, pixels2.Color(0, 0, 0));
      pixels2.show();   // Send the updated pixel colors to the hardware.
  }

  for(int i=0; i<6; i++) { // For each pixel...

      pixels3.setPixelColor(i, pixels3.Color(0, 0, 0));
      pixels3.show();   // Send the updated pixel colors to the hardware.
  }

  for(int i=0; i<6; i++) { // For each pixel...

      pixels4.setPixelColor(i, pixels4.Color(0, 255, 0));
      pixels4.show();   // Send the updated pixel colors to the hardware.
  }
  
  colorWipe(strip.Color(255, 0, 0), 50); // Red
}

// Initialize speaker and send sound signal
void initSound(){
  globTime = millis();

  // unsigned subtraction handles millis() rollover, so no abs() is needed
  if(globTime - beepTime >= 3000 && !beepOn){
    tone(speaker, 2000, 600);  // 2 kHz beep lasting 600 ms
    beepTime = globTime;
    beepOn = true;
  }

  if(globTime - beepTime >= 1000 && beepOn){
    noTone(speaker);
    beepTime = globTime;
    beepOn = false;
  }
}

// Level 0: emotional control lights
// Level 1: Question mark light
// Level 2: Exclamation mark and sound control
void loop() {
  if(level == 0){
    sensOne = analogRead(pot1);

    Serial.println(sensOne);

    // map() already returns an integer; readings that map above 4 are
    // ignored by lightEmotion() and leave the face unchanged
    emotion = map(sensOne, 0, 700, 0, 5);

    //Serial.println(emotion);
    
    lightEmotion();
    
    switchControls();
  }

  if(level == 1){
    lightQuestion();
    switchControls();
  }

  if(level == 2){
    lightExclam();
    // Volume is handled in hardware: pot 2 acts as a variable resistor
    // in series with the speaker, so no analogRead is needed here.
    //sensTwo = analogRead(pot2);
    switchControls();
    initSound();
  }
}

// Below function used to read any changes in current mode of the system
void switchControls() {
  // insert detection for button presses to switch controls
  pressOne = digitalRead(but1);
  pressTwo = digitalRead(but2);
  pressThree = digitalRead(switch1);

  //Serial.println(level);

  if(!pressThree){
    level = 2;
  }

  if(!pressTwo && pressThree){
    level = 1;
  }

  if(!pressOne && pressThree){
    level = 0;
  }

  if(pressOne && pressTwo && pressThree && level == 2) {
    level = 0;
  }
}

// Below function used to write to the LEDs depending on current emotion
void lightEmotion() {
  if(emotion == 0){
    // light red angry
    for(int i=0; i<3; i++) { // For each pixel...

      pixels.setPixelColor(i, pixels.Color(0, 255, 0));
      pixels.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<4; i++) { // For each pixel...

      pixels2.setPixelColor(i, pixels2.Color(0, 0, 0));
      pixels2.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels3.setPixelColor(i, pixels3.Color(0, 0, 0));
      pixels3.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels4.setPixelColor(i, pixels4.Color(0, 255, 0));
      pixels4.show();   // Send the updated pixel colors to the hardware.
    }
  
    colorWipe(strip.Color(255, 0, 0), 50); // Red
    pixels5.setPixelColor(0, pixels5.Color(0, 255, 0));
    pixels5.show();
  }

  if(emotion == 1){
    // light orange unhappy
    for(int i=0; i<3; i++) { // For each pixel...

      pixels.setPixelColor(i, pixels.Color(55, 255, 0));
      pixels.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<4; i++) { // For each pixel...

      pixels2.setPixelColor(i, pixels2.Color(0, 0, 0));
      pixels2.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels3.setPixelColor(i, pixels3.Color(0, 0, 0));
      pixels3.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels4.setPixelColor(i, pixels4.Color(55, 255, 0));
      pixels4.show();   // Send the updated pixel colors to the hardware.
    }
    
    colorWipe(strip.Color(255, 55, 0), 50); // Orange
    pixels5.setPixelColor(0, pixels5.Color(55, 255, 0));
    pixels5.show();
  }

  if(emotion == 2){
    // light yellow neutral
    for(int i=0; i<3; i++) { // For each pixel...

      pixels.setPixelColor(i, pixels.Color(255, 255, 0));
      pixels.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<4; i++) { // For each pixel...

      pixels2.setPixelColor(i, pixels2.Color(255, 255, 0));
      pixels2.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels3.setPixelColor(i, pixels3.Color(0, 0, 0));
      pixels3.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels4.setPixelColor(i, pixels4.Color(0, 0, 0));
      pixels4.show();   // Send the updated pixel colors to the hardware.
    }
    
    colorWipe(strip.Color(255, 255, 0), 50); // Yellow
    pixels5.setPixelColor(0, pixels5.Color(255, 255, 0));
    pixels5.show();
  }

  if(emotion == 3){
    // light yellow-green happy
    for(int i=0; i<3; i++) { // For each pixel...

      pixels.setPixelColor(i, pixels.Color(255, 100, 0));
      pixels.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<4; i++) { // For each pixel...

      pixels2.setPixelColor(i, pixels2.Color(0, 0, 0));
      pixels2.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels3.setPixelColor(i, pixels3.Color(255, 100, 0));
      pixels3.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels4.setPixelColor(i, pixels4.Color(0, 0, 0));
      pixels4.show();   // Send the updated pixel colors to the hardware.
    }
    
    colorWipe(strip.Color(100, 255, 0), 50); // Yellow-Green
    pixels5.setPixelColor(0, pixels5.Color(255, 100, 0));
    pixels5.show();
  }

  if(emotion == 4){
    // light green very happy
    for(int i=0; i<3; i++) { // For each pixel...

      pixels.setPixelColor(i, pixels.Color(255, 0, 0));
      pixels.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<4; i++) { // For each pixel...

      pixels2.setPixelColor(i, pixels2.Color(0, 0, 0));
      pixels2.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels3.setPixelColor(i, pixels3.Color(255, 0, 0));
      pixels3.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels4.setPixelColor(i, pixels4.Color(0, 0, 0));
      pixels4.show();   // Send the updated pixel colors to the hardware.
    }
    
    colorWipe(strip.Color(0, 255, 0), 50); // Green
    pixels5.setPixelColor(0, pixels5.Color(255, 0, 0));
    pixels5.show();
  }
}