Intro to Physical Computing: Student Work Fall 2022
https://courses.ideate.cmu.edu/60-223/f2022/work

Back-up Alarm by Team Fornax: Final Documentation
https://courses.ideate.cmu.edu/60-223/f2022/work/back-up-alarm-by-team-fornax-final-documentation/
Sun, 18 Dec 2022

For this project, we worked in teams of three to design a device that would be useful for a person living with a disability. Each team worked alongside a client from Community Living and Support Services (CLASS) to create something that would be relevant and useful in their life. Our client was Jeff Owens, an individual with a mobility disability. Over the course of the project, we conducted an interview with Jeff and incorporated his feedback into the final product. To read more about our interview with Jeff, click here.

What We Built

Our end product for Jeff is a device that he can strap onto the bottom of his wheelchair using Velcro strips. It helps ensure that he does not back up into or over objects behind him. When Jeff gets within range of an object behind him, the device beeps until he is a reasonable distance away from it. In addition, the device includes lights that flash on and off in time with the beeping. Jeff has total control of the device: he can turn it off entirely or silence just the beeping. He can also adjust the range at which the device starts beeping.

This is the main photo of the device with all of its parts. All the Velcro is unstrapped in this photo, and there is a loose LED strip in the middle that is connected to the control panel.

This is a close-up of the control panel. It contains a switch that turns off the sound (labeled "sound" underneath) and a switch that turns off both the lights and the sound (labeled "lights" underneath). The switches are arranged this way because we wanted Jeff to be able to stop the lights and sounds from the control panel, where he can reach. The blue knob is a potentiometer for adjusting the distance the sensor scans for.

This is one of our ultrasonic sensors, which is supposed to be strapped to the lower rear part of the wheelchair along with its counterpart. As you can see, it is mounted by strapping the Velcro tightly around the wheelchair rods.

This picture shows what the back of everything looks like, and it is supposed to show how you would strap the Velcro if it were on the wheelchair.

This is the device on Jeff’s wheelchair at the critique. We didn’t get much time with it, but we found it mostly worked, except for attaching the underneath box with the Arduino. We must have measured incorrectly on prototype day, because the Velcro straps were not quite long enough. Otherwise it was fairly successful!

 

Narrative Sketch

After recording another album review for his YouTube channel, Jeff decides to take a quick nap in his wheelchair. Unknown to Jeff, as he begins to drift off, his two-year-old nephew decides to take out all of his toys and play with them all around Jeff. Very quickly, Jeff’s nephew gets bored of playing inside and goes outdoors, leaving his toys on the ground around Jeff.

After Jeff wakes up from his nap, he realizes that he is surrounded by his nephew’s toys. Jeff turns on the back-up alarm, waits, and does not hear a beep. He now knows that there is a safe path behind him that he can take to get out of the living room. Jeff has saved himself from getting hurt by accidentally tipping over, and he has spared his nephew’s feelings by not accidentally breaking one of his toys.

How We Got Here

Prototype

The prototype we created was designed to answer the question: How can we help Jeff move safely from Point A to Point B?

Our prototype was on the simpler side, as we wanted to gauge Jeff’s opinion on our approach. It took the shape of a rectangular cardboard box with optical proximity sensors poking out of the front. In addition, we mocked up a control panel on paper for Jeff to look at. When we showed our prototype, we focused a lot on explaining how the device would respond to certain actions.

This is us testing which sensors to use for our project. We ended up going with the ultrasonic sensor because it detected a larger range, which we suspected would help minimize problems later. We brought this setup to the prototype critique, but the design and Wizard-of-Oz prototypes ended up helping us communicate better with Jeff.

This was the underneath-box prototype. We talked about how the sensors might end up on the side of this box and be placed on the lower half of the wheelchair; however, after looking at Jeff’s wheelchair, we decided this wasn’t going to happen.

These are the paper control panel mockups. We had Jeff put his finger on the imaginary buttons to see which ones were big enough. We also held each one next to his armrest and asked which size was most legible. We found the biggest one was the only viable option after these two tests.

Wizard-of-Ozzing the LED Strip

This is what the serial monitor looked like when we were testing the distance sensors before the prototype session.
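The test itself can be reproduced with a minimal sketch along these lines; the pin numbers here are illustrative assumptions, and the project's full NewPing-based code appears under Technical Details below.

// Minimal serial test for one HC-SR04-style ultrasonic sensor using the NewPing
// library (the same library the final code uses). Pin numbers are assumptions.
#include <NewPing.h>

const int TRIGGER_PIN  = 10;
const int ECHO_PIN     = 9;
const int MAX_DISTANCE = 200;   // cm; pings past this read as 0

NewPing sonar(TRIGGER_PIN, ECHO_PIN, MAX_DISTANCE);

void setup() {
  Serial.begin(9600);
}

void loop() {
  delay(100);                                         // don't ping faster than ~10 Hz
  unsigned int cm = sonar.ping() / US_ROUNDTRIP_CM;   // echo time converted to centimeters
  Serial.print("distance: ");
  Serial.print(cm);
  Serial.println(" cm");
}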

Here you can see us ideating after Jeff’s interview, trying to land on an idea for the prototype critique. We struggled with this because we found Jeff didn’t have any complaints about his daily life. So we tried fleshing out and expanding all three of our ideas and talked to Zach about them. We ended up telling Jeff the problem space we picked, but let him decide whether he liked this route or not.

The most helpful information we got from the prototype session, other than a solid direction for our project, was a lot of measurements. This is the main page of notes that documented our measurements in a picture and in writing. We ended up referencing this image a lot during the process.

In our minds, our device would mount underneath Jeff’s wheelchair and help detect objects in front of him. From the Prototype Critique, we gained valuable feedback that helped patch holes left from our initial interview with Jeff. Moreover, the Prototype Critique redefined the purpose of our device. Instead of detecting objects in front of Jeff, which he can already see, we learned that it would be much more helpful if the device could detect objects behind Jeff as he backs up. This led us to separate the distance sensors from the main box rather than embedding them within the device. We decided to move these sensors to two metal rods already protruding from the back of Jeff’s wheelchair, because those rods were already at a useful angle.

At the end of the Prototype Critique, we decided to acknowledge and include all of the feedback we received from Jeff and the other CLASS clients. As people who do not have much experience with wheelchairs, we believed it was in our best interest to consider all of the information we were given, since our client knows himself best. Furthermore, we are making this device for Jeff, so his opinion helped us finalize our ideas for the final iteration. The main problem we encountered while prototyping was that getting the distance sensor to read accurately was difficult because of its angle and because the sensor moved around, which is why we Wizard-of-Oz’ed the prototype instead.

Process

This is the original code flow chart. This was used to help organize the code initially and figure out how we wanted the information to flow through the device.

This was all of our wiring before soldering. It was definitely a lot to tackle and difficult to read once we started soldering.

Results from calibration. This is what we documented when testing the potentiometer; we needed to figure out how high the calibration should go.
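For reference, this calibration ends up as a single map() call in the final code (shown in full under Technical Details); as a standalone test sketch it looks roughly like this, with 60 and 150 cm being the endpoints we settled on:

// Standalone sketch of the potentiometer-to-range calibration: the 0-1023 analog
// reading becomes the upper detection bound in centimeters.
const int POTENTIOMETER_PIN = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  unsigned upperbound = map(analogRead(POTENTIOMETER_PIN), 0, 1023, 60, 150);
  Serial.println(upperbound);   // watch the bound change as the knob turns
  delay(100);
}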

This was the first prototype of the control panel. We ended up deciding the Velcro should only strap underneath, because this arrangement was extremely unstable and wouldn’t really stay up. We also decided not to add a second potentiometer, and we enlarged the hole into a square so the potentiometer was easier to pinch.

This was us testing the ultrasonic sensor holder. We found it to be a little wide, and we decided we wanted a hole to embed the sensor in instead of having it sit on top of the wood.

This was us realizing two things: we had made a mistake wiring the switches, and the breadboard was not fitting neatly in the control panel. This led to a lot of re-soldering.

Through this project, we ran into several roadblocks. First, after we had decided on the project, we struggled with deciding exactly how we wanted to handle user feedback. We had to make some executive decisions to use lights and a buzzer and to forgo vibration. We then had some issues getting our buttons wired, but we figured out how to get them to work once we found that we had accidentally wired the wrong holes on the solder board. Afterwards, we ran into some issues with exactly how we were going to mount the ultrasonic distance sensors on the wheelchair: we realized that the sensors could easily be bumped, throwing off the distances. We then had to add a potentiometer, even though we already had a working setup and working code. After that, we ran into several issues with soldering, with the solder joints on the LEDs easily breaking off. This caused several problems, but at the end of the day, we ended up with a working product.

A lot of the issues we ran into were found the night before. This was because we diverged a bit from our projected Gantt chart. We spent the class period before the project’s due date still working, since we had no sense of urgency, and we put in little time outside of class. Instead of soldering and assembling the product, which is where we ran into the majority of our problems, we kept trying to refine our code and hardware. If we had dedicated that class period to assembly, we would have had more time to debug and more time to dedicate to details.

Planned Gantt Chart

Conclusions and Lessons Learned

Our group had a great time with this project, and we were happy with the product we presented on the final critique day. That being said, there is always room for improvement, and we received a lot of helpful feedback that, had we implemented it, could have improved our project.

For example, we received multiple critiques about our non-shrink-wrapped wires, such as “exposed wiring needs shrink wrap” and “The wires […] would benefit from more protective covering as well”. We thought this was a good critique, as it was one way our project could have been elevated to the next level. During the final build stage, we considered adding shrink wrap, but we ultimately ran out of time. Another critique was that a more stable mounting system would have helped a lot, and I am very inclined to agree. The problem we encountered was that we didn’t have access to Jeff’s wheelchair, and a more complicated mounting system would have taken a lot of time and been very difficult to create within the scope of our class. We ended up using Velcro for mounting, since it was the simplest option, but I do agree that with a more stable mounting system we would have been able to make the product a lot more effective.

There was also feedback complimenting our use of the wheelchair and our consideration of Jeff’s particular situation: “[T]he group did a nice job figuring out how to make the device fit on his chair best”. The idea of personalization vs. generalization was also discussed a lot in person. We tried to make it very clear that the goal of this project was not to make a manufacturable product, but to make a project for Jeff himself. That being said, I think some of our guests were excited about the possibilities our product would have if we were able to generalize it a bit. I think they appreciated how many people our product might be able to help, which is a critique that is easy to accept. Another positive comment we received was that our user feedback options were good choices. We got multiple compliments on the visibility and clarity of the lights, and one guest commented that for the visually impaired, the buzzer for sound feedback was also a good option. We appreciated this because it took us a while to decide what the most effective method of user feedback might be.

Working with a client with a disability was a good experience for us. We had to be careful about how we worded interview questions, and it was a bit hard to communicate with Jeff, but we managed to find aspects of his life that we could improve and build our project on. In order to do that, we had to dive fairly deeply into what a day in his life looked like. I don’t think there was anything we would have done differently. We tried to be as open as we could when it came to communication with our client, and I think we did a fairly good job, even though our client was not as responsive as we might have hoped. 

I think all of our group members had a good time with this project. The diversity of backgrounds and skills that we brought into this project helped it run smoothly. We learned how to make things, not just for ourselves or for this class, but for other people. Something that really stuck with me was the impact that our work had on so many people. There were multiple clients talking about how important and life-changing the work we were doing was. Though we’ve only been through one semester of this class, we were able to see the applications of our knowledge in a way that was very fulfilling and meaningful.

Technical Details

Electronic Schematic and Block Diagram

Electronic Schematic

 

Block Diagram

Code

/**
 * @title Back-up Alarm
 * @brief A useful device designed for Jeff
 * 
 * 60-223: Introduction to Physical Computing
 *
 * The following code initializes two ultrasonic distance sensors that serve
 * as the eyes on the back of Jeff's wheelchair. If Jeff backs within
 * a certain distance of an object, the LED strip lights up and
 * the buzzer buzzes. The code also gives Jeff the freedom to
 * adjust the upper-bound distance using a potentiometer.
 *
 * @authors Ethan Lu <ethanl2@andrew.cmu.edu>
 *          Frances Adiwijaya <fda@andrew.cmu.edu>
 *          Gia Marino <gnmarino@andrew.cmu.edu>
 *
 * @mapping
 *  Arduino Pin |   Role   |   Description   
 *  ------------|----------|-----------------
 *      A0         INPUT    Potentiometer
 *      3          INPUT    Buzzer Control Button
 *      4          INPUT    Device Control Button
 *      5          OUTPUT   Buzzer
 *      9          INPUT    ECHO Pin for Right Sensor
 *      10         OUTPUT   TRIGGER Pin for Right Sensor
 *      11         INPUT    ECHO Pin for Left Sensor
 *      12         OUTPUT   TRIGGER Pin for Left Sensor
 *      13         OUTPUT   LED Strip
 */

/** @brief Import libraries */
#include <NewPing.h>
#include <PololuLedStrip.h>
#include <assert.h>

/** @brief Declare constants */
#define POTENTIOMETER_PIN         A0

#define BUZZER_CONTROL_PIN        3
#define CONTROL_BUTTON_PIN        4
#define BUZZER_PIN                5
#define RIGHT_ECHO_PIN            9
#define RIGHT_TRIGGER_PIN         10
#define LEFT_ECHO_PIN             11
#define LEFT_TRIGGER_PIN          12

#define LED_COUNT     60
#define MAX_DISTANCE 200
#define MAX_BUZZ       8

/** @brief Debugging macros */
#define requires(expr) assert(expr)
#define ensures(expr)  assert(expr)

PololuLedStrip<13> led_strip;
NewPing sonar_left(LEFT_TRIGGER_PIN, LEFT_ECHO_PIN, MAX_DISTANCE);
NewPing sonar_right(RIGHT_TRIGGER_PIN, RIGHT_ECHO_PIN, MAX_DISTANCE);
rgb_color colors[LED_COUNT];

unsigned lowerbound = 10;
unsigned upperbound =  0;

void fill(uint8_t r, uint8_t g, uint8_t b);

/**
 * @brief Declare pin modes
 */
void setup() {
  // Echo pins are read by the Arduino; trigger pins are driven high to start a ping
  // (the NewPing library also manages these pins internally).
  pinMode(RIGHT_ECHO_PIN, INPUT);
  pinMode(RIGHT_TRIGGER_PIN, OUTPUT);
  pinMode(LEFT_ECHO_PIN, INPUT);
  pinMode(LEFT_TRIGGER_PIN, OUTPUT);

  pinMode(CONTROL_BUTTON_PIN, INPUT);
  pinMode(BUZZER_CONTROL_PIN, INPUT);
  pinMode(POTENTIOMETER_PIN, INPUT);

  pinMode(BUZZER_PIN, OUTPUT);
}

/**
 * @brief Main routine
 */
void loop() {
  delay(100);
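  // Map the knob position (0-1023) onto the upper detection bound, 60-150 cm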
  upperbound = map(analogRead(POTENTIOMETER_PIN), 0, 1023, 60, 150);

  cleanup();
  if (digitalRead(CONTROL_BUTTON_PIN) == HIGH) {
    unsigned int left_distance = (sonar_left.ping() / US_ROUNDTRIP_CM);
    unsigned int right_distance = (sonar_right.ping() / US_ROUNDTRIP_CM);

    /** Too close to an object */
    while ((lowerbound < left_distance && left_distance < upperbound) || (lowerbound < right_distance && right_distance < upperbound)) {
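      // Stay here (lights on, buzzer pulsing) and keep re-reading both sensors until Jeff is out of range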
      fill(255, 0, 0);
      if (digitalRead(BUZZER_CONTROL_PIN) == HIGH) {
        buzz(min(left_distance, right_distance));
      }
      left_distance  = (sonar_left.ping() / US_ROUNDTRIP_CM);
      right_distance = (sonar_right.ping() / US_ROUNDTRIP_CM);
    }
  }
}

/**
 * @brief     Assigns a new rgb value to every element in the color array
 * @param[in] r Amount of red
 * @param[in] g Amount of green
 * @param[in] b Amount of blue
 */
void fill(uint8_t r, uint8_t g, uint8_t b) {
  for (uint16_t i = 0; i < LED_COUNT; i++) {
     colors[i] = rgb_color(r, g, b);
  }
  led_strip.write(colors, LED_COUNT);
}

/**
 * @brief     Activate the buzzer
 * @param[in] distance
 * @pre       `distance` is non-negative
 * @pre       `distance` is less than `MAX_DISTANCE`
 */
void buzz(unsigned long distance) {
  requires(distance < MAX_DISTANCE);
  int x = MAX_BUZZ - int_log2(distance);

  warn(x);
}

/**
 * @brief     Calculate log2 of an integer
 * @param[in] x
 * @return    log2(`x`)
 * @pre       `x` is non-negative
 */
int int_log2(int x) {
  requires(-1 < x);

  int c = 0;
  if (x == 0) return 1; 
  while ((x >>= 1)) { c++; }
  return c;
}

/**
 * @brief     Run the buzzer a number of times
 * @param[in] x The number of times
 * @pre       `x` is non-negative
 * @pre       `x` is less than or equal to 7
 */
void warn(int x) {
  requires(-1 < x);

  for (int i = 0; i < x; i++) {
    digitalWrite(BUZZER_PIN, HIGH);
    delay(50);
    digitalWrite(BUZZER_PIN, LOW);
    delay(50);
  }
  delay(500);
}

/**
 * @brief Turns the buzzer and led strip off
 */
void cleanup() {
  digitalWrite(BUZZER_PIN, LOW);
  fill(0, 0, 0);
}

Design File

Rhino file that was used to laser cut all the pieces of this device.

Foot Controlled MIDI Instrument by Team Andromeda: Final Documentation
https://courses.ideate.cmu.edu/60-223/f2022/work/foot-controlled-midi-instrument-by-team-andromeda-final-documentation/
Fri, 16 Dec 2022

For our final project, our group was given the opportunity to work with a client with a disability from CLASS (Community Living And Support Services) to develop a personalized assistive device that would improve their life in some aspect. Over the course of seven weeks, we brainstormed, designed, prototyped, and assembled a device tailored particularly to the wants and needs of our client, Teri Owens. Specifically, drawing on her love of music, which we discovered during our interview, we wanted to add to the existing list of assistive musical instruments by creating a MIDI controller operated by foot movements to produce sound electronically. For more details and interview documentation with Teri, please visit this page.

What We Built

Our product is essentially a MIDI (Musical Instrument Digital Interface) controller operated by different foot movements, purposely designed to sit on the foot plate of our client’s wheelchair. There is a roller, which is used to change the pitch of a note or the percussion instrument, and two foot pedals: one pedal plays a note while the other plays a percussion sound. The device connects to a computer via USB, where the user can open a free web-based MIDI synthesizer and select different tracks to change the sound that comes out of the speakers.

 

Front

Side

Back

Angle

Dollar as Size Reference

Testing Our Final Design

Narrative Sketch

It is finally Friday morning, and Teri opens her eyes with excitement as she realizes that today is her spring concert. Without connecting the MIDI foot controller to her tablet, so as not to wake the rest of the house, she practices all the movements from memory, including tapping and rolling, each with its own particular level of precision following a specific beat. Moving her whole body to the beat, she smiles, knowing she has perfected her part of the percussion and notes just as she has practiced a million times before.

Before she knows it, she is on stage in front of an audience where she can spot familiar faces. With a million thoughts running through her mind, thinking only about her part of the concert, she plugs the USB cord into her tablet to start the MIDI controller and selects the track: House 05. As she hears the other percussion instruments start, followed by a five-second pause, she knows it’s her turn to start playing alongside the other percussionist. By using her feet to move the roller away from her, she changes to a higher pitch. Focusing on stepping on the right pedal, she plays the drums perfectly on the beat, and she steps on the left pedal to play the notes exactly from memory. After all her steps and rolls, she waits patiently for her next part to come up: a solo.

How We Got Here

Prototype

This prototype was designed to help answer the design question: how can we create another assistive musical instrument that Teri is able to play? Our prototype was a cardboard structure fitted to our client’s foot plate, with five different components: a foot pedal, a distance sensor, a pressure sensor, and two differently shaped rollers of varied height and width. The cardboard structure was held up by six metal beams for stability.

Final Prototype

Electronics/Interior of Prototype

Prototype Fitted onto Teri’s Chair

Testing the Prototype

Throughout our prototyping process, we had to keep in mind that this was simply that: a prototype. It was difficult to shift our thinking away from creating a finished product on the first attempt, as we were motivated to see it come together quickly and present it to our client during the prototype critique. However, as we worked, we realized the importance of the formative critique: its purpose was to simulate the future product so we could test it and receive additional feedback from our client before completing the final version.

As a result, we were able to experiment with different types of foot-controlled movement, including a foot pedal, a distance sensor, a pressure sensor, and two differently structured rollers, in order to figure out which our client would prefer on the final product. Although the components were not placed strategically on the prototype, the point was to have enough space for our client to test each one and see what would work best for her and what she was capable of doing. In addition, we came up with a base that would lie exactly on her foot plate, measured as precisely as we could, and we tried out different ways to make it structurally sound.

From the critique, we were able to listen to our client and incorporate her suggestions and preferences into our final design. Specifically, we determined which foot movement options Teri preferred and which worked best for her: two foot pedals and one roller (the thinner, taller one). We were able to see visually how much space we needed for each component and optimize the limited space we had on her foot plate, as well as determine the placement of these three items based on which side Teri had more control over and was more comfortable with. We decided on having the foot pedals on the rightmost and leftmost sides, with the roller in the middle between them. Lastly, we were able to see and test the stability of our structure; we settled on using more metal screw rods on the final product, since the steel beams were a bit unstable. We achieved all the goals we set for the prototype critique and answered all the questions that arose during the prototyping process, as well as made clarifications and affirmed our process with our client. There were no surprises, and there was no feedback that we did not implement.

Organizing our Ideas Before Work Mode Was Engaged

Roller from Prototype Connected to Potentiometer (Eventually Switched to Rotary Encoder)

Final Prototype Assembly

Process

From our formative critique, where we got Teri’s feedback on the prototype as it was, we realized we needed to ditch the pressure and distance sensors in favor of a simpler set of mechanisms: one large roller, which would allow her to comfortably move it back and forth with her foot, and two footpads. In an effort to make the design sturdier than our cardboard prototype (which did not hold up), we created a fully fleshed-out version of our design in Fusion 360 to ensure that the laser-cut result would be just as we imagined. And though we did make some last-minute changes when putting the design together, by changing the material and testing it we knew it was going to be a lot sturdier than it was when Teri last tried it. Our design for the roller, however, went through a ton of changes, illustrated by our seemingly never-ending whiteboard diagrams. To hold it up, we created tabbed pieces and made sure everything fit with the rotary encoder. Once it was all put together inside the box, protected on all sides by the interior box we made, we continued to perfect our code and work on the top of the instrument. The footpads required some extra thinking to ensure they were situated comfortably enough for Teri to actually interact with, connected to the Arduino below, and sturdy enough that the movement of her foot would not break the mechanism. To solve this, we created tabbed pieces that would protect the electronics from being crushed and used foam to ensure the footpads did not come down too harshly. The design of the physical instrument was a back-and-forth process that required us to put ourselves in Teri’s place, asking ourselves if each part would be sturdy enough, which expanded our perspective and helped our design.

Development of the Roller Mechanism (using whiteboard tables)

Fitting the Roller Into Place

Using a Saw to Cut Hinges for Footpads

The most involved part of the process was certainly the code. Thankfully, we found an online source that provided detailed instructions on how to use the Arduino Pro Micro to send out MIDI signals (linked here), which helped us greatly. However, as we implemented our code on our prototype, a few unexpected errors came up. It turned out that the instructions from that link were somewhat different from what we needed for our instrument. So instead of selecting the Pro Micro board from the SparkFun AVR family, we selected the original Arduino Micro from the Arduino AVR Boards list, and we were then able to communicate with the board and upload our code. However, after a few changes to our code, an unexpected issue occurred: the board completely stopped communicating with our computers, even after resetting. Thus, we made a rushed decision to switch to the Arduino Uno, which posed a set of new challenges. First, unlike the Pro Micro, the Uno only has a serial output, which makes it incapable of transmitting MIDI signals directly over USB. Therefore, outside of a few code changes, the user would have to install a MIDI-to-serial bridge and a virtual loopback MIDI port for it to function properly. The idea of asking our client to download additional software for our product to function was suboptimal at best, so we came back to the Arduino Pro Micro once more to solve the problem. While we were trying to find an alternative solution using the Arduino Uno, our professor was able to successfully reset the Pro Micro we were using, and this time, with more simplified code and careful inspection of all the electronics we had connected, our board finally started to create MIDI signals! With a bit of modification, our board was able to communicate with the online MIDI synthesizer we had picked out (for its ease of access for our client), Midi City. All of the issues that arose with the physical design and code interrupted the original Gantt chart we had developed to keep ourselves on schedule, but ultimately, with some changes to our schedule, we found enough time to complete our work before the final critique, where we got to share our final MIDI instrument design with Teri.

One of the Coding Errors we Encountered

Original Gantt Chart/Work Plan

Final Design Rendering in Fusion360

Conclusions and Lessons Learned

With the final critique behind us, our team is very proud of the progress we have made since we first met our client with no ideas in mind, and of transforming those first conversations into a tangible product specifically tailored to her. We were able to present our final product to our client, Teri, and have her finally try it, while also receiving feedback from others. Reflecting on that constructive criticism, we heard fresh perspectives that would definitely make great additions and improvements to our product in the future.

Specifically, one commented “[versatility in] fastening onto [a] wheel chair, [or an] application for infants,” which we believe could be a significant addition to our product in the future. Since this project was tailored specifically to our client, we focused on making sure that it would fit Teri’s wheelchair, as opposed to making a universal MIDI foot controller for all users. However, we agree that a more universal design would be very beneficial, as it would expand the range of people who could use it; it would not be constrained only to those in wheelchairs. In addition, we could broaden the target audience to children by making it more accessible, or by changing the format slightly so it would be easier to use or more customizable. Another comment we received concerning the design and structure of our product was “bolts are sticking up,” which was something we didn’t fixate on during this process. However, if possible, we would have liked to make something flatter, with no hardware protruding from the product, so we definitely agree with this observation.

In addition, some other comments included “consider changing the controls—percussion, tone, etc. ” and “I think with that the computer display was better integrated with the roller – like if it looked like a mouse was hovering over the pitch/instrument that you haven’t played yet”. We agree with these comments and could definitely make this assistive instrument more useful with help from someone with a musical background who could figure out a better combination of controls. Furthermore, since we were limited to the free web-based MIDI software available, a big game changer for our product in the future would be creating software from scratch that has all of these capabilities while being more user friendly and compatible with our product.

With this feedback, we were able to see which aspects we excelled in and which parts could be improved to make this product better. Throughout the project, we were excited and grateful to be working with Teri, getting to know her not only as a client but also as a person and learning more about what she loves to do. Having her be a part of our process was crucial to the development of our product, as we were trying to make something she would love and have use for in the future. From our discussions with Teri, we learned so much from and about her that we were able to brainstorm products she would truly enjoy. Aside from the project, we were able to connect over similar hobbies we shared, and in turn this really helped us personalize the project for her.

Overall, we greatly enjoyed this project and would have loved more time allocated to it, both to make more than one final product and to adjust further to our client’s needs and feedback. We were able to connect with Teri, and it was a meaningful experience that we wouldn’t have had if not for this class. As for steps we would take differently, there isn’t much we can say, as this was a memorable experience that exceeded our initial expectations. The only thing we would have liked is more time with Teri and more time to work on the project with her and make it even better. It has definitely motivated us to pay more attention to our products and how they can affect the people we are making them for. We will carry this project with us and keep what we learned in mind when working on future projects.

Technical Details

Schematic and Block Diagram

Code

/* Foot controlled MIDI Instrument
 * 
 * The following code is written to interpret the input that the Arduino Pro Micro takes and turn it into a MIDI signal,
 * which is then sent through the USB port.
 * 
 * Basic function: The system currently consists of two switches and one rotary encoder.
 * When a switch is being actuated, a play signal is sent through the USB port as a MIDI signal
 * When the rotary encoder receives inputs on rotation, the pitch is changed accordingly.
 * 
 * Pin Mapping:
 * Arduino Pin  /  Role   /   Description
 * 2              Input     receive data from the rotary encoder
 * 3              Input     receive data from the rotary encoder
 * 5              Input     left foot pad/switches
 * 7              Input     right foot pad/switches
 * 
 * Ethan Hu, Sharon Li, Francesca Menendez, 2022
 * 
 * Referenced from Gustavo Silveira and Dolce Wang
 */
#include "MIDIUSB.h"  
#include <Encoder.h>

// SWITCHES
const int NSwitches = 2; //total number of switches
const int switchPin[NSwitches] = {5, 7}; //pins for the switches
int switchCState[NSwitches] = {}; // stores the switch current value
int switchPState[NSwitches] = {}; // stores the switch previous value

// MIDI Assignments 
byte midiCh[NSwitches] = {0,9}; // MIDI channel to be used; can be changed based on requirements
byte note = 36; // default pitch
                                            
// debounce
unsigned long lastDebounceTime[NSwitches] = {0};  // the last time the output pin was toggled
unsigned long debounceDelay = 100;    //the debounce time in ms

// Rotary Encoder
Encoder knob(2, 3);

void setup() {
  // Initialize buttons with pull up resistors
  for (int i = 0; i < NSwitches; i++) {
    pinMode(switchPin[i], INPUT_PULLUP);
  }
}

void loop() {
  for (int i = 0; i < NSwitches; i++) {
    switchCState[i] = digitalRead(switchPin[i]); 
    // checking debounce to avoid accidental double tap
    if ((millis() - lastDebounceTime[i]) > debounceDelay) {
      if (switchPState[i] != switchCState[i]) {
        lastDebounceTime[i] = millis();
        // changing pitch based on data from the rotary encoder
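        // knob.read() gives the raw encoder count; every 2 counts shifts the pitch by one semitone, offset from MIDI note 36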
        note=int(knob.read()/2)+36;
        if (switchCState[i] == LOW) {
          // Sends the MIDI note ON
          noteOn(midiCh[i], note, 127);  // channel, note, velocity
          MidiUSB.flush();
        }
        else {
          // Sends the MIDI note OFF by 0 velocity
          noteOn(midiCh[i], note, 0);  // channel, note, velocity
          MidiUSB.flush();

        }
        switchPState[i] = switchCState[i];
      }
    }
  }
}

// Arduino MIDI functions MIDIUSB Library
void noteOn(byte channel, byte pitch, byte velocity) {
  midiEventPacket_t noteOn = {0x09, 0x90 | channel, pitch, velocity};
  MidiUSB.sendMIDI(noteOn);
}

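// noteOff is provided for completeness; the loop above releases notes by sending noteOn with velocity 0, which MIDI treats as a note-off.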
void noteOff(byte channel, byte pitch, byte velocity) {
  midiEventPacket_t noteOff = {0x08, 0x80 | channel, pitch, velocity};
  MidiUSB.sendMIDI(noteOff);
}
Color Sensor by Team Lacerta: Final Documentation
https://courses.ideate.cmu.edu/60-223/f2022/work/color-sensor-by-team-lacerta-final-documentation/
Thu, 15 Dec 2022

Overview:

For our final project, each group was paired with a client from CLASS (Community Living and Support Services), a “nonprofit organization that offers different services to individuals of varying abilities”. Our client is Bill, who has achromatopsia, a lack of cones in the eyes; he is legally blind and 80% colorblind. For information on our interview with him, click here.

What We Built:

During our initial interview, we found that Bill has trouble picking out matching outfits in the morning because of his colorblindness. To make this process more convenient for him, we proposed the following product: a gadget that senses the red, green, and blue values of a fabric or piece of clothing when a button is pressed, and prints out the color name and the values for the user. Note: red, green, and blue values are between 0 and 255 and represent how much of that color is present; for example, pure red would be 255, 0, 0.

Final Overall Image

Final Overall Image of Our Project.

 

Detail Photos

Detail #1: Button for Scanning, A Light Switch, and Power.

 

Detail #2: Laser-cut high-contrast labels so our client can easily identify what each button does.

 

Detail #3: 9V Battery Pack so the device can be used unplugged.

 

Detail #4: RGB Sensor cased in a 3D printed housing. White paper placed on the inside so it reflects the light and gets a better reading.

 

Detail #5: Adafruit EINK display to provide customizability and high contrast for our client to be able to easily read despite his issues with vision.

 

Final Working Video

Note: our display does not update until the button is pressed, so it continues to display its previous scan. This is why it initially displays teal. Furthermore, due to the limitations of the E-Ink display, it does take some time to update.

Narrative Sketch of How it Would Be Used

Bill wants to decide on his outfit for the day, but he isn’t sure which colors he has in his wardrobe. He places a shirt on a flat surface, puts the device on top of it, and presses the button. The screen updates, saying the shirt is “dark green”. He then grabs a pair of pants to do the same. After scanning the pants, the display reads “light orange”. Through his work with spreadsheets, Bill has learned which colors go well together and how to decipher RGB values, so he knows light orange and dark green don’t match. He grabs another pair of pants, and after scanning, the display reads “tan”. He thinks to himself “perfect!” and leaves for his day knowing that his outfit goes well together.

How We Got Here – Prototype & Process

Prototype

Our prototype was designed to help answer the design question: what are our boundaries for the colors? For example, at what point is green different from teal, and teal different from blue? We also wanted to make sure Bill was happy with the new display we had chosen (E-Ink), because he can’t see the traditional LCD screen colors (blue/white or green/black) very well.

Through our prototype, we wanted to establish that the basic part of the device would function: using the TCS34725 color sensor to read colors. We converted the RGB (red, green, blue) readings from the sensor to HSV (hue, saturation, value) to make it more convenient for us to interpret and set the color boundaries. For the prototype, we focused on naming the hue (main color) correctly, since the saturation and value just add adjectives to that (light/dark). The boundaries were determined visually using hue charts found online. We ended up with 9 main colors: pink, red, orange, yellow, green, teal, blue, purple, and magenta.
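For readers who want the conversion itself, here is a minimal sketch of the RGB-to-HSV step, following the same GeeksforGeeks approach cited in the code's reference list; the exact implementation in the device may differ slightly.

/* Minimal RGB -> HSV conversion sketch. Assumes r, g, b are already scaled to 0-1;
 * hue comes out in degrees [0, 360), saturation and value as percentages [0, 100]. */
void rgbToHsv(float r, float g, float b, float &h, float &s, float &v) {
  float cmax = max(r, max(g, b));
  float cmin = min(r, min(g, b));
  float diff = cmax - cmin;

  if (diff == 0)        h = 0;                                    // grayscale: hue is arbitrary
  else if (cmax == r)   h = fmod(60 * ((g - b) / diff) + 360, 360);
  else if (cmax == g)   h = fmod(60 * ((b - r) / diff) + 120, 360);
  else                  h = fmod(60 * ((r - g) / diff) + 240, 360);

  s = (cmax == 0) ? 0 : (diff / cmax) * 100;
  v = cmax * 100;
}

void setup() {
  Serial.begin(9600);
  float h, s, v;
  rgbToHsv(210 / 255.0, 180 / 255.0, 140 / 255.0, h, s, v);   // a tan-ish RGB as a test
  Serial.print(h); Serial.print(' ');                          // prints roughly 34 33 82
  Serial.print(s); Serial.print(' ');
  Serial.println(v);
}

void loop() {}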

Prototype Images:

This is the final overall image of our prototype.

Zoomed out photo of the prototype being tested with three different colored papers (Green, Pink, Orange).

For our prototype the color sensor was loose, so we had to hold it against articles of clothing or sheets of colored paper.

Our Prototype Working:

Findings, Feedback, and Surprises from our Prototype

As for the answers to our questions, the prototype was successful in terms of finding the right boundaries for each color, so we moved on to incorporating saturation and value into the code as well. Bill was happy with the higher contrast of the E-Ink display, but he preferred a black background with white text for maximum contrast, which was the opposite of the prototype.

Successfully converting the background color to black and text color to white on the screen

During our prototype critique we only had one scan button for the sensor, but after getting feedback we added another switch for the sensor light, so that the client can turn the light on or off depending on the situation. Additionally, we were advised to place the sensor inside the box so that it would be less affected by outside light sources, so we 3D printed a case for it. We also added a power switch because we planned to use batteries to make the device more portable. Bill picked the placement of the switches, display, and sensor, and he also picked the color and material of the box.

First draft of the laser-cut box after the prototype meeting

During our prototype critique session, we asked Bill if he would like audio that reads the color out loud, a list of colors that go well with the detected color, or specific names for the grayscale colors white, gray, and black. However, he wanted the design to be kept simple, with just the color sensing part working. He can also read RGB values, so he could tell the shade of a grayscale color even if we were to just label it “grayscale.”

Our biggest surprise was that Bill understands how to interpret RGB, which we think is less intuitive than HSV. He says it’s because he has made presentations before and picks colors for those by inputting RGB directly, so he has figured out how it works. So we added RGB to the display to give him more detailed information to work with, since a whole group of HSV inputs all map to the same color description.

A close up photo of the screen displaying both the color name and its RGB values.

Process

Let’s start off with our smaller takeaways. We were able to increase the font size and invert the display colors by reading the documentation for Adafruit GFX, since it is shared with the E-Ink display. Similarly for the sensor sensitivity: once we put the sensor in the box, the environment became darker (even with the light on), so it was reading too many things as dark colors. By increasing the sensitivity (gain) and the reading time allotted, we were able to make it more accurate.
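As a hedged illustration of that sensitivity adjustment, the Adafruit TCS34725 library exposes gain and integration-time settings like the following; the specific constants here (154 ms, 4x) are placeholders rather than the exact values used in the device.

// Sketch of raising the TCS34725's sensitivity for a dark enclosure:
// a longer integration time plus a higher gain yields larger raw readings.
#include <Wire.h>
#include "Adafruit_TCS34725.h"

Adafruit_TCS34725 tcs = Adafruit_TCS34725(TCS34725_INTEGRATIONTIME_154MS,
                                          TCS34725_GAIN_4X);

void setup() {
  Serial.begin(9600);
  if (!tcs.begin()) {
    Serial.println("TCS34725 not found");
    while (1);
  }
}

void loop() {
  uint16_t r, g, b, c;
  tcs.getRawData(&r, &g, &b, &c);   // raw channel counts; larger with more gain/time
  Serial.print(r); Serial.print(' ');
  Serial.print(g); Serial.print(' ');
  Serial.println(b);
  delay(200);
}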

Software

For any hue, we can vary the saturation and value to get a large range of other colors; specifically, how light or dark the color is, as you can see in the picture below.

Example of HSV with hue = 0, and saturation and value from 0 to 100, which become the x and y axis of a coordinate plane.

Thus we have a nice coordinate system (the x axis is saturation, the y axis is value) to which we can apply standard geometry. The first thing we noticed was the grayscale along the left edge and bottom: what we perceive as an actual color roughly fits within this boundary.

Boundary drawn to separate the colors vs grayscale (looks like a quarter circle).

This boundary is a quarter circle, with its center at (100, 100) and a radius of about 98 (found after testing). So our first step is to figure out whether a point is within the circle or not. If it is, then it has a discernible color that we should categorize further. Otherwise, it’s grayscale and we can just display that.

A point (x, y) is within the circle if this inequality holds: (x - 100)² + (y - 100)² ≤ 98²

(This is just the standard circle equation with the center and radius named above.) So we plug in our given saturation (x) and value (y). If the point is within the circle, we need to categorize it further as light or dark. A “medium” category would be redundant, since we usually think of a medium-level color as the standard color, so it gets no adjective.
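A minimal sketch of that grayscale test, assuming sat and val are integers in [0, 100] (the function name is ours, not the one in the project code):

// A point outside the quarter circle centered at (100, 100) with radius 98
// has no discernible hue, so it gets reported as grayscale.
bool isGrayscale(int sat, int val) {
  long dx = sat - 100;
  long dy = val - 100;
  return (dx * dx + dy * dy) > 98L * 98L;   // outside the circle means grayscale
}

void setup() {
  Serial.begin(9600);
  Serial.println(isGrayscale(5, 40) ? "grayscale" : "color");   // near the left edge: grayscale
}

void loop() {}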

Boundaries drawn to give light/medium/dark options (look like pie slices).

The boundary cutoffs for light/dark look like pie slices. Each pie slice takes up some amount of angle space in the circle. Recall that a circle has 360° or 2π radians (equivalent forms of measurement) in it.

Circle in degrees and radians.

So we can tell which “pie slice” a point is in by looking at the angle the point makes. For example, if the point’s angle was 200° , we’d know it would belong to the pie slice on the left, just below the horizontal line (all angles between 180° and 225° belong to that pie slice).

How do we determine the angle? We use the function atan2(y, x), which takes in the x and y coordinates and returns the angle in radians, in the range [-π, π]. Positive angles are measured counterclockwise and negative angles clockwise, so for example 7π/4 is the same angle as -π/4, 3π/2 is the same as -π/2, and π is the same as -π. The bottom half of the circle is basically the top half reflected over the x-axis, with an added negative sign.

Note: the x and y passed to atan2 aren’t saturation and value directly; instead x = sat - 100 and y = val - 100, because atan2 assumes the coordinates are relative to a circle centered at (0, 0), while our circle is centered at (100, 100), so we offset them for the math to be correct. Finally, we used boundary angles of [-π, -7π/8] for light and [-5π/8, -π/2] for dark, which we found through testing.

One final caveat for angles: if (x, y) happens to be at the exact center of the circle, then it doesn’t really have an angle, so we know it’s a pure color rather than light or dark and we don’t bother calculating the angle.
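Putting the offset and the angle cutoffs together, the light/dark classification can be sketched as follows; this is a simplified stand-in for the project's checkSatVal function (which also handles the grayscale case), and the helper name is ours.

// Light/dark classification by pie slice. Assumes the point already passed the
// "inside the circle" test above. Angle cutoffs [-pi, -7pi/8] (light) and
// [-5pi/8, -pi/2] (dark) are the ones found by testing; medium returns "".
String lightOrDark(int sat, int val) {
  float x = sat - 100;               // shift so the circle center (100, 100) becomes (0, 0)
  float y = val - 100;
  if (x == 0 && y == 0) return "";   // exact center: a pure color, no angle to compute
  float a = atan2(y, x);             // radians in (-pi, pi]
  if (a > 0) a -= 2 * PI;            // fold +pi onto -pi so the bottom-half ranges apply
  if (a <= -7 * PI / 8) return "light";
  if (a >= -5 * PI / 8 && a <= -PI / 2) return "dark";
  return "";
}

void setup() {
  Serial.begin(9600);
  Serial.println(lightOrDark(20, 95));   // low saturation, high value: prints "light"
}

void loop() {}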

 

Brown/tan time! We want to include these colors because they are in Bill’s wardrobe. However, brown and tan aren’t among the main hues, because you can’t tell whether something is brown or tan just by looking at the hue. Instead, brown is a specific combination: a hue in the red-through-yellow range, with more medium saturation/value ranges. Tan is a small chunk of the brown range.

The pattern is that the browns appear in this “pie crust” section.

Concentric circles with different radii: the in between section is “brown” (looks like pie crust).

We have two concentric circles (sharing the same center point) but with different radii, and the band between them is the pie crust. We can determine whether a point is in the pie crust by calculating its distance from the center of the circles: r = √(x² + y²), where again x = sat - 100 and y = val - 100 to adjust for the center of the circle not being at (0, 0). We settled on 60 and 75 for the smaller and larger radii, determined visually.

Tan is a specific shade of brown, with HSV = (34, 33, 82), which we found online. But those exact coordinates are unlikely to be measured, and coordinates that are “close enough” can still be visually interpreted as tan. We decided “close enough” in this case is within ±10 in the saturation/value plane. So we use the radius equation to check whether we are within the radius of the tan circle (10), where x = sat - 33 and y = val - 82 to adjust for the circle’s center.

Tan location is the dot. Sufficiently close HSV can still be called “tan”, so the circle around it is the region where everything inside is “tan”.
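A minimal sketch of the brown and tan checks, again with our own helper names; only the saturation/value part is shown here, and the separate hue-range check (red through yellow) is assumed to have happened already.

// Brown lives in the "pie crust" band between two circles centered at (100, 100)
// with radii 60 and 75; tan is a small circle of radius 10 around sat = 33, val = 82.
bool isBrown(int sat, int val) {
  float x = sat - 100;
  float y = val - 100;
  float r = sqrt(x * x + y * y);    // distance from the shared circle center
  return (r >= 60 && r <= 75);      // inside the band between the two circles
}

bool isTan(int sat, int val) {
  float x = sat - 33;               // tan's saturation/value found online: (33, 82)
  float y = val - 82;
  return sqrt(x * x + y * y) <= 10; // "close enough" to the tan point
}

void setup() {
  Serial.begin(9600);
  Serial.println(isTan(35, 80) ? "tan" : "not tan");   // a nearby reading still counts as tan
}

void loop() {}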

The overall steps:

  1. Check light/dark/grayscale
  2. If not grayscale, tan/brown, hue
  3. Print all relevant information

Reducing Memory & Transitioning to Arduino Micro

We chose the Arduino Micro because it would be smaller and more compact inside the box. However, that also means less memory for the code: the original code took up about 106% of the Micro's total space, even though there had been plenty of space left over on the Arduino Uno. The main culprits were data types and Strings. For instance, variables were int or double, which can represent a very large range of positive and negative values as well as decimals, but we know saturation and value are in [0, 100]. Thus, we can reduce the saturation and value data types to byte (which can represent [0, 255]) and take up a lot less space than int. The same goes for some of the x, y, and radius variables used earlier.

The other strategy was to decrease the length of strings, or to delete Serial.print statements (since those aren't shown on the screen anyway). For example, the checkSatVal function, which names whether a color is light/dark (or medium or grayscale), used to return "grayscale", and the main function would check for that return value to decide whether to print the hue and other information. Since "grayscale" is just being used as a flag, we can reduce the return string to "g" and check for "g" instead, saving some character space. We did a similar thing with "medium": by rearranging the if-statement logic, the function can return an empty string, since we weren't planning to print "medium" either way, and the empty string actually makes it easier to concatenate the display output later. With these modifications we now use 95% of the storage space.
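A small, simplified illustration of those two tricks (byte-sized variables and one-character flag strings); the real checkSatVal also performs the light/dark pie-slice checks described earlier, and this signature is hypothetical.

// Simplified illustration of the memory-saving changes: byte instead of int for
// values known to fit in [0, 100], and a one-character flag instead of "grayscale".
byte sat = 0;   // saturation, 0-100 (a byte holds 0-255)
byte val = 0;   // value, 0-100

String checkSatVal(byte s, byte v) {               // simplified, hypothetical version
  long dx = (long)s - 100, dy = (long)v - 100;
  if (dx * dx + dy * dy > 98L * 98L) return "g";   // shortened grayscale flag
  // ... light/dark pie-slice checks omitted ...
  return "";                                       // "medium": an empty string concatenates cleanly
}

void setup() {
  Serial.begin(9600);
  Serial.println(checkSatVal(5, 40));   // prints "g" for a grayscale reading
}

void loop() {}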

Hardware:

For hardware changes, we transitioned the project from being breadboarded to being 100% soldered. We largely did this with screw-in headers, which let us keep making changes and move wires around to troubleshoot issues.

Fully soldered circuit for our final project

Soldered backboard for E-INK so it can be held in place and properly wired while still allowing for us to remove the screen for measurements

One of the main issues we ran into with soldering was getting the screen to work properly. Troubleshooting the E-Ink display was difficult, as it was hard to tell whether the problems were with the wiring, the code, or the screen itself. The screen only updates when it is given a signal, and at the start even the demo code (the example code provided with the library) wasn’t working over the soldered connections. After many hours of troubleshooting and re-soldering joints to make them more solid, the screen was finally able to run the demo code, which is shown in the video below.

We also managed to get the device battery powered. This was difficult, as the screen didn’t seem to want to boot properly off battery power alone, which led to our slightly wacky solution: to boot up, the device needs to be plugged into a computer, but if you then flip the power switch and unplug it, it will work perfectly until it is powered off again. We are still slightly unsure exactly why this is, but after doing research on E-Ink displays we noticed that they seem to need AC power or a computer to boot properly, while the battery simply provides DC power. This is why the battery couldn’t boot the screen but could keep it running after it was already booted.

Design Process:

The box design was finalized after our prototype meeting, because we got to ask Bill where he would like the buttons located, the type of buttons, the material of the box, and other features he might want, like a handle or a hook. As for the buttons, we ran some test trials to see which worked better. We decided to label the buttons because there were three on the top side, all for different purposes. He did not mind too much about the material of the box, but said something like black could work. He planned on keeping the device on a shelf near his closet, so he did not need a hook or a handle. The back side of the device is also removable by screws, so if there happen to be any hardware issues or a part breaks, we can easily remove the face and make changes.

The sensor holder was 3D printed instead of laser cut because it was so small. We initially decided on black plastic for the material, but after testing the sensor we found that, due to the darkness of the holder walls, the color values read darker than their actual values. Therefore, we lined the inside of the holder with white paper walls so that the LED on the sensor board would have a surface to bounce light off of.

Front View of our 3D printed color sensor housing.

Back view of our 3D printed sensor housing

A preliminary model of our housing to test if all the electrical components fit properly.

Schedule

As expected, we got the basic color coding done early and kept making minor adjustments as we put the device together and started adding components like buttons, the battery, etc. The box design was also done on time, but adjusting the measurements took longer because there were a lot of parts cased inside the box that needed to fit without breaking. For the hardware, soldering was done on time; however, it took longer than expected because adding the external power source caused some issues with our screen, as we explained in the hardware section of our process.

Conclusion

 

We learned a lot from interacting with Bill and the other clients from CLASS. Overall we gained a much bigger appreciation for people living with disabilities and how influential and life changing technology is in their lives.

If we were to redo this project, we would definitely choose a different screen. During our feedback session, we received a lot of comments on the “time delay for color identification.” Although we checked with Bill during our prototype meeting and he said he didn’t mind the delay, it was a component we wanted to fix in future designs. The E-Ink display provides the high contrast we needed; however, a screen with similar contrast that doesn’t take as long to update would speed up the process of scanning and allow us to display more feedback to the user.

We think most of the problems we faced with the nature of the device, such as the “flashing lights” and requiring “AC power”, could be fixed simply by finding a different screen to work with. However, because these problems were discovered during the process of creating the device, it was difficult to start from scratch with a new screen when we didn’t have all the other components ready. Therefore, if we were to take this project further, we would like to get a variety of screens, plug each one into our now well-built system, and test which screen does the task most effectively.

Something else we didn’t consider was turning on the light only when the button is pressed. We had a light switch, but in hindsight that switch was largely unnecessary, since the encasing completely closes the scanner off from external light; with the light off, it simply reads black. During our in-person feedback, one comment about how to modify this feature was to add a “photocell or contact switch to know if it’s on a surface to trigger the light” of the sensor.

A simple change that we implemented immediately after receiving feedback came from the observation about the “sharp edges”. It was the last finishing detail of the box that we completely blanked on, and we agreed that it is definitely ideal to sand down the acrylic edges so that the box is more comfortable to hold.

There are definitely other factors that could be modified, such as adding a “sound” or a “handle”, but we really wanted this project to be about Bill and what he wanted from the device. These ideas did come up in our prototype meeting, but Bill wanted to keep the device simple and just able to read color, so that is what we created. Our main goal was not only making a successful output, but creating something that might actually be helpful for Bill and something he will use on a daily basis. We are so thankful that we could work with Bill for this project, and we learned a lot from him; we really wanted to make a positive contribution to Bill’s life and just make him happy. 🙂

 

Block Diagram & Schematic

 

Electrical Schematic of our Color Sensor

 

Block Diagram for our Project

Code

/*
Color Sensor by Team Lacerta
by Jonathan Lindstrom, Sarah Yun, Freda Su

Reads the color as RGB using the TCS34725 sensor, then converts it to HSV for easier interpretation.

Hue can easily be divided into sections because a certain range of hue goes with a certain color.
However, the sat/val (which decide light/dark/grayscale) don't have straight cutoffs, so we use circle
math to divide the sat/val plane into pie slices (for the light/medium/dark adjective); the sat and val
coordinates determine which slice a reading lands in. If it lands outside the pie, it must be grayscale. We
also manually check for brown/tan since that's a color in Bill's wardrobe that doesn't naturally occur
in the hue spectrum (its hue is red to yellow, with a sat and val that ends up on the crust of the pie).

Finally, we send all of this info to the eink display, which is only updated when the button is pressed
to prevent it from constantly refreshing, which is bad for the lifespan. We also have a light switch to
help control the power use if he wants to leave the power on for a while.

pinouts reference: https://learn.adafruit.com/adafruit-2-13-eink-display-breakouts-and-featherwings/pinouts

    Name     | Arduino pin |       Sensor Pin      | description
 ------------|-------------|-----------------------|----------------------------------------------------
  EPD_CS     |      9      |  ECS on EINK          | E-Ink Chip Select, required for controlling the display
  EPD_DC     |      10     |  EDC on EINK          | Data/Command pin, required for controlling the display
  SRAM_CS    |      6      |  SRCS on EINK         | SRAM Chip Select, required for communicating with the onboard RAM chip.
  EPD_RESET  |      8      |  RST on EINK          | This is the E-Ink Reset pin, can set to -1 and share with microcontroller Reset
  EPD_BUSY   |      7      |  BUSY on EINK         | this is the e-Ink busy detect pin, and is optional if you don't want to connect the pin
  BUT        |      4      |  Push Button          | Used to tell the screen when to display a color
  LIGHT      |      A0     |  LIGHT on RGB Sensor  | Used to be able to switch this light on/off to conserve power
  SWITCH     |      A2     |  Slide Switch         | Used to tell the light when to turn on/off
  SCK        |      15     |  SCK on EINK          | SPI Clock Pin required for EINK and SRAM
  MOSI       |      14     |  MOSI on EINK         | SPI Microcontroller Out Serial In pin, it is used to send data to SRAM and e-Ink display
  MISO       |      16     |  MISO on EINK         | SPI Microcontroller In Serial Out pin, it's used for the SRAM
  SDA        |      2      |  SDA on RGB Sensor    | Used to provide SDA connection to Adafruit Color Sensor
  SCL        |      3      |  SCL on RGB Sensor    | Used to provide SCL connection to Adafruit Color Sensor


documentation/resources referenced:
https://www.geeksforgeeks.org/program-change-rgb-color-model-hsv-color-model/?ref=gcse
https://learn.adafruit.com/adafruit-gfx-graphics-library
https://learn.adafruit.com/adafruit-eink-display-breakouts/overview
https://learn.adafruit.com/adafruit-color-sensors/library-reference
*/

#include <Wire.h>
#include "Adafruit_ThinkInk.h"
#include "Adafruit_TCS34725.h"
#include "Adafruit_GFX.h"
#include "Fonts/FreeSans18pt7b.h"

#define EPD_CS 9
#define EPD_DC 10
#define SRAM_CS 6
#define EPD_RESET 8
#define EPD_BUSY 7
#define BUT 4
#define LIGHT A0
#define SWITCH A2

//Makes it so the screen only tries to update once every time the button is pressed
bool buttonOnce = false; 

ThinkInk_213_Mono_BN display(EPD_DC, EPD_RESET, EPD_CS, SRAM_CS, EPD_BUSY);
Adafruit_TCS34725 tcs = Adafruit_TCS34725(0x00, TCS34725_GAIN_16X);
//0x00 = 700ms integration time, 16x gain (sensitivity of sensor, adjust to environment)


void setup() {

  pinMode(BUT, INPUT);
  pinMode(LIGHT, OUTPUT);
  Serial.begin(115200);
  while (!Serial) {
    delay(10);
  }
  //Set up EINK display
  display.begin(THINKINK_MONO);
  if (tcs.begin()) {
    Serial.println("Found");
  } else {
    Serial.println("No TCS34725 found ... check your connections");
    while (true)
      ;
  }
 
  display.setTextColor(EPD_WHITE);
  display.setFont(&FreeSans18pt7b);
  digitalWrite(LIGHT, HIGH);
}

void loop() {
  display.clearBuffer();
  
  //Set up color Sensor
  uint16_t r, g, b, c, colorTemp, lux;
  tcs.getRawData(&r, &g, &b, &c);
  colorTemp = tcs.calculateColorTemperature_dn40(r, g, b, c);
  lux = tcs.calculateLux(r, g, b);

  r = r >> 8; //Keep the high byte of each 16-bit channel (convert to 8-bit RGB)
  g = g >> 8;
  b = b >> 8;
  rgb_to_hsv(r, g, b);
}

//convert rgb to hsv, calls helper fxn to compute color and adjectives to display, along w rgb
void rgb_to_hsv(uint16_t r2, uint16_t g2, uint16_t b2) {

  double r = r2 / 255.0;
  double g = g2 / 255.0;
  double b = b2 / 255.0;

  // h, s, v = hue, saturation, value
  double cmax = max(r, max(g, b));  // maximum of r, g, b
  double cmin = min(r, min(g, b));  // minimum of r, g, b
  double diff = cmax - cmin;        // diff of cmax and cmin.
  double h = -1, s = -1;

  // if cmax and cmin are equal then h = 0
  if (cmax == cmin) {
    h = 0;
  }

  // if cmax equal r then compute h
  else if (cmax == r) {
    h = fmod(60 * ((g - b) / diff) + 360, 360);
  }

  // if cmax equal g then compute h
  else if (cmax == g) {
    h = fmod(60 * ((b - r) / diff) + 120, 360);
  }

  // if cmax equal b then compute h
  else if (cmax == b) {
    h = fmod(60 * ((r - g) / diff) + 240, 360);
  }

  // if cmax equal zero
  if (cmax == 0) {
    s = 0;
  } else {
    s = (diff / cmax) * 100;
  }
  // compute v
  double v = cmax * 100;

  if (digitalRead(SWITCH))
  {
    digitalWrite(LIGHT, HIGH);
  }
  else
  {
    digitalWrite(LIGHT, LOW);
  }

  while (digitalRead(BUT)) {
    buttonOnce = true;
  }
  if (buttonOnce) {
    display.fillScreen(EPD_BLACK);
    String sn = checkSatVal(s + .5, v + .5);  //crude rounding function
    display.setCursor(0, display.height()/2 - 10);  //have more space default 1 line
    if (sn == "g")
    {
      display.println("grayscale");
    }
    else
    {
      String brown = checkBrown(h + .5, s + .5, v + .5);
      String color = pickColorHue(h + .5);
      
      if (brown.length() + sn.length() + color.length() <= 16)  //max num char per line
      {
        display.print(sn);
        display.print(brown);
      }
      else  //reformat to be 2 lines (dont overflow characters)
      {
        display.setCursor(0, display.height()/3 - 10);  //top left corner ish
        display.print(sn);
        display.println(brown);
        display.setCursor(0, display.getCursorY() - 5);
      }
      display.println(color);
    }

    //rgb display under the color name
    display.setCursor(0, (display.height() + 100) / 2);
    display.print("(");
    display.print(r2);
    display.print(", ");
    display.print(g2);
    display.print(", ");
    display.print(b2);
    display.print(")");
    display.display();
    buttonOnce = false;
    delay(5000);  //prevent fast button presses to protect screen life
  }
}

//determine "base" color
String pickColorHue(int hue) {
  if (hue >= 0 && hue <= 9) {
    return ("pink");
  } else if (hue <= 30) {
    return ("orange");
  } else if (hue <= 70) {
    return ("yellow");
  } else if (hue <= 160) {
    return ("green");
  } else if (hue <= 200) {
    return ("teal");
  } else if (hue <= 255) {
    return ("blue");
  } else if (hue <= 300) {
    return ("purple");
  } else if (hue <= 340) {
    return ("pink");
  } else if (hue <= 350) {
    return ("magenta");
  } else if (hue <= 360) {
    return ("red");
  } else {
    return ("");  //wont happen bc hue range 0 to 360
  }
}

//check for brown specifically bc not in main colors of hue spectrum 
String checkBrown(int hue, byte sat, byte v) {
  if ((0 <= hue) && (hue <= 50)) {
    //brownish area, also check for tan area
    byte tanRad = 10;  //determine radius visually online
    //core tan color: (34, 33, 82)
    if ((24 <= hue) && (hue <= 44))  //hue is within +/-10 of tan hue
    {
      //circle around core color
      int x = sat - 33;  //saturation = x
      int y = v - 82;    //val = y
      byte r = sqrt(pow(x, 2) + pow(y, 2));
      if (r <= tanRad) {
        return "tan ";
      }  //otherwise, fall thru to normal brown case
    }

    byte lowRad = 60; //boundaries for the crust of pie
    byte highRad = 75;
    int x = sat - 100;  //saturation = x
    int y = v - 100;    //val = y
    byte r = sqrt(pow(x, 2) + pow(y, 2));
    if ((lowRad <= r) && (r <= highRad))  // in "crust" of pie
    {
      return "brown ";
    } else {
      return "";
    }
  } else {
    return "";
  }
}

//determine light/dark/grayscale for color
String checkSatVal(byte sat, byte v) {
  int circleEq = pow((sat - 100), 2) + pow((v - 100), 2);

  byte rad = 98;  //inside pie radius

  if (circleEq <= (pow(rad, 2))) {
    int x = sat - 100;  //saturation = x
    int y = v - 100;    //val = y
    double angle = atan2(y, x);
    byte r = sqrt(pow(x, 2) + pow(y, 2));
    //edge case: not in a pi slice if in the corner
    if ((r == 0) || ((-7 * M_PI / 8 <= angle) && (angle <= -5 * M_PI / 8))) return "";  //just pure color
    if (angle < -7 * M_PI / 8)                                                          // between -pi and -7pi/8
    {
      return "light ";
    } else  // between -5pi/8 and -pi/2
    {
      return "dark ";
    }
  } else {  //outside of pie: grayscale

    return "g";  //flag to remove for display: dont put adjectives for white/gray/black
  }
}

 

Link for Files:

DXF files for the box and 3D model for the sensor housing: Files

]]>
Emotional Display by Team Vela: Final Documentation https://courses.ideate.cmu.edu/60-223/f2022/work/emotional-display-by-team-vela-final-documentation/ Mon, 12 Dec 2022 09:05:35 +0000 https://courses.ideate.cmu.edu/60-223/f2022/work/?p=16856 In this project, we as a team worked together to create a device to improve the life of a physically disabled person living in Pittsburgh. Crucially, the device is tailor-made to our client: useful and relevant for him in particular, and driven by his wants and needs, nobody else’s. Designing over the course of seven weeks with our client Dennis, we conducted a needfinding interview and distilled it into a concept for his device. (The notes from that interview can be found here: https://courses.ideate.cmu.edu/60-223/f2022/work/team-vela/)

 

What We Built

We created a light-up emotional display, allowing Dennis to show 5 different emotions ranging from happy to sad. These emotions correspond to five distinct light colors, from green (happy) to red (sad). These emotions and lights are controlled via a dial on an accessible control panel. Additionally, if Dennis has a question, he can press a button on his control panel, turning the lights purple. Finally, if Dennis has an emergency and needs to get someone’s attention, he can flip a two-part switch to turn the lights red and play a noise. 

Final prototype with face display and control box.

Side profile of face plate, showing the three acrylic layers, screw attachments, LEDs, and wiring.

Heat shrink-wrapped wires and exit hole from the back of the face plate.

Control box. From top to bottom: red emergency switch and dial for noise volume, blue button (toggling the question mode) and on/off switch, yellow button (toggling emotions mode) and dial for emotional status.

Emergency switch with the cover pushed back, primed to be flipped.

Textural “breadboard” feature, as requested by Dennis.

Close-up of emotions mode toggle button and emotional status dial.

Narrative Sketch

Dennis is out to lunch in Shadyside with some of his friends and staff from CLASS. His company is good, the food tastes great, the weather is nice, and he is having an excellent time. So, Dennis flips the on switch to his emotions display and turns the dial to show to others that he is happy being out to lunch with everyone. In addition to informing the people with him, the bright green lights and smiling face clearly display positive emotion, encouraging the strangers at the restaurant to approach Dennis to say hi—an opportunity to make new friends! After a great conversation, Dennis goes back to his lunch. 

Later, Dennis decides that his sandwich is a little bland and he would like to put some salt on it. Unfortunately, he is unable to reach across the long table to the salt shaker. Dennis presses the button on his control panel to indicate that he has a question. The display lights up purple and displays a neutral face. A light on his control panel displays this purple as well, informing the people at the table that Dennis would like some help. They fortunately notice it quite quickly. Dennis is able to ask for the salt for his sandwich, which is passed to him. 

By using the emotional face display, Dennis has been able to make himself both more approachable and better understood. Furthermore, it is able to ease some of his social frustrations by letting the people around him know quickly and efficiently when he has an issue.

 

How We Got Here (Prototype and Process)

This prototype was designed to help answer the design question: How can we help Dennis communicate with the world better?

Prototype

Our prototype was a light-up emotional display, capable of showing five different “emotions” through shades of color across three faces. In addition to the emotions mode, the display has two other modes: question, which adds another color, and emergency, which also plays a noise.

Our single prototype was built up gradually as we prototyped its individual parts, eventually coming together into the final construction.

Sketch of control panel and other accessories.

Three of our cardboard cupholder prototypes.

First sketch of face plate model with potential placement of LEDs. These were later significantly reduced.

First laser cut of the control panel, with potential buttons and switches inserted.

Second laser cut of control panel, this time out of wood, with new buttons and switches inserted.

 

Measurements taken to plan integration and development of the cup holder.

 

Development process and testing of the wiring/circuitry.

 

Testing of soldered circuitry and integration with PCB board.

 

As we worked through our prototyping process, we wanted to be cognizant of creating a device that Dennis could use easily, taking into account his limited hand stability and mobility due to arthritis. This was particularly true for the control panel. Although it is the less flashy part of the design, the control panel is ultimately how Dennis experiences the device: if the panel couldn’t be used, neither could the device as a whole.

In a subsequent meeting with Dennis to review our prototype, we came away with a lot of helpful feedback. Dennis was happy with how the device was coming along and with the progress we had made thus far, but it was not perfect. By talking with Dennis, we found that it’s easier for him to have larger buttons with slightly more space between them. Tied to this, we found that a slightly larger overall control panel would work better. Originally, we were trying to create the smallest possible panel in hopes that it would get in the way of Dennis and his wheelchair as little as possible, but we decided that Dennis’ comfort in using the panel was well worth a slightly larger design. Importantly, Dennis was also able to quickly pick up on the interface we had designed. As adjacent feedback, we asked Dennis how he would like his device to look. He was happy with our physical models, but did specifically request that his device be orange and potentially include some Steelers decals. This was inspired by another wheelchair user he had met who had pink accessories to match her pink wheelchair. Finally, we asked Bill, who helps care for Dennis at CLASS, whether the noise for the “emergency” mode would become disturbing. He didn’t think so, so the noise carried through to our final design.

Overall, we incorporated all of the feedback we received from Dennis and Bill into our final prototype. The critique—both negative and positive—truly helped to inform our total design process and steer us in the right direction for our final implementation. Seeing the delight created by the device as a whole was a great driver for the next parts of the process. 

Process

A very stress-free process photo, including first iteration cardboard control board and Remy testing code.

Final 3D print of Dennis’ cup holder.

Our process, as many processes often are, was wrought with things not quite working out the way we were planning. We started by following the Gantt chart closely, but that quickly fell to the wayside—to our detriment. We were on track for the CAD designs and electronics programming, but fell behind beginning with the circuit build. 

Testing the LEDs in series before soldering them together.

Laser cutting the final control box. 

Attempting to fit all of the wiring into the final control box.

Dennis with his final emotional display (photo requested by Dennis).

To our surprise, the circuit build was one of the most challenging parts of the project. In particular, wiring together the individual LEDs for the smiles proved to be more intensive than we had originally thought. We planned four individual circuits (the smile, neutral mouth, frown, and connecting middle section) because we weren’t exactly sure how to light up the individual LEDs, even though they were addressable (a sketch of the single-chain alternative follows this paragraph). Creating the four individual circuits was challenging. The first two attempts at wiring were faulty, so both had to be restarted. Then, once the circuits were soldered together, they had to go into the face display, which presented a new set of problems. The LED series were incredibly finicky to work with. Often, the wrong series would light up or the lights would blink in multiple colors rather than staying the single color assigned to them in code. We figured out that a data-in line had somehow been wired into a ground line group, which caused our initial round of issues. Additionally, we found that some of the pins of the LEDs were touching each other as they sat behind the middle faceplate, causing further disturbances. These problems were never fully solved; the lights continued to be finicky all the way through the presentation.
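As a point of comparison, a single addressable chain could have been driven as one Adafruit_NeoPixel object, with each face segment owning a range of pixel indices. The sketch below is only an illustration of that alternative; the segment counts and data pin (FACE_PIN) are assumptions, not our actual wiring:

#include <Adafruit_NeoPixel.h>

// Hypothetical single-chain face: pixel counts and pin are illustrative.
const int FACE_PIN = 8;
const int N_SMILE = 10, N_FROWN = 6, N_MIDDLE = 7;
Adafruit_NeoPixel face(N_SMILE + N_FROWN + N_MIDDLE, FACE_PIN, NEO_GRB + NEO_KHZ800);

// Light one segment of the chain in a given color and blank the rest.
void showSegment(int start, int count, uint32_t color) {
  face.clear();
  for (int i = start; i < start + count; i++) {
    face.setPixelColor(i, color);
  }
  face.show();
}

void setup() {
  face.begin();
  face.setBrightness(30);
}

void loop() {
  showSegment(0, N_SMILE, face.Color(0, 255, 0));       // smile segment, green
  delay(1000);
  showSegment(N_SMILE, N_FROWN, face.Color(255, 0, 0)); // frown segment, red
  delay(1000);
}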

With respect to the Gantt chart, the setback caused by the problems with the LED series pushed back other plans, throwing off the scheduling of the rest of the chart. The laser cutting, which was intended to finish six days before the final prototype presentation, was only finished two days before. Similarly, the final assembly of the device happened the night before the final presentation rather than the intended two days prior. A large part of our lack of speed as a group was due to unbalanced experience. For many parts of the process we became reliant on Remy, who has far more experience with the physical production of electronics, to confirm details and tasks. This, in turn, interrupted Remy’s own work, creating a self-reinforcing cycle of delays.

Then, in a final blow before the presentation, we could not find any D or AAA batteries to power the device externally. This means that we are still not sure whether the device works purely off of external battery power, or how long it can run on it. The process as a whole had many moments like this, where we really found ourselves at a loss. Yet despite our challenges and the final close call, the final product is one that we are proud of.

 

Conclusions and Lessons Learned

In reflection of our journey through this project, a few things really stood out to us. 

Firstly, our final critique and feedback session had a lot of important thoughts to offer. While we were focused on developing a product that would communicate to everyone around Dennis how he was feeling, our product was limited to giving a bold display to those behind him. While we had a small LED on the control panel that those talking with him could see, there was no larger display for those speaking or interacting with our client. One of our critics summarized the general feedback for improvements quite well when they suggested the system could be improved with “more animation, add a time-out feature, reduce size and use as necklace? Use LED strip instead of separate LEDs, brightness knob.” These features would greatly expand the functionality and usability of our product and would be our main additions for future iterations. Furthermore, a point was made that this device could be useful to any individual with a disability that inhibits their ability to communicate emotions, demonstrating the device’s importance as a product on a grander scale. Beyond this, it seemed that our attention to detail in preserving aesthetics and cleanliness didn’t go unnoticed. The remaining feedback fell into one of the groups above, and the quote contains elements that many of the critics mentioned. The advice to animate the eyes is a simple change that would not only add emotional complexity to the display but would also reduce power usage in our system, since it would require fewer LEDs to be lit at once. A rough sketch of a brightness knob and time-out feature follows.
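To illustrate two of those suggestions, here is a minimal sketch of a brightness knob and an inactivity time-out, assuming a spare potentiometer on A2 and a five-minute timeout; these pin and timing choices are hypothetical, not part of the current build:

#include <Adafruit_NeoPixel.h>

// Hypothetical add-ons: brightness knob plus inactivity time-out.
const int BRIGHT_POT = A2;                            // assumed brightness potentiometer
const int RING_PIN = 3;                               // NeoPixel ring data pin, as in our wiring
const unsigned long TIMEOUT_MS = 5UL * 60UL * 1000UL; // five minutes of inactivity

Adafruit_NeoPixel ring(32, RING_PIN, NEO_GRB + NEO_KHZ800);
unsigned long lastInputTime = 0;  // in the real panel code, reset whenever a control changes

void setup() {
  ring.begin();
}

void loop() {
  // Map the 0-1023 pot reading to a 0-255 brightness value.
  int knob = analogRead(BRIGHT_POT);
  ring.setBrightness(map(knob, 0, 1023, 0, 255));

  // If nothing has been touched for TIMEOUT_MS, blank the display to save power.
  if (millis() - lastInputTime > TIMEOUT_MS) {
    ring.clear();
  } else {
    ring.fill(ring.Color(0, 255, 0));  // placeholder for the current emotion color
  }
  ring.show();
}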

Secondly, we learned a lot about creating bespoke technology, particularly for disabled people. In our initial meetings, we didn’t yet know how to be attentive to unspoken issues. For many disabled people, the core problems in their lives are beyond what a couple of students doing a semester project can solve. Moreover, just as non-disabled people do, they may experience problems without thinking of them as problems, or may be unwilling to talk about them as problems. A person often doesn’t know their own issues, and if they do, voicing them carries a heavy social weight. We all went into the project with the notion that our client would tell us about one of these problems, not considering that Dennis might not have such problems or might not want to talk about them. We assumed that Dennis must have a problem that he would want fixed. Focused on trying to get Dennis to describe a problem in his life, we were barely open to offering potential problems that we saw ourselves in Dennis’ life. We are incredibly grateful to have worked with Dennis on this project and for his patience with us.

In conclusion, although our final product was far from perfect, we are proud of what we made. It was an interesting concept, decently executed given our collective constraints, and, most importantly to us, it was a product that Dennis was excited about. At the end of our final presentation, he asked if we could strap it on for him, and there is no better affirmation than that.

 

Technical

Schematic and Block Diagram

Circuit schematic for system.

 

Block Diagram of the system.

Code

/* 
 *  The code below is used to operate the emotional display board
 *  developed for 60-223: Intro. to Physical Computing. The wiring
 *  inputs/outputs are as shown below.
 *  
 *   Pin Map:
   Pin   |  role  |   Description
   ----------------------------------------
   A0    | input  | Input readings from pot 1. For light color
   A1    | input  | Input readings from pot 2. For speaker volume
   2     | output | Controls the sound sent to the speaker
   3     | output | Writes to the neopixel ring LEDs
   5     | input  | Detects digital signal from emergency switch
   6     | input  | Detects signal to activate emotion display, button 1
   7     | input  | Detect signal to activate question display, button 2
   8     | output | Writes to inner middle mouth LEDs
   9     | output | Writes to outer middle mouth LEDs
   10    | output | Writes to LEDs showing smile
   11    | output | Writes to LEDs showing frown
   12    | output | Writes to LED displaying emotion on control panel
   5V    | output | Power supply for LEDS and buttons
   3.3V  | output | Power for remaining components
   GND   | input  | Ground for all components
 */

#include <Adafruit_NeoPixel.h>

// Define all variables and constants
const int pot1 = A0;
const int pot2 = A1;

const int switch1 = 5;
const int but1 = 6;
const int but2 = 7;
const int neoRing1 = 3;
const int three_leds = 8;
const int four_leds = 9;
const int smile = 10;
const int frown = 11;
const int emots = 12;

const int speaker = 2;

int sensOne;
int sensTwo;

int pressOne;
int pressTwo;
int pressThree;

int level = 0;
int emotion = 0;

unsigned long globTime;
unsigned long beepTime;

bool beepOn = 0;

// Define LED strips
Adafruit_NeoPixel strip = Adafruit_NeoPixel(32, neoRing1, NEO_GRB + NEO_KHZ800);
Adafruit_NeoPixel pixels(3, three_leds, NEO_GRB + NEO_KHZ800);
Adafruit_NeoPixel pixels2(4, four_leds, NEO_GRB + NEO_KHZ800);
Adafruit_NeoPixel pixels3(10, smile, NEO_GRB + NEO_KHZ800);
Adafruit_NeoPixel pixels4(11, frown, NEO_GRB + NEO_KHZ800);
Adafruit_NeoPixel pixels5(1, emots, NEO_GRB + NEO_KHZ800);

void setup() {
  Serial.begin(9600);

  // Initialize LEDs and define pin modes
  strip.begin();
  strip.setBrightness(30); //adjust brightness here
  strip.show(); // Initialize all pixels to 'off'

  pinMode(but1, INPUT_PULLUP);
  pinMode(but2, INPUT_PULLUP);
  pinMode(switch1, INPUT_PULLUP);

  pinMode(pot1, INPUT);

  pixels.begin();
  pixels2.begin();
  pixels3.begin();
  pixels4.begin();
  pixels5.begin();
}

// Below function used to write to the LEDs for question face
void lightQuestion(){
  // Insert code to light question mark
  //colorWipe(strip_mid.Color(255, 0, 255), 50);
  for(int i=0; i<3; i++) { // For each pixel...

      pixels.setPixelColor(i, pixels.Color(0, 255, 255));
      pixels.show();   // Send the updated pixel colors to the hardware.
  }

  for(int i=0; i<4; i++) { // For each pixel...

      pixels2.setPixelColor(i, pixels2.Color(0, 255, 255));
      pixels2.show();   // Send the updated pixel colors to the hardware.
  }

  for(int i=0; i<6; i++) { // For each pixel...

      pixels3.setPixelColor(i, pixels3.Color(0, 0, 0));
      pixels3.show();   // Send the updated pixel colors to the hardware.
  }

  for(int i=0; i<6; i++) { // For each pixel...

      pixels4.setPixelColor(i, pixels4.Color(0, 0, 0));
      pixels4.show();   // Send the updated pixel colors to the hardware.
  }
  
  colorWipe(strip.Color(255, 0, 255), 50); // Purple
}

void colorWipe(uint32_t c, uint8_t wait) {
  for(uint16_t i=0; i<strip.numPixels(); i++) {
      strip.setPixelColor(i, c);
      strip.show();
      //delay(wait);
  }
}

// Below function used to write to the LEDs for emergency face
void lightExclam(){
  // Insert code to light exclamation mark
  //colorWipe(strip_mid.Color(255, 0, 0), 50);
  for(int i=0; i<3; i++) { // For each pixel...

      pixels.setPixelColor(i, pixels.Color(0, 255, 0));
      pixels.show();   // Send the updated pixel colors to the hardware.
  }

  for(int i=0; i<4; i++) { // For each pixel...

      pixels2.setPixelColor(i, pixels2.Color(0, 0, 0));
      pixels2.show();   // Send the updated pixel colors to the hardware.
  }

  for(int i=0; i<6; i++) { // For each pixel...

      pixels3.setPixelColor(i, pixels3.Color(0, 0, 0));
      pixels3.show();   // Send the updated pixel colors to the hardware.
  }

  for(int i=0; i<6; i++) { // For each pixel...

      pixels4.setPixelColor(i, pixels4.Color(0, 255, 0));
      pixels4.show();   // Send the updated pixel colors to the hardware.
  }
  
  colorWipe(strip.Color(255, 0, 0), 50); // Red
}

// Initialize speaker and send sound signal
void initSound(){
  globTime = millis();

  if(abs(globTime - beepTime) >= 3000 && !beepOn){
    tone(speaker, 2000, 600);
    beepTime = globTime;
    beepOn = 1;
  }

  if(abs(globTime - beepTime) >= 1000 && beepOn){
    noTone(speaker);
    beepTime = globTime;
    beepOn = 0;
  }
}

// Level 0: emotional control lights
// Level 1: Question mark light
// Level 2: Exclamation mark and sound control
void loop() {
  if(level == 0){
    sensOne = analogRead(pot1);

    Serial.println(sensOne);

    emotion = map(sensOne, 0, 700, 0, 5);
    emotion = round(emotion);

    //Serial.println(emotion);
    
    lightEmotion();
    
    switchControls();
  }

  if(level == 1){
    lightQuestion();
    switchControls();
  }

  if(level == 2){
    lightExclam();
    // Might not need, just use potent as variable resistor to change volume instead of
    // doing in software.
    //sensTwo = analogRead(pot2);
    switchControls();
    initSound();
  }
}

// Below function used to read any changes in current mode of the system
void switchControls() {
  // insert detection for button presses to switch controls
  pressOne = digitalRead(but1);
  pressTwo = digitalRead(but2);
  pressThree = digitalRead(switch1);

  //Serial.println(level);

  if(!pressThree){
    level = 2;
  }

  if(!pressTwo && pressThree){
    level = 1;
  }

  if(!pressOne && pressThree){
    level = 0;
  }

  if(pressOne && pressTwo && pressThree && level == 2) {
    level = 0;
  }
}

// Below function used to write to the LEDs depending on current emotion
void lightEmotion() {
  if(emotion == 0){
    // light red angry
    for(int i=0; i<3; i++) { // For each pixel...

      pixels.setPixelColor(i, pixels.Color(0, 255, 0));
      pixels.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<4; i++) { // For each pixel...

      pixels2.setPixelColor(i, pixels2.Color(0, 0, 0));
      pixels2.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels3.setPixelColor(i, pixels3.Color(0, 0, 0));
      pixels3.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels4.setPixelColor(i, pixels4.Color(0, 255, 0));
      pixels4.show();   // Send the updated pixel colors to the hardware.
    }
  
    colorWipe(strip.Color(255, 0, 0), 50); // Red
    pixels5.setPixelColor(0, pixels5.Color(0, 255, 0));
    pixels5.show();
  }

  if(emotion == 1){
    // light orange unhappy
    //colorWipe(strip_mid.Color(255, 55, 0), 50);
    for(int i=0; i<3; i++) { // For each pixel...

      pixels.setPixelColor(i, pixels.Color(55, 255, 0));
      pixels.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<4; i++) { // For each pixel...

      pixels2.setPixelColor(i, pixels2.Color(0, 0, 0));
      pixels2.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels3.setPixelColor(i, pixels3.Color(0, 0, 0));
      pixels3.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels4.setPixelColor(i, pixels4.Color(55, 255, 0));
      pixels4.show();   // Send the updated pixel colors to the hardware.
    }
    
    colorWipe(strip.Color(255, 55, 0), 50); // Orange
    pixels5.setPixelColor(0, pixels5.Color(55, 255, 0));
    pixels5.show();
  }

  if(emotion == 2){
    // light yellow neutral
    //colorWipe(strip_mid.Color(255, 255, 0), 50);
    for(int i=0; i<3; i++) { // For each pixel...

      pixels.setPixelColor(i, pixels.Color(255, 255, 0));
      pixels.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<4; i++) { // For each pixel...

      pixels2.setPixelColor(i, pixels2.Color(255, 255, 0));
      pixels2.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels3.setPixelColor(i, pixels3.Color(0, 0, 0));
      pixels3.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels4.setPixelColor(i, pixels4.Color(0, 0, 0));
      pixels4.show();   // Send the updated pixel colors to the hardware.
    }
    
    colorWipe(strip.Color(255, 255, 0), 50); // Yellow
    pixels5.setPixelColor(0, pixels5.Color(255, 255, 0));
    pixels5.show();
  }

  if(emotion == 3){
    // light yellow-green happy
    //colorWipe(strip_mid.Color(100, 255, 0), 50);
    for(int i=0; i<3; i++) { // For each pixel...

      pixels.setPixelColor(i, pixels.Color(255, 100, 0));
      pixels.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<4; i++) { // For each pixel...

      pixels2.setPixelColor(i, pixels2.Color(0, 0, 0));
      pixels2.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels3.setPixelColor(i, pixels3.Color(255, 100, 0));
      pixels3.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels4.setPixelColor(i, pixels4.Color(0, 0, 0));
      pixels4.show();   // Send the updated pixel colors to the hardware.
    }
    
    colorWipe(strip.Color(100, 255, 0), 50); // Yellow-Green
    pixels5.setPixelColor(0, pixels5.Color(255, 100, 0));
    pixels5.show();
  }

  if(emotion == 4){
    // light green very happy
    //colorWipe(strip_mid.Color(0, 255, 0), 50);
    for(int i=0; i<3; i++) { // For each pixel...

      pixels.setPixelColor(i, pixels.Color(255, 0, 0));
      pixels.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<4; i++) { // For each pixel...

      pixels2.setPixelColor(i, pixels2.Color(0, 0, 0));
      pixels2.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels3.setPixelColor(i, pixels3.Color(255, 0, 0));
      pixels3.show();   // Send the updated pixel colors to the hardware.
    }

    for(int i=0; i<6; i++) { // For each pixel...

      pixels4.setPixelColor(i, pixels4.Color(0, 0, 0));
      pixels4.show();   // Send the updated pixel colors to the hardware.
    }
    
    colorWipe(strip.Color(0, 255, 0), 50); // Green
    pixels5.setPixelColor(0, pixels5.Color(255, 0, 0));
    pixels5.show();
  }
}
]]>
Team Fornax: Interview with Jeff https://courses.ideate.cmu.edu/60-223/f2022/work/team-fornax-interview-with-jeff/ Tue, 08 Nov 2022 13:26:23 +0000 https://courses.ideate.cmu.edu/60-223/f2022/work/?p=16757 Introduction:

This interview was conducted by team Fornax, which includes Ethan, Frances, and Gia. We interviewed Jeff on Tuesday, November 1st, at 6 pm. Our team went into the meeting with the intention of learning about the daily struggles caused by Jeff’s disability and what he would like to add to his life. We also wanted to let Jeff know our intentions and limitations, and make sure he didn’t have any questions. Our hope was that after the meeting we would have a few ideas about what kind of assistive gadget we could create to make Jeff’s life better or more convenient.

Agenda:

Before the interview we wrote out a list of questions and follow-up questions that we planned to run through. We didn’t plan to follow it as an exact script going straight down the list; it was mostly to make sure we had enough questions for a solid interview. In the end, I think we went through almost every question.

We also wrote up an outline for an introduction to make things clear with Jeff so he knows where we are coming from. It was mostly based on the outline Zach gave us.

Here were the list of questions:

Say: our names – our backgrounds

Ask client to introduce themselves: name, interest in participating in this

Say project goals:

  • Trying to build prototype useful devices (making a convenience machine/ gadget)
  • Engaging in an iterative design process, including gathering formative feedback around the midpoint of the process
  • Taking about seven weeks to go from this meeting to a reasonably high-fidelity final product (explain timeline)

We are not

  • Professional technologists who are experienced in making polished products
  • Planning to build something that will be sold commercially
  • Constrained by any practicality outside of usefulness to the person we’re designing for
  • Likely to invent a totally novel piece of electronics (say limits of Arduino)

Ask client if they have questions about process

Interview Questions (follow up questions in bullet points)

Are there any daily life activities that are frustrating or difficult? You can list them or go into detail about a few.

  • Can you group your problems into a couple big categories
  • Can you demonstrate it?
  • What makes it difficult?
  • Is there a product you think that can help mediate this?

Is there something you used to enjoy doing that has become harder over time?

    • Are there any creative ideas you have to help with this issue?
    • Is there a way to make it somewhat easier again
    • What would encourage you to do this activity again?

Could you narrate or draw your daily life?

      • What’s the typical routine?
      • Common emotions throughout the day?
      • What do you look forward to?
      • What do you not look forward to?

What do you care about?

  • Why is it important to you?
  • How would you want to implement it more into your life?
  • What ways can you think of that would bring this into your life more?

What do you need that maybe you don’t always get?

  • What hinders you from this

How would you improve your resources and tools now?

  • Ways to add on or subtract?
  • Is a completely new device needed? Or add on to what you have now?

Tell us some story about inconveniences that come up in your life?

Meeting Summary and Major Takeaways:

Before we could design a unique device for Jeff, it was important for us to understand Jeff’s hobbies and interests. The meeting started with us asking Jeff about his daily routine. Jeff highlighted that his favorite parts of the day are playing video games with his best friend, recording and watching YouTube, and sitting on his back porch. Jeff informed us that he decided to participate in this project because he “likes the invention of new technology.” After getting to know Jeff a bit more, we began to explore possible avenues in which a device could assist him in his daily life.

At an earlier point, Jeff mentioned that he works as a receptionist at CLASS a couple of days a week. We asked if there were any little annoyances he experiences due to being in a wheelchair. Jeff told us his only pet peeve is when the receptionist working the previous shift doesn’t clean up after themselves. We decided as a group that this was an avenue we didn’t want to explore further. Naturally, we then talked more about Jeff’s life at home. We learned that Jeff has a two-year-old nephew who occasionally leaves his toys out on the floor. When we asked Jeff how he deals with this situation, he told us that he always gets someone else to clean the toys up. This struck us as a possible avenue to explore. We also learned that Jeff sits on his back porch because his front porch is at a slight angle. Jeff emphasized that he is always nervous about being on the front porch without someone behind him, and said he would feel more comfortable if he knew he would be okay by himself.

Quick sketch of wedges elevating a wheelchair

Towards the end of the hour, we talked to Jeff about his YouTube channel. We learned that Jeff likes to make reviews of music albums. We asked Jeff how he records his videos and discovered that his setup is very minimalist: he simply records in one take using only his computer.

 

Quick sketch of the Toy Sweeper

Thinking of ways to create the toy sweeper

Post Interview Thoughts: 

Our interview did not quite go how we expected. Jeff was quite satisfied with how his life functions, and it was a bit difficult for us to come up with ideas for how to improve his experience. It was hard to get the conversation started, but slowly and surely we discovered more and more about Jeff’s life. We definitely did not stick to the agenda we had come up with, and we never got to a lot of the follow-up questions.

I think part of the reason our interview was not quite as successful as we wanted was that we came into it assuming Jeff would have suggestions for what he thought might be a good gadget. Though he was very receptive to the ideas we suggested, it was difficult to get him to complain about his own life.

In hindsight, our team wishes we had asked a little more about the details of the wheelchair itself, as a lot of our ideas were built on improving the wheelchair. We could have asked about the dimensions or the braking system, which might have been good jumping-off points for possible ideas. The ideas we came up with were quite mechanical, aimed at the parts of Jeff’s life we thought we could ease even further, but they were not necessarily the most achievable or reasonable ideas.

That being said, I think our interview served its purpose. We got to learn more about the life of Jeff, and we came up with several possible project ideas that we can build on. 

]]>
Team Andromeda: Meeting with Teri Owens https://courses.ideate.cmu.edu/60-223/f2022/work/team-andromeda-meeting-with-teri-owens/ Tue, 08 Nov 2022 00:40:08 +0000 https://courses.ideate.cmu.edu/60-223/f2022/work/?p=16754 Introduction

We are Team Andromeda, made up of members Francesca Menendez, Ethan Hu, and Sharon Li. For our final project, we are working with Teri, a person with cerebral palsy who communicates through a digital tablet with a voice assistant, to make her a device that can assist her in some of her everyday activities. Our interview with Teri was held on November 1st, 2022, at 9:30 AM at the CLASS (Community Living And Support Services) center in Pittsburgh, PA. We loved getting to talk to Teri as she was incredibly helpful in the interview process and had lots of ideas of devices that could help her, which made the interview process very interactive.

Teri’s Daily Activities and Beginning to Form Ideas

More Ideation

Meeting Agenda

  • Introductions
    • Go around to give a brief introduction of each group member and give some insight as to who we are in terms of our name, major, interest in the project, and additional information we would like to share about ourselves, such as our past projects in this class.
    • Have our client, Teri, give a similar introduction about herself, including insight into her motivations for volunteering for her project. 
  • Explanation and Clarifications 
    • Share the course website to relay information about the time frame of our course project. 
      • An important date to mention is November 17th, 2022, when our prototype is due; Teri is invited to come for the critique, where we can receive direct feedback from her.
    • Give a brief overview of the project and explain clearly the objectives of this project and clarify what we are trying to achieve and what we are not trying to do as outlined in the course website notes.
      • We are:
        • Trying to build prototype useful devices
        • Engaging in an iterative design process, including gathering formative feedback around the midpoint of the process
        • Taking about seven weeks to go from this meeting to a reasonably high-fidelity final product
        • Documenting our process
      • We are not:
        • Professional technologists who are experienced in making polished products
        • Planning to build something that will be sold commercially
        • Constrained by any practicality outside of usefulness to the person we’re designing for
        • Likely to invent a totally novel piece of electronics (we combine many existing available components in new ways, but don’t make components)
    • Ask if there are any questions or clarifications needed to be made about this project throughout this portion of this meeting. 
  • Understanding Needs
    • Plan to keep this part more flexible, see how it goes depending on the conversation, and ask guiding questions based on this. The goal is trying to devise possible technological interventions that could aid specific problems or be an addition to her life. 
    • Asking various questions to Teri that could lead to potential discussions about areas of her life such as:
      • See if there are any ideas she has thought of beforehand.
      • Sharing her daily routine. Is there anything that is frustrating for you that you would like to have some help with? 
      • How have you currently worked around some of your problems?
      • Can you demonstrate any of the actions for us or share photos of assistive devices that help you already?
      • Something that you used to enjoy doing that has become harder to do? What are some of your hobbies?
    • Share some possible ideas we could do and mock-up solutions. 
    • Discuss and establish some categories of “problems” or areas we could help in and narrow the conversation towards that.
  • Conclusion
    • Thank Teri, for her time, for allowing us to talk to her more in-depth about this project, and for inviting us to CLASS. 
    • Ensuring we have each other’s contact information. 
    • Take pictures of anything necessary or relevant to our project and a group photo.
    • Reiterating the overall project schedule, specifically where we will be making substantial progress, such as our project ideation and prototype critique.

Meeting Summary and Major Takeaways

Through our meeting with Teri, we learned a lot about her daily activities and some of the hobbies she dabbles in, which helped us steer our ideas toward something that might add to one of these processes. From the get-go, she was very helpful in describing some things that could be improved in her life; we got the impression that she would like a device that allows her to do more things by herself. Because there are some things she currently cannot do on her own, she brought up ideas that might help her become more independent. A device that would let her strap into or take off her foot straps whenever she pleases was a great idea she contributed.

Teri’s current foot strap, which is quite loose

Another thing she explained to us was that she struggles with drinking. As part of her cerebral palsy, she cannot suck using a straw or hold a cup up to her mouth. This brought us to another idea, using a robotic arm to allow her to be able to drink a cup of water by herself. However, with the project’s time frame, making sure a robotic arm wouldn’t spill water all over her in the process would be difficult, so we turned our attention to her hobbies to see if there was anything we could do to help her.

George and Teri demonstrating how the tablet is attached and can be unattached from her wheelchair.

After her father, who was present for some of the interviews, mentioned her interest in music, she told us about her weekly music rehearsals using accessible musical instruments with others. We learned that one of the main instruments she uses is simple – a tambourine attached to her leg that she can shake to make sounds. This brought us to our idea of making her an accessible musical instrument that would make music based on the position of her feet on her chair. Our meeting with Teri opened up a lot of pathways for what we could possibly do for this project to assist her in some way, and the ideas we came up with together look to help her become more independent and make her daily activities more enjoyable.
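One rough sketch of how such an instrument could work, purely as an illustration: a few force-sensitive resistors under the footrest, each triggering a different note on a piezo speaker. The pins, threshold, and note table below are all assumptions rather than a settled design:

// Hypothetical foot-position instrument: FSRs under the footrest play notes.
const int FSR_PINS[3] = {A0, A1, A2};    // one force sensor per foot position (assumed)
const int NOTES[3]    = {262, 330, 392}; // C4, E4, G4 in Hz
const int SPEAKER_PIN = 9;
const int PRESS_THRESHOLD = 300;         // analog reading that counts as a press

void setup() {
  pinMode(SPEAKER_PIN, OUTPUT);
}

void loop() {
  bool playing = false;
  for (int i = 0; i < 3; i++) {
    if (analogRead(FSR_PINS[i]) > PRESS_THRESHOLD) {
      tone(SPEAKER_PIN, NOTES[i]);  // play the note for this foot position
      playing = true;
      break;                        // one note at a time on a single piezo
    }
  }
  if (!playing) {
    noTone(SPEAKER_PIN);            // silence when no position is pressed
  }
  delay(20);
}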

Sketches Developed After Our Interview

Thoughts After Holding the Meeting and Discussing

Our meeting with Teri was awesome; she was eager to talk and interact with us and had already prepared some ideas beforehand. Since we had a relatively loose agenda, everything went according to plan, if not better. Going into the meeting, we were concerned that we would have trouble finding aspects of her life for which we could build a gadget that would somehow improve it.

However, Teri was very interested in our project and offered us several ideas that we can expand on. It was relatively easy to go from topic to topic and ask follow-up questions that could segue into areas of her life we could help with, while learning more about Teri and her likes and dislikes. It was super nice to get to know her more and to figure out what she would enjoy in terms of the gadgets we could make for her. Generally, there is nothing we would want to do differently next time in terms of the questions we asked, nor are there questions we wish we had asked. One thing we do wish we had done was take physical measurements of parts of her wheelchair beforehand, since all of our designs connect to her wheelchair in some way, and it would have helped us later on. Overall, we all agree that our meeting with Teri went very well. We came away with a lot of great ideas to work with and a better understanding of Teri’s lifestyle with her disability.

Group Picture!

]]>
Team Vela https://courses.ideate.cmu.edu/60-223/f2022/work/team-vela/ Mon, 07 Nov 2022 02:27:50 +0000 https://courses.ideate.cmu.edu/60-223/f2022/work/?p=16720

Introduction

Our meeting with Dennis sought to explore some difficulties that Dennis has in his life, difficulties that we might be able to help overcome with the creation of a bespoke device. Dennis is a 67-year-old disabled man facing the challenges of both disability and old age, including being wheelchair bound. The information that we gathered through this needfinding interview will hopefully guide our device to effectively assist Dennis in his daily routine, driven solely by his wants and needs. At this meeting were Dennis; Bill, one of his caretakers; and Elise, Harry, and Remy. The meeting was held at the CLASS facilities in Pittsburgh on Tuesday, November 1.

 

Agenda

Prior to this meeting, our group decided on the following subjects to cover during the interview: we wanted to first learn about Dennis (i.e., age, disability, interests, passions), then transition to introducing our project and what we hoped to accomplish, and finally brainstorm ideas together for improving his quality of life. We decided to stay away from a stricter agenda for the meeting and instead focus on the quality of the responses.

 

Meeting Summary and Takeaways

Notes from the interview

Our meeting with Dennis went fairly well. We began by asking Dennis about his interests, which notably included: going out to eat, watching sports games like the Pittsburgh Pirates and the Steelers, playing “Candy Crush” (which we later found out was Angry Birds), soda-pop, and art. This discussion naturally led into what gives Dennis trouble in doing these activities and enjoying these things. Dennis, besides being wheelchair bound, suffers from a lot of shakiness in his hands; Bill believes this to be symptoms of arthritis. For example, for Dennis to do art, he must use a “hand-on-hand” method, where he will use another person’s hand as stability on the tool, but will guide their hand with his. This shakiness also manifests as pain when Dennis tries to grip things or press small buttons, making things like using a remote or a video game controller difficult. 

Dennis struggling with computer mouse

 

Moving forward, we asked Dennis how his typical day goes. Dennis likes to try to make his bed, which he finds to be difficult. Dennis will also do “school work” and work at the “pop machine”. He very much enjoys soda-pop and wishes he could pour it himself. Finally, at meal times, Dennis tends to be a bit impatient with his caretakers and likes to eat his food quickly, but is limited in how much he can help around the kitchen. 

Overall our meeting went as well as possible despite communication issues with Dennis and we were able to identify various struggles and challenges that he faces on a daily basis. Bill, his caretaker, also gave us good insight on Dennis’ habits and his overall behaviors. It was apparent that he was well adapted to his limitations and has a lot of solutions to the challenges he faces. 

 

Post Interview Thoughts and Discussion

This interview left us, as a team, a bit unsure of how to approach the few difficulties that Dennis presented to us. The conversation involved a lot of backtracking and repetition of certain topics; however, we gained real insight into what matters to Dennis and what his passions are. While not as smooth or concrete as we were hoping, the meeting told us a lot about who Dennis is as a person (an independent, video-gaming, soda-loving human being). That said, the challenges in Dennis’ life that we were able to identify are difficult to apply to a physical computing project. One of his main issues is that his hands are quite shaky, and gripping objects can be difficult or painful due to aging. These issues are inherently mechanical, leaving us a bit perplexed as to what project to create for him. We believe the discussion was overall a success, but we wished we had come away with more concrete struggles to design around.

]]>
Team Lacerta: Meeting with Bill https://courses.ideate.cmu.edu/60-223/f2022/work/color-sensor/ Thu, 03 Nov 2022 14:21:50 +0000 https://courses.ideate.cmu.edu/60-223/f2022/work/?p=16716 Introductions

Our goal for this project is to design a useful device for someone with a disability. The first step would be to interview our client so we can have a good idea of what challenges he faces and what things he would find useful for his everyday life. On Tuesday, November 1st, our team (Sarah, Freda, Jonathan) met with Bill, who has achromatopsia, so he is legally blind and 80% color blind.

Agenda

Intro:

  • Name
  • Where you are from
  • Something that reflects personal values
  • Interests
  • How long have you been in PA
  • Talk a little about our past projects (project 2)
  • What drew you to participate in this project? How did you hear about this project?
  • Is it ok to record this?

Clarify scope and expectations:

  • We are not professional developers
  • Not making something to be commercially sold
  • Our goal is to make a gadget that is helpful for you
  • Go over our timeline
  • Ask if there are any questions/confusion

Questions:

  • What do you struggle with in everyday life
  • Is there anything you wish was easier to do that you have challenges with
  • Was there anything that you haven’t been able to do that we could potentially create something for
  • Do you have challenges with running (His sport)
  • Ask follow up questions as we go on his responses
  • Write down ideas 

Conclude:

  • Thank him for his time

 

Summary and Major Takeaways

Notes/Ideation during meeting

Due to Bill’s achromatopsia, he has a lot of trouble distinguishing between colors, especially blue and white, red and black, and green in general. This causes some challenges when navigating websites, since some elements such as embedded links will be blue on white. The first time he uses a website, he clicks on all the links he can find by watching for the mouse cursor to change into a hand symbol, and memorizes where on the website each one is and where it leads. Of course, this method may miss some links if the clickable area is small and the cursor doesn’t change. Therefore, we thought it would be useful for Bill if we could make a gadget that checks for the presence of embedded links on a website and locates where they are.

Link Detector (ideation after meeting)

For the same reason, he finds it hard to pick matching clothes in the morning. He mainly guesses the color of the clothing from his past experiences and from what colors are more likely to occur in clothing, but sometimes he ends up with mismatched colors, which he finds embarrassing. Therefore, we wanted to address these two points of struggle by creating a gadget that detects and prints out the color of clothing using a sensor, and the colors on a website using a computer app.

Color Reader (ideation after meeting)

Lastly, when Bill cooks on his own, he cannot tell when meat is fully cooked, so he usually ends up burning the food. He tried to use a thermometer but found it to be inconsistent. Using a timer wasn’t too helpful either because timing depended on the type and shape of ingredients and the strength of the heat.

Notes/Ideation during meeting

We found this one challenging because Bill has already tried to solve the problem himself and those attempts did not help him at all. So we ended up with a less feasible idea: using machine learning and training a model to identify whether the food is done or not.

Helpful Cook (ideation after meeting)

 

Reflection

Zoom meeting with Bill (audio connection only)

Overall, we thought the meeting went well. Bill was a very nice and engaged guy. He was open to any and all ideas we had, and from what we understand, he mainly wanted to see what this process is like so he can recommend other people for this class in future years. His main goal in participating was to spread disability awareness and, more than anything else, to help the younger generation see that people with disabilities are no different from everybody else.

Our first challenge was that we originally planned to do drawings over Zoom, but since Bill called in from a phone instead of a computer, he wasn’t able to see the screen, so we couldn’t show him any drawings or images. If we could change anything about this meeting (and future meetings), we would have liked it to be in person so we could draw together and get more immediate feedback, rather than working only from what he says and describes. However, at the time Zoom was the best option since Bill was also helping other people with their meetings.

Our second challenge was that Bill is very independent in his everyday life. He lives by himself, works helping others with disabilities, cooks for himself, and so on. This made it difficult to find things he struggles with day to day. He was born with this condition, so he has long since learned which issues he can work around. After our meeting, the only ideas we could come up with all related to his difficulty seeing colors. However, we think the ideation has enough variation in the kinds of color-related issues it addresses. Overall, we got to know a lot more about Bill through our meeting, and hopefully he will find the gadget we create to be helpful.

]]>
“Self-Watering” System https://courses.ideate.cmu.edu/60-223/f2022/work/self-watering-system/ Mon, 31 Oct 2022 06:24:33 +0000 https://courses.ideate.cmu.edu/60-223/f2022/work/?p=16686

 

“Self-Watering” System

Elise Chapman

 

Simple Description

This system rewards the user for "watering" themselves (pouring a glass of water) with a sweet, printed poem. This encourages the user to care for themselves both mentally and physically.

 

Overview Images

Printing and taking a poem

Overview photo

Tech insert details

Where to plug in external power for printer

Output of printed “poem”

 

Process Images and Review

Initial drawings of concepts for project 2

Beginning my project, I knew I wanted to do something a little more abstract, something that was mostly a vehicle for me to design an interesting object. My goal was a project that would be more challenging in its physical build and hardware and less challenging in its software, to reflect my personal abilities. After some discussion with Zach, I decided to go with the poem printer. I thought that it would be something different, but still fun! The only thing I changed from my initial concept was to do the poem selection based on the weight class of a glass of water. This was a way for me to include another physical input form that I hadn’t tried before. Buttons are fine, but I wanted to challenge myself a little more.
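For context, the weight-class idea boils down to: read the scale once, figure out which weight bracket the glass falls into, and pick a random poem index from the range assigned to that bracket. Below is a stripped-down sketch of just that selection step; the bracket boundaries and index ranges mirror the ones in my final code further down, and the pickPoemIndex() helper is only there for illustration.

//weight-class poem selection sketch: the first bracket whose upper bound the
//reading fits under decides which slice of the poem list to draw from
const int NUM_BUCKETS = 9;
const float upperBound[NUM_BUCKETS] = {0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1e9};
const int firstPoem[NUM_BUCKETS]    = {0, 1, 3, 6, 8, 10, 13, 17, 20};
const int lastPoem[NUM_BUCKETS]     = {0, 2, 5, 7, 9, 12, 16, 19, 21};

int pickPoemIndex(float pounds) {
  for (int i = 0; i < NUM_BUCKETS; i++) {
    if (pounds <= upperBound[i]) {
      return random(firstPoem[i], lastPoem[i] + 1); //random()'s upper limit is exclusive
    }
  }
  return 0; //not reachable, but keeps the compiler happy
}

void setup() {
  Serial.begin(9600);
  randomSeed(analogRead(A1)); //unconnected analog pin as a rough random seed
  Serial.println(pickPoemIndex(0.45)); //for example, prints an index between 3 and 5
}

void loop() {}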

Creation thoughts and sketches, as well as some measurements for laser cutting

On the design concept side, I found a lot of my inspiration just by looking through the scrap pile in the IDeATe laser cutting room. There was this great mirrored acrylic scrap with enough material to work with, so I grabbed it! I had the idea to make the mirrored material look like a puddle, to reflect (no pun intended) the glass of water that was going to sit on it as a part of the build. I think this style of design turned out well, and the final build feels polished, largely thanks to the mirrored acrylic.

Freshly laser-cut mirrored acrylic, before I inserted the printer, speaker, and switch, and put it on its base

Two major points of my project were getting the load cell working and getting the printer to print poems out of an array.


Beginning of working with the load cell

Firstly, getting the load cell to work within my final model was quite tricky. Getting it to work in general was fine, but that was in the context of standard tables in the lab, aided with clamps. That came back to bite me later as I was trying to mount the load cell in my final build. I assumed that I would be able to duct tape the load cell to the lid of my build and thereby create enough tension for the load cell to function. In practice, however, that was not the case. I duct taped it in, but it would not calibrate correctly or consistently.

Some of the strange outputs I was getting while trying to calibrate a non-clamped load cell

I then realized that I needed to provide more support to one side of the load cell. First I tried to add vertical supports underneath by cutting and propping up some of the eighth-inch laser cutting wood, but that didn’t really work. I decided that what I really needed was to recreate the clamped condition within my build, so I hauled myself to the design product studio to find blocks of wood and screws. I created a wooden base, screwed it into the base of the build, and then screwed the load cell into that. This, finally, was enough to recreate the support of a clamp on the load cell bar.

My triumph of getting the load cell to work
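For anyone repeating the load cell wrangling: the calibration flow itself is the standard one from the SparkFun HX711 example that my final code credits. You tare the cell with nothing on it, then nudge a calibration factor until a known weight reads correctly. A minimal sketch of that routine, assuming the SparkFun HX711 library and the same pins as my build (the factor shown is only a starting guess):

#include "HX711.h"

const int LOADCELL_DOUT_PIN = 2;
const int LOADCELL_SCK_PIN = 3;
float calibration_factor = -7050.0; //starting guess; adjust until a known weight reads right

HX711 scale;

void setup() {
  Serial.begin(9600);
  scale.begin(LOADCELL_DOUT_PIN, LOADCELL_SCK_PIN);
  scale.set_scale();
  scale.tare(); //zero the scale with nothing on it
}

void loop() {
  scale.set_scale(calibration_factor);
  Serial.print("Reading: ");
  Serial.print(scale.get_units(), 1); //scaled reading, in lbs for my setup
  Serial.println(" lbs");
  delay(500);
}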

Then came my second challenge, which I still have not overcome: printing poems out of an array. Originally, the problem was that I had very little paper to test on, so I did most of my initial programming with Serial printing in mind. However, that part of the problem was resolved when the printer paper came in.

The more difficult and ongoing issue with printing the poems is memory. Unfortunately for me, the Arduino has very little memory, so even something as small as a haiku, multiplied across a whole array of poems, takes up far more than that little board has. Therefore, when trying to print poems out of an array, I simply can’t. For reasons I don’t know, even trying to print single words out of an array posed an issue for my system; the printer can print cohesively on its own, just not from my arrays. It’s rather frustrating because printing the poem is the crux of my device; without it, it seems to do nothing at all besides make some noise. Sometimes my printer will print seemingly random characters, ones that aren’t even a part of the poems, meaning that it’s reading something out of the arrays, I’m just not sure what. All in all, it’s something I am still working to fix.

Some code from my most recent help session with Zach

To try and remedy this, I have met with Zach twice now, to very little success. We added a microSD card in hopes that having the Arduino pull the poems from text files would bypass the memory issue. So far, we have been able to pull information out of the text files on the microSD card, but transferring that information to the printer still seems to be an issue. Where we last left off, the printer could print parts of my poems, but not an entire poem. Once again, it’s following some form of logic that I personally can’t follow. Although this project is over for the class, I plan to continue working on it. I really want to see it through. Despite how frustrated it makes me, I still think the concept is interesting, and I want to have a piece that I can show the world.
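For anyone attempting the same thing without the SD card: the usual way around the RAM limit on an Arduino is to keep each poem in flash with PROGMEM and stream it to the printer byte by byte, rather than holding everything in RAM. (Part of my trouble is likely that a fixed-width array like poems[][100] gives every poem the same 100-character slot, which the longer ones blow right past.) Here is a rough sketch of that pattern with two placeholder poems; it reuses the same SoftwareSerial and Adafruit_Thermal setup as my code below, and I have not yet gotten this folded into the full build.

#include <avr/pgmspace.h>
#include "Adafruit_Thermal.h"
#include "SoftwareSerial.h"

SoftwareSerial mySerial(5, 6); //RX, TX, matching my printer wiring
Adafruit_Thermal printer(&mySerial);

//each poem lives in flash as its own string...
const char poem0[] PROGMEM = "Pour yourself something! Hydrate yourself :)";
const char poem1[] PROGMEM = "An old silent pond\nA frog jumps into the pond-\nSplash! Silence again.";

//...and so does the table of pointers to them
const char* const poemTable[] PROGMEM = {poem0, poem1};
const int NUM_POEMS = 2;

void printPoem(int i) {
  //look up the poem's flash address, then stream it one character at a time
  //so no large RAM buffer is ever needed
  const char* p = (const char*)pgm_read_word(&(poemTable[i]));
  char c;
  while ((c = pgm_read_byte(p++)) != 0) {
    printer.write(c);
  }
  printer.println();
}

void setup() {
  randomSeed(analogRead(A1)); //unconnected analog pin as a rough random seed
  mySerial.begin(19200);
  printer.begin();
  printPoem(random(NUM_POEMS));
}

void loop() {}

Streaming byte by byte keeps the RAM cost close to zero, which is exactly the constraint the full poem set keeps running into.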

 

Discussion

This has been quite a difficult project for me. What proved the biggest challenge was the set of things I did not expect to be a challenge at all, doubly so because they seemed to appear last-minute but were not easy to solve at all.

Some things, of course, did go easily, like the design. I’m happy with how the design turned out visually. It looks clean but still interesting, and I think people responded to the mirrored acrylic in the way I hoped they would. I wanted to create something playful but still sophisticated. For example, in the in-class crit, I got positive feedback including “very aesthetically pleasing”, “the design was well executed”, and “the laser cut mirror acrylic panels on the top were really effective”. My design was the biggest thing I felt confident about, and it was nice to see that reflected. Also important to me, people in the in-class crit seemed to understand my intent with the project, despite its flaws. My favorite comment was: “I love the meaning and thought behind the project. I also love that the poem is printed and feels more meaningful than just seeing it printed on the screen”. I try to incorporate a materiality of being into my work, making things where you walk away with something, tangible or not. As I often describe it, I like making my work something to be existed with, not just tangentially to. It made me feel proud that others can see my work in the same way that I do. As a designer, I often worry about communicating things the way I intend to, so especially on abstract projects like this one, I’m never sure my message is getting through. But it really seems like it is, and I’m glad for it.

But on this project’s flaws: it has one main large one. The poems will not print properly. Like I mentioned before, this is incredibly frustrating because the poem printing is the crux of the machine; it’s sort of its point. I didn’t even expect the poem printing to be an issue. I didn’t know about the Arduino’s limited memory going into the project, so I simply assumed that pulling strings out of an array would be easy. I’ve coded with arrays before, so it felt like familiar territory. In fact, I assumed that the harder part would be getting the printer printing in general. Not so much. It really is something that I’m still working toward fixing, and I do want it fixed eventually. I don’t think I’ll really step back from this project until it feels like a total dead end.

Overall, I am so-so on this project. On one hand, I am happy with the visual appearance of the design. It’s self-contained and pretty; it’s something I wouldn’t mind having on my desk or on my nightstand, which was the goal. It satisfied my desire to try to design an object totally independently of a class. It’s a good accomplishment for me. On the other hand, I would be lying if I said it works. The poems will not print properly. It’s something I’m really disappointed about. I’m frustrated with my inability to get it working, but I’m reminding myself that I can’t even get it working with help from Zach, who is far more experienced with Arduino and code than I am. I stand by the concept, but if I could go back and tell four-weeks-younger Elise to just do audio of the poems being read or something, I would.

 

Block Diagrams

Without the microSD card, presented on the final presentation day

With the microSD card

Schematic Diagram

 

Code (As it appeared on presentation day)

/*elise chapman
self-watering project
weight of a glass of water to printed poetry + a little jingle

pin mapping:
  load cell uses pins 2 and 3, as well as 1 GND and 2 PWR
  thermal printer uses pins 5 and 6, as well as 2 GND and 1 PWR
  speaker uses pin 8

parts of code sourced from:
  Nathan Seidle's HX711 Github Repository: https://github.com/sparkfun/HX711-Load-Cell-Amplifier/tree/master/firmware
  Adafruit's Thermal Printer library
  Arduino Project Hub's "How to make music with an Arduino" reference
  Great assistance from Prof. Zacharias
*/

  
//load cell
#include "HX711.h"
#define calibration_factor -7050.0 //from the Calibration sketch
#define zero_factor 8421804 //from the Calibration sketch
const int LOADCELL_DOUT_PIN = 2;
const int LOADCELL_SCK_PIN = 3;
HX711 scale;

//printer & poems
#include "poems.h"
#include "Adafruit_Thermal.h"
#include "SoftwareSerial.h"
const int TX_PIN = 6;
const int RX_PIN = 5;
SoftwareSerial mySerial(RX_PIN, TX_PIN);
Adafruit_Thermal printer(&mySerial); 
long pickNum = 0;

//jingle
#include "pitches.h"

void setup() {
  Serial.begin(9600);
  
  //load cell
  scale.begin(LOADCELL_DOUT_PIN, LOADCELL_SCK_PIN);
  scale.set_scale(-700000); //from the Calibration sketch
  scale.set_offset(879228); //Zero out the scale using a previously known zero_factor

  //printer
  pinMode(7, OUTPUT); digitalWrite(7, LOW);
  mySerial.begin(19200);  // Initialize SoftwareSerial
  printer.begin();  // Init printer (same regardless of serial type)
  printer.setFont('A');
  printer.setSize('S');

  //jingle
  pinMode (8, OUTPUT);
}

void loop() {
  //load cell
  Serial.print("Reading: ");
  Serial.print(scale.get_units(), 1); //scale.get_units() returns a float
  Serial.print(" lbs");
  Serial.println();

  //poem and printer
  if (scale.get_units()<=0.3) {
    printer.println(poems[0]);
  }
  else if (scale.get_units()>0.3 && scale.get_units()<=0.4) {
    pickNum=random(1,3);
    printer.println(poems[pickNum]);
  }
  else if (scale.get_units()>0.4 && scale.get_units()<=0.5) {
    pickNum=random(3,6);
    printer.println(poems[pickNum]);
  }
  else if (scale.get_units()>0.5 && scale.get_units()<=0.6) {
    pickNum=random(6,8);
    printer.println(poems[pickNum]);
  }
  else if (scale.get_units()>0.6 && scale.get_units()<=0.7) {
    pickNum=random(8,10);
    printer.println(poems[pickNum]);
  }
  else if (scale.get_units()>0.7 && scale.get_units()<=0.8) {
    pickNum=random(10,13);
    printer.println(poems[pickNum]);
  }
  else if (scale.get_units()>0.8 && scale.get_units()<=0.9) {
    pickNum=random(13,17);
    printer.println(poems[pickNum]);
  }
  else if (scale.get_units()>0.9 && scale.get_units()<=1) {
    pickNum=random(17,20);
    printer.println(poems[pickNum]);
  }
  else if (scale.get_units()>1) {
    pickNum=random(20,22);
    printer.println(poems[pickNum]);
  }
  
  //jingle
  tone(8,NOTE_F5);
  delay(250);
  tone(8,NOTE_G5);
  delay(250);
  tone(8,NOTE_A5);
  delay(250);
  tone(8,NOTE_G5);
  delay(250);
  tone(8,NOTE_A5);
  delay(250);
  tone(8,NOTE_F5);
  delay(400);
  tone(8,END);

//  //waits for 1 min before measuring the water again
//  delay(60000);
  delay(20000);
}

pitches.h

#define NOTE_B0 31
#define NOTE_C1 33
#define NOTE_CS1 35
#define NOTE_D1 37
#define NOTE_DS1 39
#define NOTE_E1 41
#define NOTE_F1 44
#define NOTE_FS1 46
#define NOTE_G1 49
#define NOTE_GS1 52
#define NOTE_A1 55
#define NOTE_AS1 58
#define NOTE_B1 62
#define NOTE_C2 65
#define NOTE_CS2 69
#define NOTE_D2 73
#define NOTE_DS2 78
#define NOTE_E2 82
#define NOTE_F2 87
#define NOTE_FS2 93
#define NOTE_G2 98
#define NOTE_GS2 104
#define NOTE_A2 110
#define NOTE_AS2 117
#define NOTE_B2 123
#define NOTE_C3 131
#define NOTE_CS3 139
#define NOTE_D3 147
#define NOTE_DS3 156
#define NOTE_E3 165
#define NOTE_F3 175
#define NOTE_FS3 185
#define NOTE_G3 196
#define NOTE_GS3 208
#define NOTE_A3 220
#define NOTE_AS3 233
#define NOTE_B3 247
#define NOTE_C4 262
#define NOTE_CS4 277
#define NOTE_D4 294
#define NOTE_DS4 311
#define NOTE_E4 330
#define NOTE_F4 349
#define NOTE_FS4 370
#define NOTE_G4 392
#define NOTE_GS4 415
#define NOTE_A4 440
#define NOTE_AS4 466
#define NOTE_B4 494
#define NOTE_C5 523
#define NOTE_CS5 554
#define NOTE_D5 587
#define NOTE_DS5 622
#define NOTE_E5 659
#define NOTE_F5 698
#define NOTE_FS5 740
#define NOTE_G5 784
#define NOTE_GS5 831
#define NOTE_A5 880
#define NOTE_AS5 932
#define NOTE_B5 988
#define NOTE_C6 1047
#define NOTE_CS6 1109
#define NOTE_D6 1175
#define NOTE_DS6 1245
#define NOTE_E6 1319
#define NOTE_F6 1397
#define NOTE_FS6 1480
#define NOTE_G6 1568
#define NOTE_GS6 1661
#define NOTE_A6 1760
#define NOTE_AS6 1865
#define NOTE_B6 1976
#define NOTE_C7 2093
#define NOTE_CS7 2217
#define NOTE_D7 2349
#define NOTE_DS7 2489
#define NOTE_E7 2637
#define NOTE_F7 2794
#define NOTE_FS7 2960
#define NOTE_G7 3136
#define NOTE_GS7 3322
#define NOTE_A7 3520
#define NOTE_AS7 3729
#define NOTE_B7 3951
#define NOTE_C8 4186
#define NOTE_CS8 4435
#define NOTE_D8 4699
#define NOTE_DS8 4978
#define END -1

poems.h

const char poems[][100] PROGMEM = {"Pour yourself something! Hydrate yourself :)",
                                "The Short Ones\nAtticus\n\nSometimes\nthe short poems\nare the hardest\nto write\nchange one word\n and the whole poem\navocados.",
                                "The Tiger\nNael,age 6\n\nThe tiger\nHe destroyed his cage\nYes\nYES\nThe tiger is out",
                                "Trees\nJoyce Kilmer\n\nI think that I shall never see\nA poem lovely as a tree.\nA tree whose hungry mouth is prest\nAgainst the earth’s sweet flowing breast;\nA tree that looks at God all day,\nAnd lifts her leafy arms to pray;\nA tree that may in summer wear\nA nest of robins in her hair;\nUpon whose bosom snow has lain;\nWho intimately lives with rain.\nPoems are made by fools like me,\nBut only God can make a tree.",
                                "A Fairy Song\nWilliam Shakespeare\n\nOver hill, over dale,\nThorough bush, thorough brier,\nOver park, over pale,\nThorough flood, thorough fire!\nI do wander everywhere,\nSwifter than the moon’s sphere;\nAnd I serve the Fairy Queen,\nTo dew her orbs upon the green;\nThe cowslips tall her pensioners be;\nIn their gold coats spots you see;\nThose be rubies, fairy favours;\nIn those freckles live their savours;\nI must go seek some dewdrops here,\nAnd hang a pearl in every cowslip’s ear.",
                                "Warning\nJenny Joseph\n\nWhen I am an old woman I shall wear purple\nWith a red hat which doesn’t go, and doesn’t suit me.\nAnd I shall spend my pension on brandy and summer gloves\nAnd satin sandals, and say we’ve no money for butter.\nI shall sit down on the pavement when I’m tired\nAnd gobble up samples in shops and press alarm bells\nAnd run my stick along the public railings\nAnd make up for the sobriety of my youth.\nI shall go out in my slippers in the rain\nAnd pick flowers in other people’s gardens\nAnd learn to spit.\nYou can wear terrible shirts and grow more fat\nAnd eat three pounds of sausages at a go\nOr only bread and pickle for a week\nAnd hoard pens and pencils and beermats and things in boxes.\nBut now we must have clothes that keep us dry\nAnd pay our rent and not swear in the street\nAnd set a good example for the children.\nWe must have friends to dinner and read the papers.\nBut maybe I ought to practice a little now?\nSo people who know me are not too shocked and surprised\nWhen suddenly I am old, and start to wear purple.",
                                "On the Ning Nang Nong\nSpike Milligan\n\nOn the Ning Nang Nong\nWhere the Cows go Bong!\nand the monkeys all say BOO!\nThere’s a Nong Nang Ning\nWhere the trees go Ping!\nAnd the tea pots jibber jabber joo.\nOn the Nong Ning Nang\nAll the mice go Clang\nAnd you just can’t catch ’em when they do!\nSo its Ning Nang Nong\nCows go Bong!\nNong Nang Ning\nTrees go ping\nNong Ning Nang\nThe mice go Clang\nWhat a noisy place to belong\nis the Ning Nang Ning Nang Nong!!",
                                "Lines on the Antiquity of Microbes\nStrickland Gillian\n\nAdam.\nHad ’em",
                                "Love After Love\nDerek Walcott\n\nThe time comes when, with elation\nyou will greet yourself arriving\nat your own door, in your own mirror\nand each will smile at\nthe other’s welcome,\nand say, sit here. Eat.\nYou will love again the stranger who was your self.\nGive wine. Give bread. Give back your heart\nto itself, to the stranger who has loved you\nall your life, whom you ignored\nfor another, who knows you by heart.\nTake down the love letters from the bookshelf,\nthe photographs, the desperate notes,\npeel your own image from the mirror.\nSit. Feast on your life.",
                                "How to Get There\nLeunig\n\nGo to the end of the path until you get to the gate.\nGo through the gate and head straight out towards the horizon.\nKeep going towards the horizon.\nSit down and have a rest every now and again,\nBut keep on going, just keep on with it.\nKeep on going as far as you can.\nThat’s how you get there.",
                                "Risk\nAnais Nin\n\nAnd then the day came,\nwhen the risk\nto remain tight\nin a bud\nwas more painful\nthan the risk\nit took\nto blossom.",
                                "Autumn\nT.E.Hulme\n\nA touch of cold in the Autumn night –\nI walked abroad,\nAnd saw the ruddy moon lean over a hedge\nLike a red-faced farmer.\nI did not stop to speak, but nodded,\nAnd round about were the wistful stars\nWith white faces like town children.",
                                "My life has been the poem I would have writ\nHenry David Thoreau\n\nMy life has been the poem I would have writ\nBut I could not both live and utter it.",
                                "Believe This\nWilhelmina Stitch\n\nYou’re winning. You simply cannot fail.\nThe only obstacle is doubt;\nThere’s not a hill you cannot scale\nOnce fear is put to rout.\nDon’t think defeat, don’t talk defeat,\nThe word will rob you of your strength.\n“I will succeed,” This phrase repeat\nThroughout the journey’s length.",
                                "The Shortest And Sweetest of Songs\nGeorge MacDonald\n\nCome\nHome.",
                                "The Duck Poem\nOgden Nash\n\nBehold the duck.\nIt does not cluck.\nA cluck it lacks.\nIt quacks.\nIt is specially fond\nOf a puddle or pond.\nWhen it dines or sups,\nIt bottoms ups.",
                                "147\nSappho, trans. Anne Carson\n\nsomeone will remember us\nI say\neven in another time",
                                "24A\nSappho, trans. Anne Carson\n\nyou will remember\nfor we in our youth\ndid these things\nyes many and beautiful things",
                                "The Old Pond\nMatsuo Bashō\n\nAn old silent pond\nA frog jumps into the pond—\nSplash! Silence again.",
                                "Lighting One Candle\nYosa Buson\n\nThe light of a candle\nIs transferred to another candle—\nSpring twilight",
                                "Spring Ocean\nYosa Buson\n\nSpring ocean\nSwaying gently\nAll day long.",
                                "The West Wind Whispered\nR.M. Hansard\n\nThe west wind whispered,\nAnd touched the eyelids of spring:\nHer eyes, Primroses."
                               };

 

]]>
Corrective Lens Tracker and Holder https://courses.ideate.cmu.edu/60-223/f2022/work/corrective-lens-tracker-and-holder/ Thu, 27 Oct 2022 18:39:21 +0000 https://courses.ideate.cmu.edu/60-223/f2022/work/?p=16660 By Sharon Li

The corrective lens tracker and holder allows an individual to store their contact lenses and glasses in two separate compartments, in addition to serving as a tracking device for how long a pack of weekly contact lenses has been open.

Demonstration of the corrective lens tracker and holder in demo mode, where 24 hours is converted to 10 seconds and the contacts are treated as 5-day lenses. The video is also sped up 4x.

Overview Photo

Inside the main compartment of the box, where you store your contact lenses, there is an IR sensor on the side that detects whether or not the lid is closed; this is what updates the LCD display.

In this box, there is a false floor that hides all of the hardware underneath so that the user cannot see it.

To the right of the box, there is a side compartment intended to hold or store glasses when they are not in use.

A user can open the lid to update the contact lens tracker information on the LCD display, which happens automatically if it hasn’t already been updated that day. The tracker also updates automatically at 11:59 pm even if the user did not put their contacts in that day.

If the user were to lose their pair of contacts, or for whatever reason need to open a new pack, they can hold down the button for 3 seconds to reset the "life" of the contacts tracker. The same 3-second hold also applies when it is time to change contacts.
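The hold-to-reset behavior is just the usual millis()-based long-press pattern; a stripped-down sketch of it is below, using the same pin and 3-second threshold as the full code further down (the Serial print stands in for the actual reset).

const int BUTTONPIN = 2;
const unsigned long HOLD_TIME = 3000; //milliseconds the button must stay held

unsigned long pressStart = 0; //when the current press began; 0 means not pressed
bool resetFired = false;      //so one long hold only triggers one reset

void setup() {
  pinMode(BUTTONPIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  if (digitalRead(BUTTONPIN) == HIGH) { //the arcade button reads HIGH while pressed
    if (pressStart == 0) pressStart = millis(); //the press just began
    if (!resetFired && millis() - pressStart >= HOLD_TIME) {
      Serial.println("reset the contacts tracker"); //the reset would happen here
      resetFired = true;
    }
  } else { //button released: start over
    pressStart = 0;
    resetFired = false;
  }
}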

Process Images and Review

During the beginning stages of developing my idea, the box had a lot of components and functions that didn’t provide any real support and were just extra embellishments for the design. As a result, I decided to simplify the box for easier use, and I added a side compartment to store my glasses since I have more than one kind of corrective lens. I reduced the number of buttons to one for a much cleaner look and kept the display on the entire time. Furthermore, I was able to add parts that improved functionality, such as a real-time clock that can track the days and an IR sensor to detect whether the lid is closed.

Initial ideation sketches of this tracker on the top and holder with a further and more finalized design of my sketch on the bottom after consulting with Professor Zacharias.

Following this, Professor Zacharias was able to help me with the CAD design of the box for the final product and all of its features. Specifically, I was able to construct the false floor of my project so that it hides all of the hardware, letting users operate the box seamlessly without seeing any of the wires or the Arduino. It was difficult to decide on the dimensions of the box without knowing how all of the hardware would fit inside the false floor, so I had to estimate and allot more space than needed in case it did not fit.

Final CAD design of the corrective lens holder and tracker with the help of Professor Zacharias.

While assembling the product after laser cutting the plywood from the CAD design, the hard part was making sure the code worked well, which meant figuring out the logistics of how I wanted the box to operate: how the lid should function and how the box would track the days. I decided that the box would update itself at the end of every day, even if the contacts were not used, because my contacts are for 7-day use regardless of whether I wear them every day. I also had to do some research here since I initially did not know they were only for 7-day use, so I had to accommodate that fixed time period. To handle it, the box updates automatically every night, and opening the lid also updates it, so users always see the most recent count of how many days the contacts have been open.

Assembling and wiring the hardware of my corrective lens holder as well as setting up all the components individually and then eventually altogether.

Building and attaching the laser-cut pieces of plywood together and mounting the LCD display as well as the button to the final prototype.


Testing the IR sensor for functionality in conjunction with the LED display and that it would update simultaneously upon “opening” and “closing” the lid, which in this case was my hand.

During this process, I encountered many difficulties when writing the code: text would not show up on the display, the timing would not match up, and the counter would continue counting even after 7 days. To solve these problems, I added many booleans to track the different states I didn’t expect to have to consider earlier.

A snippet of my demo code used for the final prototype, where many booleans have to be considered in order for the box to update under the right conditions and for the LCD display to show the correct information.

Discussion

Despite my initial concern throughout the ideation process that I would not be able to brainstorm a viable product that would ultimately be useful for me, I am proud of my progress. Seeing my project literally come off the lines of my sketches into an actual, tangible product is exciting. I satisfied my goals for this project since I was able to target a recurring problem I have with keeping track of my contacts and finding a place to store my glasses, so it’s like hitting two birds with one stone. It was super helpful to receive feedback from Professor Zacharias and my peers during the ideation process, and it helped me look at my project from a different perspective.

However, some improvements could be made to my project now that I have seen the final product. During the final critique, a peer commented, "Portability if you want to bring it when you travel? Like is it possible to make it a bit flatter? Do you need such a big screen?" I agree with this comment, as it was one of my biggest concerns while prototyping and figuring out the measurements; it was difficult to estimate the space needed for the hardware. I would have loved to make it smaller so it would not take up as much space in one’s room, especially since a contact lens case is usually the size of an eraser. Another comment I agreed with was, "What if you have contacts that are monthly/yearly? How easy would it be to change it?" Adding more features to my box would have been helpful and added to its functionality. Since this box was tailored to me, I did not need to think about lenses past the seven-day use. However, it could be helpful to have a feature that manually changes the number of days the contacts can be used or, more generally, tracks any item with a limited use period.

After this project, I would definitely build another iteration that revolves around the concerns raised in the critiques I received. For example, I would make the box smaller and more compact by using a different microcontroller and a smaller LCD display. In addition, I would definitely try to add more on the programming side and allow users to change the number of days, amongst other things. Overall, I learned that I had difficulty coding this project, specifically integrating a time aspect into it. On the other hand, I enjoyed the hands-on aspects of building the project and laser cutting it, and particularly the ideation portion of coming up with this idea and sketching it.

Functional Block Diagram and Schematic

Code

/*
  Corrective Lens Tracker and Holder
  Sharon Li

  The code below keeps track of how long a pack of contact lenses has been open,
  using date information from a real-time clock (DS3231) shown on an LCD display.
  The tracker updates once every day, either manually when the user opens the lid
  of the box for the first time that day or automatically at 11:59 PM. After 7 days
  (the maximum number of days a pair of contacts can be used for), the LCD display
  warns the user to open a new pair of contacts.

  The code below additionally includes a demo mode where 24 hours is converted to
  10 seconds.

  Pin Map:
  Pin | role | Description
  ----------------------------------------
  2 | input | Arcade Push Button input
  A0 | input | IR Proximity Sensor input
  SCL | input | SCL pin on Real Time Clock (DS3231)
  SDA | input | SDA pin on Real Time Clock (DS3231)
  A4 | input | SDA pin on I2C LCD display
  A5 | input | SCL pin on I2C LCD display
  5V | output| Power for the IR sensor, RTC, and LCD display
  GND | input | Ground for all components

  Code Credit:
  RTC DS3231 code sourced from the DS3231 library example code.
  IR Proximity Sensor code sourced from the NewPing library example code.
  isLeapYear and changing from date format to day of the year code (Lines 65-66 & 102-109)
  sourced from https://stackoverflow.com/questions/19110675/calculating-day-of-year-from-date.

*/


#include <Wire.h>
#include <LiquidCrystal_I2C.h>
LiquidCrystal_I2C screen(0x27, 20, 4);

#include <DS3231.h>
DS3231  rtc(SDA, SCL);

const int IRSENSOR = A0;
const int BUTTONPIN = 2;

int contactsLife = 0;

bool lidReset = true;
bool dayUpdated = false;
bool oldContacts = false;

unsigned long buttonStart;
unsigned long elapsedTime;
unsigned long regTimer;
unsigned long timeChecker;
int resetButton;

Time  t;

int month;
int year;
int day;

bool isLeapYear(int year) {
  return year % 4 == 0 && (year % 100 != 0 || year % 400 == 0);
}

void setup() {
  pinMode(IRSENSOR, INPUT);
  pinMode(BUTTONPIN, INPUT);
  Serial.begin(9600);

  screen.init();
  screen.backlight();

  rtc.begin();

  ////   The following lines can be uncommented to set the date and time
  //    rtc.setDOW(TUESDAY);     // Set Day-of-Week to SUNDAY
  //    rtc.setTime(9, 49, 00);     // Set the time to 12:00:00 (24hr format)
  //    rtc.setDate(, 11, 2022);   // Set the date to January 1st, 2014

}

void loop() {
  /******************************************************************************************************
      Code for updating the tracker using the IR Proximity Sensor or automatically and 7-day warning
  ******************************************************************************************************/
  //  Get data from the IR Proximity Sensor
  int lidSensor;
  lidSensor = analogRead(IRSENSOR);
  delay(1000);

  // Get data from the DS3231
  t = rtc.getTime();
  month = t.mon;
  day = t.date;
  year = t.year;

  // 2d table mapping the days of the year to each month
  int daysToMonth[2][12] =
  {
    { 0, 31, 59, 90, 120, 151, 181, 212, 243, 273, 304, 334 },
    { 0, 31, 60, 91, 121, 152, 182, 213, 244, 274, 305, 335 },
  };

  // function that converts the date format to the day of the year
  int dayOfYear = daysToMonth[isLeapYear(year) ? 1 : 0][month - 1] + day;

  // these trackers are static so their values persist between passes through
  // loop(); as plain locals they would reset on every iteration
  static int contactsStart;        // the day of year when the contacts were opened
  static bool startChecker = true; // true until the start date has been stored
  static int dayChecker = 0;       // the last day of year on which the tracker rolled over
  int contactsCurr = dayOfYear;    // the current day of the year

  if (startChecker) { // first pass: store the start date
    contactsStart = dayOfYear; // sets the start date to the current date
    dayChecker = dayOfYear;
    startChecker = false;
  }

  if (lidSensor <= 50 and lidReset and not dayUpdated and not oldContacts) { // adds one day if lid opened
    lidReset = false;
    contactsLife = contactsLife + 1;
    dayUpdated = true;
  }

  if (dayUpdated == false and contactsCurr - dayChecker >= 1 and not oldContacts) { // adds one day automatically
    contactsLife = contactsLife + 1;
    dayUpdated = true;
  }

  if (contactsCurr - dayChecker >= 1 and not oldContacts) { // a new day has started since the last update
    dayUpdated = false;
    dayChecker = contactsCurr; // remember which day the tracker last rolled over on
  }

  ////   The following lines can be uncommented to start DEMO MODE and comment lines 111-135
  //  regTimer = millis();
  // if (lidSensor <= 50 and lidReset and not dayUpdated and not oldContacts) { // adds one day if lid opened
  //    lidReset = false;
  //    contactsLife = contactsLife + 1;
  //    dayUpdated = true;
  //  }
  //
  //  if (dayUpdated == false and regTimer - timeChecker > 9000 and not oldContacts) { // adds one day automatically
  //    contactsLife = contactsLife + 1;
  //    dayUpdated = true;
  //  }
  //
  //  if (regTimer - timeChecker >= 10000 and not oldContacts) { // checks if it has been 10 secs
  //    dayUpdated = false;
  //    timeChecker = regTimer;
  //  }

  if (lidSensor > 50 and not lidReset and not oldContacts) { // closed lid
    lidReset = true;
  }


  if (contactsLife >= 7) { // show warning message after it has been 7 days
    oldContacts = true;
    screen.clear();
    screen.home();
    screen.print("Time For New Contacts!");

    while (resetButton == LOW) { // if the button is pressed to reset, resets the tracker
      resetButton = digitalRead(BUTTONPIN);
    }

    contactsLife = 0;
    oldContacts = false;
    startChecker = true;
    delay(1000);
    resetButton = LOW;
  }

  /******************************************************************************************************
       Code for arcade button functions and LCD display resets
  ******************************************************************************************************/

  resetButton = digitalRead(BUTTONPIN);

  Serial.println(resetButton);

  if (resetButton == HIGH and buttonStart == 0) { // starts timer when button is pressed
    buttonStart = millis();
  }

  if (resetButton == HIGH and buttonStart != 0) { // pressed button and timer is on
    resetPrint();
    elapsedTime = millis();
    if (elapsedTime - buttonStart >= 3000) {
      buttonStart = 0;
      contactsLife = 0;
      oldContacts = false;
      screen.clear();
      defaultPrint();
    } else {
      defaultPrint();
    }
  }

  if (resetButton == LOW) { // when button is not pressed
    buttonStart = 0; // clear the hold timer so the next press starts a fresh 3-second count
    defaultPrint();
    clearPrint();
  }
}

// function that displays the reset screen on the LCD display
void resetPrint() {
  screen.setCursor(0, 2);
  screen.print(String("Hold for 3 seconds"));
  screen.setCursor(0, 3);
  screen.print(String("to reset"));
}

// function that displays the default screen on the LCD display
void defaultPrint() {
  screen.home();
  screen.print(String(" Contacts Life Span:"));
  screen.setCursor(7, 1);
  screen.print(contactsLife + String(" day(s)"));
}

// function that clears the screen on the LCD display due to some bugs
void clearPrint() {
  screen.setCursor(0, 2);
  screen.print(String("                  "));
  screen.setCursor(0, 3);
  screen.print(String("        "));
}

 

 

]]>