Joystick Actuated TV Remote by The Firs: Final Documentation
https://courses.ideate.cmu.edu/60-223/s2021/work/joystick-actuated-tv-remote-by-the-firs-final-documentation/

1. Introduction:

In this project, each group in our class was assigned to work with a client with a physical disability to create a prototype of a product that could potentially improve some aspect of their daily life. Our group worked with Amy, a woman who is paralyzed from the shoulders down, with the goal of creating a TV remote that uses a method of interaction she can comfortably control, giving her a greater degree of autonomy. Here's a link to our previous interview documentation.

2. What we built:

Our project is a TV remote that uses a joystick as the method of input rather than the traditional grid of small, soft buttons. First, we narrowed down the essential functions of the TV to a power button, volume up/down, and channel up/down. We mapped volume up/down to the up/down directions on the joystick and channel up/down to left/right. We also enlarged the power button to make it easier to push. These controls are processed, converted into the IR codes that match Amy's TV, and broadcast to the TV through an array of IR LEDs.

Final Product: Joystick Handle

Final Product: TV Control Box

 

 

A view of the inside of the box, specifically showing the LED configuration: 4 super bright IR LEDs in parallel, each with its own 1000 ohm resistor, all connected to the same MOSFET, which connects to a pin on the Arduino.

 

A top view of the control box. The indicator LEDs corresponding to different instructions can be seen here. These LEDs show Amy which command was actually sent, so she knows whether she really pressed the power button or accidentally changed the channel instead of the volume, in case her muscles don't do exactly as she intends.

Unfortunately, we were not able to attach the joystick handle to the wiring of the final product because the parts for the joystick itself arrived too late. Ideally, we would drill a hole in the bottom of the grip that fits the diameter of the joystick shaft, at a depth that provides stability without bringing the grip too close to the surface of the box.

Narrative

Amy gets home and is bored of browsing Facebook, so she decides to watch TV! Luckily, her new joystick actuated TV remote is here for her! Her father mounts it to her lapboard and she can use her TV without assistance! After watching a movie, she decides she wants to switch the channel to watch HGTV and moves the joystick to change the channel. Unfortunately the host’s voice on HGTV is quiet, but she can fix that by using the joystick to turn up the volume! Eventually she decides to stop watching television, so instead of having to ask for assistance like she normally would, Amy simply presses the power button and can enjoy the rest of her day.

3. How we got here (prototype and process):

Part 1: Joystick Form Development (Daniel Zhu)

Question: How can we optimize the form of the joystick itself to make it comfortable and easy for Amy to use?

During the interview we learned that Amy had limited movement in her arms due to muscle atrophy that had developed over years of bed rest after her accident. To begin the prototyping process, I brainstormed a number of different shapes focused on limiting the amount of work that Amy would need to do with different parts of her hand.

Each of the forms in these sketches focused on a different method of gripping the joystick. At this point, I was thinking about how sitting up or lying down in bed would affect Amy's posture, the muscles she would use to move her arm, wrist, and hand against gravity, and what support she would have in either situation from the bed or wheelchair. Eventually, I narrowed these sketches down to three broad approaches based on the part of the arm or hand each demanded the most fine motor movement from. My plan was to present each of these approaches to Amy during our feedback session and further develop the one that she felt would best fit her needs.

From left to right, the prototypes emphasized movement in: the fingers with the hand and palm static, the thumb and forefinger with everything else static, and the wrist or arm with the hand static. I tried to target parts of the arm and hand requiring a decreasing level of fine motor control from left to right. I carved each of these prototypes from a block of foam that I cut in half and then laminated back together with wood glue. Detailed images of how each model is held are shown below:

I also did additional rough exploration in clay to get an idea of how each grip would feel in the hand before I fully fleshed out the idea in foam. I roughly shaped the clay into a form approximating the grip to see how it would fit my hand. The picture below shows one such model I created to help me narrow down the forms of the three foam models I eventually made.

Prototype 2:

Question: How large should the enclosure be, and how should we lay out the input devices so they are easiest for Amy to use? In addition, how well would a scroll wheel input work versus a joystick?

 

 

Prototype Enclosure

For the enclosure prototype, we used a cardboard box with mock scroll wheels and a power button to indicate what the layout would be, so that we could get feedback from Amy on the design. During the prototype meeting, we got the valuable feedback that we needed about a hand-width between input devices to ensure that Amy wouldn't accidentally press any buttons. We also decided against the scroll wheel: it would have required the device to be much wider, since the input devices need to be far apart, and it would have been more technically difficult to implement.

 

Prototype 3:

This prototype was designed to investigate how the IR LEDs should be configured to communicate between the box and the TV.

 

The first configuration, with one IR LED pointing directly at the TV. Using an IR receiver, I was able to verify that it was accurately positioned and had some leeway even if the apparatus was shifted slightly left, right, or backwards.

 

This is the second configuration. These would all be IR LEDs, powered through a MOSFET (this configuration was actually built first, so I had already moved the MOSFET and the actual IR LED to the other configuration; they are not featured in this picture).

My prototype involved creating two mini configurations and comparing them. In the first one, the LEDs are aligned in a semicircle and would sit stationary in front of the TV so that the IR signal is aimed directly at the TV receiver at all times. In the second one, the LEDs are in a full circle, representing LEDs that surround and move with the joystick.

 

While designing the first configuration, we thought we could use a radio transmitter to get the signal from Amy to the IR LED, which would then translate and transmit the requested instruction to the TV, making sure it always hits the receiver. Since only one LED needs to be pointed at the TV, the other LEDs could be used to confirm that the right request (power, channel up/down, volume up/down) was being sent, by flashing a corresponding color. In the second configuration, the IR LEDs are in a full circle to represent being attached to the joystick (shown in the ideation sketch). If Amy was moving around and the chair with the control box was facing different directions in the room, having many IR LEDs that move with the joystick would make it more likely that the signal gets to the TV no matter what direction she's facing. Since I only had one super bright IR LED, I had to make observations by using colored LEDs and checking whether there was any space on the perimeter of my configuration that the light didn't reach when I moved the breadboard around. It would have been more helpful to test this with actual super bright IR LEDs and to check whether my IR receiver got the signal, and the right signal, when the breadboard was held at different angles.

An ideation sketch for the first configuration. It would require two Arduinos, plus the radio module, if we were to pursue it. If we were to use an IR receiver instead (another thing I considered), that would be pointless, because then we might as well send the IR signal directly to the TV, whereas a radio signal might travel further from any angle Amy faces.

This sketch better shows how the LEDs would fit onto the joystick and move with it. It would be more practical than using radio to relay the instruction to an IR LED, because two Arduinos would not need to be wired and powered, making the device easier to take care of if something malfunctioned on either the sender or receiver end of the previous configuration.

 

 

Simple code to make the IR LED send a signal on demand so I could see how far it would transmit. I also used an IR receiver to check the signal it was receiving. This code was adapted from the example IRLib2 "send" code. I didn't think much about whether we were using the SONY or NEC protocol at the time, so I experimented with both signals to see if they did anything different other than sending different numbers (they didn't, although the Sony code seemed to be received by the receiver more often than the NEC code).
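For reference, a minimal version of that test sketch (adapted from the IRLib2 send example and consistent with our final code, which drives the IR LED array from pin 3 and uses the 12-bit Sony codes; the Serial trigger is how I fired it by hand) looks something like this:

#include <IRLibAll.h>        // IRLib2 umbrella header

IRsend mySender;             // IRLib2 sends on pin 3 by default on most AVR boards

const int VOLUP = 0x490;     // Sony 12-bit volume-up code
const int BITS  = 12;

void setup() {
  Serial.begin(9600);
  Serial.println("Type any character to send a volume-up burst.");
}

void loop() {
  // Send one IR burst each time a character arrives over Serial,
  // so transmission range and aim can be tested by hand.
  if (Serial.available() > 0) {
    Serial.read();                     // discard the character
    mySender.send(SONY, VOLUP, BITS);  // emit the Sony code
    Serial.println("sent");
  }
}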


Although I couldn't fully experiment with the IR LED configurations given that I only had one super bright LED, by observing where the light fell and mimicking how the device would be used by rotating the board, I was able to come to some conclusions. I observed a tradeoff between how consistently the IR signal would be received by the TV and the range of angles from which Amy would be able to send it: the second configuration's signals wouldn't necessarily hit the receiver from *any* angle, but it allowed for more movement than having a single IR LED that always needed to face forward. If we used the radio module with configuration 1, we could perhaps get both accuracy and range of motion, but the need for two Arduinos would be quite annoying for the user, especially if one dies or starts malfunctioning. I found this to be a problem throughout the process when trying to emit signals from one Arduino and receive them on another; sometimes the program would simply stop running after a while. Also, the LED could send the exact same signal every time, but if the IR receiver was facing a different angle, it would sometimes receive different codes, which was odd.

However, after speaking with Amy and asking her how she moves around, we learned that she's pretty much always facing the general direction of the TV when she's in her wheelchair. Therefore, the IR LEDs don't need to span a whole range of space in a circular configuration. Rather, if several of them face one general direction (forward), the signal should reliably reach the TV, without needing the radio module or a second Arduino, since we would be using super bright IR LEDs. Since Amy would place the TV controller on top of a lap desk, the LEDs really wouldn't need any special configuration as long as they could emit through the box. One piece of positive feedback was the use of indicator lights as in configuration one. We decided to carry this aspect into our final product, which also helped a lot with debugging, since we integrated the corresponding LEDs/colors while building the circuits.
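For anyone repeating the receive-side check, a minimal sketch in the spirit of the IRLib2 dump example (the receiver pin used here is an assumption, not necessarily what we wired) would print whichever protocol and code arrives, making the inconsistent readings at off angles easy to spot:

#include <IRLibAll.h>      // IRLib2 umbrella header

IRrecvPCI myReceiver(2);   // IR receiver signal pin (assumed to be pin 2)
IRdecode myDecoder;        // decodes captured timing into protocol + value

void setup() {
  Serial.begin(9600);
  myReceiver.enableIRIn(); // start listening
  Serial.println("Ready to decode IR");
}

void loop() {
  // When a full frame has been captured, decode and print it,
  // then re-arm the receiver for the next frame.
  if (myReceiver.getResults()) {
    myDecoder.decode();
    Serial.print("Protocol: "); Serial.print(myDecoder.protocolNum);
    Serial.print("  Value: 0x"); Serial.println(myDecoder.value, HEX);
    myReceiver.enableIRIn();
  }
}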

 

Note: the configurations were disassembled and the LEDs/resistors/Arduinos were reused in the final product, and I didn't save a video; all that happened was that the LEDs flashed when I typed a letter into the Serial monitor. I also moved the breadboard around as if it were on a joystick and made qualitative observations as best I could, but there wouldn't be much to see in a video of that either.

 

Prototype to Process: Joystick (Daniel)

With these prototypes completed, we went into the feedback session with Amy. During this session, we learned that Amy not only had atrophied muscles but also was unable to move or feel anything below the elbow. On the one hand, this was a pretty big shock because we had somehow failed to learn this critical factor during our first meeting. On the other hand, it presented a good learning opportunity: we made sure we had a clear picture of what she could and could not do before the final prototype. Luckily, while the first two prototypes were unusable, the prototype that kept the hand static was similar to a product she already used and could be developed further. During the feedback session, Amy shared a link to an ergonomic joystick she already used for her wheelchair that worked well for her. Because this design was already confirmed to work for her, and we wouldn't be able to test other designs with her in person before the final model, we decided to base the final joystick design on the one she linked. Based on the design of the Panther joystick, I identified the key factors behind what made it work and created a few sketches to carve into a final wooden model:

With the sketches complete and a set of measurements mapped out, I glued a piece of wood split in half back together and carved out the final form, roughly checking the ergonomics against my own hand as I went while leaving space for tolerance to approximate how Amy's hand might fit. I traced outlines of the form on the surface of the block before carving away material or cutting on the band saw. Pictures of the carving process are shown below:

 

Process of TV Box (Nish/Kevin)

We went through several iterations of our enclosure, using the prototype enclosure as a baseline. The second iteration (above) was a learning experience for us. We thought the prototype was a little bulky, but after having the second iteration in hand, we realized it was too small to meet the design requirement that inputs be far enough apart for Amy to press buttons or move the joystick without making a mistaken input.

Third/Final Enclosure

The final enclosure worked out great! It had sufficient space for all the electronics, and we intentionally made it with clear acrylic so that the IR signals could pass through the walls of the enclosure and reach the TV. Due to the rushed timeline of putting the box together, it is functional but not fine-tuned. In the future, we would cut the wires to an appropriate length and solder them all into the protoboard, which would be attached securely to the box, and the LEDs would also be soldered to the base. Coverings on the LEDs would make the box more aesthetic. A more opaque material would have been preferred, but we found we needed this level of transparency for the IR LEDs to transmit reliably. We would also ideally have a closed back and a battery-powered Arduino, so that the box can be moved without being plugged into a computer.

 

Here, we are testing the lineup of LEDs to make sure the signal goes through our chosen material and is received correctly.

In this image, we tested the interaction between the button and the joystick together, making sure the correct LED lit up when the joystick was pushed in the corresponding direction in our code. I ran into some errors in this portion: the joystick was behaving opposite to how we wanted and not lighting up the right LED, and it turned out to be a mistake in the code where we were accidentally reading the joystick's X axis instead of its Y axis to test for channel up or down. Also, I had short-circuited the button for a while, which caused the Arduino to malfunction; I thought it was because it couldn't handle both the new joystick and the button at the same time, but that was not the issue. I also repeatedly moved where the LED was plugged in so I could test the different controls.

Even though this button was quite straightforward, I still tested it to make sure it was working and that I had wired it correctly (normally open versus normally closed prongs), since the button has three prongs and I needed to ensure I was using the right ones. I ran into an issue where it printed all 1s unless held at a certain angle, and it turned out this was because ground was not plugged in. Once I fixed that, I tested that the button was debounced in our code, and then tested it together with the joystick.
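As a reference, the quick button check was essentially just reading the pin and printing it; a minimal sketch, assuming the button is on pin 4 with the internal pull-up as in our final code, looks like this:

const int POWERBUTTON = 4;   // same pin as in the final code

void setup() {
  // With INPUT_PULLUP the pin reads 1 when the button is open
  // and 0 when it is pressed (pulled to ground).
  pinMode(POWERBUTTON, INPUT_PULLUP);
  Serial.begin(9600);
}

void loop() {
  // If ground isn't actually connected, this prints all 1s no
  // matter what the button does -- exactly the symptom we saw.
  Serial.println(digitalRead(POWERBUTTON));
  delay(100);
}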

Here, I'm testing the MOSFET with a smaller LED to make sure my wiring is correct and the signal is going through. I started with a smaller MOSFET from the Arduino kit and regular IR LEDs, then switched to the bigger MOSFET and reconfigured the circuit once the smaller version was working. At this time, I wasn't sure if we would get enough super bright LEDs for the project and was confused as to why the one super bright LED I did have wasn't working. After a couple of hours, it turned out that, as previously described, I had plugged power into ground :D. After figuring out the error, I proceeded to add more LEDs (using the super bright ones now), each with its own resistor, in parallel, keeping the resistance low enough that enough current still flows through each LED while making sure I didn't blow any of them (although I later learned that it was very unlikely I would blow the LED anyway).
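As a rough sanity check on those values (assuming a 5 V supply and a typical IR LED forward drop of roughly 1.2-1.5 V): each 1000 ohm branch passes about (5 - 1.4) V / 1000 Ω ≈ 3.5 mA, so the four parallel branches together draw only around 14 mA through the MOSFET, far below what either the super bright LEDs or the MOSFET can handle, which is consistent with it being hard to blow anything in this configuration.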

 

What didn’t go to plan

In general, we didn't run into any unexpected issues that we were responsible for. Our circuit design and code worked fine, and we budgeted enough time to make several iterations to improve the design of our box. The main issue we ran into was that our parts order took a very long time to ship due to the lithium-ion battery we ordered. The battery was meant to be a nice feature, since it would be rechargeable with a common USB cable, but because LiPo batteries must be shipped by ground, all of the parts for our project, including the joystick, did not arrive until the day before our final presentation and demo. This made every part of our project more difficult: we were unable to test-fit any of our manufactured parts with the ordered parts, and because of time constraints we could not adapt our ergonomic joystick handle to the joystick. Another small issue, noted in our final crit feedback, was that because we could not test-fit all of the components, the ergonomic joystick handle would have blocked some of our indicator LEDs for volume up/down and channel up/down had we had time to attach it to the joystick. However, the light would still shine through and reflect enough to be seen even underneath the final handle.

4. Conclusions and lessons learned:

We received lots of feedback about our joystick handle, first and foremost: "The joystick handle seems really ergonomic," and "The wooden joystick handle also adds a nice natural aesthetic to the whole device!" As seen above, the thought process behind and creation of the joystick handle was a huge part of our project, and it was a unique experience to model it after a discontinued product that Amy already owned and reproduce it in a different material. A large reason the handle "seems really ergonomic," even in wood, is probably the emphasis on Amy's resting position and the fact that it requires no pressure from her fingers to pull the joystick back. It would have been really cool to actually get the handle onto the controller and see whether Amy could properly use it.

Someone brought up that the height of the box plus the height of the handle may make it difficult for Amy to maneuver the device in real life if it's sitting on top of her lap desk: "Does the added height of the box+joystick+handle hurt the ergonomics due to the higher arm position needed to manipulate it? Or would the box be mounted to a lower surface?" Indeed, looking back, Amy's current joystick is set below her wheelchair, so she doesn't need to move her arm up to place her hand on it. Although we specifically talked with Amy about putting the device on top of her lap desk, this definitely could have been a problem, and to solve it we may need to create an extension that lets the box attach to the side of her wheelchair so the handle sits below armrest level.

In fact, throughout the process we found that even items we did discuss with Amy sometimes didn't tell the full story, or we were missing part of the picture regarding her disabilities. For example, during the interview, although we asked directly what her disability was, we didn't know she was paralyzed from the neck down until about halfway through, as she initially only mentioned the bed sores, which were most recent for her. This is worth noting when working with people who have had disabilities for a long time: in the same way we don't notice small things a disabled person might not be able to do, they may not realize that routine parts of the way they've been living for years differ from ours and may require special consideration. Even when you think you've gotten a direct answer, it might not capture the full story or context, because details that seem trivial to the client may be left out at first. It's also not their job in this scenario to think of everything that's viable. Even though Amy made many suggestions about possible ways to use the device, we could have done a better job of analyzing them from her perspective to see whether they would still be viable after the necessary modifications (e.g., putting the device on the lap desk, given that our device is much taller than her tracking ball). If we had done a mock run-through as if we were Amy using the box (had we been able to assemble the box, since neither part individually looked too big), we might have caught something like this earlier in our process.

It was also noted that the indicator lights were helpful: "Love the light indicators, the little things make a big difference." I was initially surprised that many people commented on their utility, as they began as a component that seemed like "something extra" to add to our device. The feedback helped us realize that small add-ons that initially seem trivial can actually be quite useful when designing for people with disabilities. It was a good lesson to take something that arose from our own debugging process (our need to see what was being pressed and sent) and carry it through to the final product. Normally we might not have seen the need for it: if we press the wrong button, it's easy to quickly move our fingers and undo the action, but if every action requires a large movement that isn't easy to make (pushing a joystick all the way to the left or right), it's hard to know, after multiple movements, which were received correctly. There was also a concern about how the lights and the handle would work together: "It seems like the handle is blocking the LEDs?" Although it seems like the furthest LED wouldn't be visible with the handle placed on top, we believe the light would scatter enough that even with something above it (but not touching, as the handle would hover above the light), the color would illuminate the bottom of the wood just enough to tell whether it's on or off. Perhaps we could have made that furthest LED (on the side of the TV controller closest to the TV and farthest from Amy) a more intense color than white so the light would show better. Or we could have altered the box so the lights are all in a row on the side closest to Amy while still corresponding to the same actions.

We also received several helpful pieces of feedback verbally. For example, several people brought up that Amy's arm sometimes has spasms and asked whether there was anything in place to keep the device from being knocked over. A rubberized mat was suggested so that the acrylic box could sit securely on her lap desk. We were initially thinking about using Velcro for this purpose, but the rubberized bottom would perhaps be better, as it allows the mat to be removed from the lap desk if Amy doesn't want a Velcro patch permanently taking up space. The TV controller could still be knocked out of the way if necessary, so it wouldn't get in the way during an emergency, but in normal use the rubberized mat would keep it from slipping around. Further, we realized that the back panel, if we added it, should open on a hinge so that someone could get into the enclosure to fix anything or change the battery we had intended to include.

 

As touched on above, it was a little difficult working remotely with our client on this project. We had trouble initially understanding what her disability was, and it was hard for her to show us over Zoom precisely because of the disability: someone else would need to angle the laptop camera for us to see anything, which was not always very clear. Most importantly, we were never able to truly gauge Amy's range of motion and strength. We still don't know if this is a device she could use for extended periods or if, even though she might have the range of motion, she would tire very quickly. I think we did a good job of compensating by modeling our device after mechanisms she already had, such as the ergonomic joystick, and by asking her about her comfort with buttons and joysticks, but it would have been extremely helpful to test her range of motion during our prototype stage.

 

One of the biggest lessons we learned from the experience was to make as few assumptions as possible and confirm everything with our client. During the first interview, when we were discussing the physical aspects of Amy's disability, we followed a line of questioning about how it impacted her life. Through this part of the interview, we learned a lot of anecdotes about her daily life: her schedule, how she moved around, what she needed help with, and so on. However, this approach ended up not being thorough enough, and we missed a critical detail of her disability: that she was also paralyzed below the elbow. While the variety of approaches we generated for the joystick handle ended up being broad enough to still develop into a usable product, we learned this major detail much later than we should have.

 

We also learned to overestimate the size you need for an enclosure and then scale down. We initially thought that the first prototype enclosure was larger than it needed to be, so we took a few inches off each dimension of the box to save space. This resulted in our second enclosure iteration being much smaller than it should have been, requiring a complete redesign.

5. Technical details:

Schematic and Block Diagram:

/*
   60-223, Project 3 (Firs): Joystick Actuated TV Remote for Amy
   Daniel Zhu (dszhu), Nish Nilakantan (anilakan), Kevin Bender (kbender1)
   time spent: ~50 hours

   Description: This code is for a project that uses a joystick and a large pushbutton to actuate a high-power
   IR LED array, using an adaptive design meant to fit the needs of our client Amy, who is mostly paralyzed below
   her shoulders with limited use of her right hand.
   

   Collaboration and sources:
   1) Our joystick design was inspired by the Panther ergonomic joystick
   Pin mapping:

   Arduino pin | type   | description
   ------------|--------|-------------
   A0           Input     Joystick X-axis pin
   A1           Input     Joystick Y-axis pin
   3            Output    IR LED Output
   4            Input     Power Button Input
   5            Output    Volume Up Indicator LED
   6            Output    Volume Down Indicator LED
   7            Output    Channel Up Indicator LED
   8            Output    Channel Down Indicator LED
   9            Output    Power Button Indicator LED
*/
#include <IRLibAll.h>

// Pin mappings
const int JOYSTICKXPIN = A0;
const int JOYSTICKYPIN = A1;
const int IRLEDS = 3;
const int POWERBUTTON = 4;
const int VOLUPLED = 5;
const int VOLDOWNLED = 6;
const int CHANUPLED = 7;
const int CHANDOWNLED = 8;
const int POWERLED = 9;

IRsend mySender;
// ir codes for sony tv
const int POWER = 0xa90;
const int BITS = 12;
const int VOLUP = 0x490;
const int VOLDOWN = 0xc90;
const int CHANUP = 0x90;
const int CHANDOWN = 0x890;

// initialize state variables
int chanUpState = 0;
int chanDownState = 0;
int volUpState = 0;
int volDownState = 0;
int powerState = 0;

//initialize last state variables
int lastChanUpState = 0;
int lastChanDownState = 0;
int lastVolUpState = 0;
int lastVolDownState = 0;
int lastPowerState = 0;

//initialize debouncing timers
unsigned long lastChanUpDebounce = 0;
unsigned long lastChanDownDebounce = 0;
unsigned long lastVolUpDebounce = 0;
unsigned long lastVolDownDebounce = 0;
unsigned long lastPowerDebounce = 0;

const unsigned long DEBOUNCEDELAY = 50;
const unsigned long LEDTIME = 2000;

const int CHANUPTHRESH = -350;
const int CHANDOWNTHRESH = 350;
const int VOLUPTHRESH = 350;
const int VOLDOWNTHRESH = -350;

unsigned long chanUpLEDLimit = 0;
unsigned long chanDownLEDLimit = 0;
unsigned long volUpLEDLimit = 0;
unsigned long volDownLEDLimit = 0;
unsigned long powerLEDLimit = 0;

int getJoystickX() {
  return map(analogRead(JOYSTICKXPIN), 0, 1023, -512, 512);
}

int getJoystickY() {
  return map(analogRead(JOYSTICKYPIN), 0, 1023, -512, 512);
}

int isChanUp() { // joystick X pushed past the channel-up threshold (< -350)
  return getJoystickX() < CHANUPTHRESH;
}

int isChanDown() { // joystick X pushed past the channel-down threshold (> 350)
  return getJoystickX() > CHANDOWNTHRESH;
}

int isVolUp() { // joystick Y pushed past the volume-up threshold (> 350)
  return getJoystickY() > VOLUPTHRESH;
}

int isVolDown() { // joystick Y pushed past the volume-down threshold (< -350)
  return getJoystickY() < VOLDOWNTHRESH;
}



void setup() {
  // put your setup code here, to run once:
  pinMode(JOYSTICKXPIN, INPUT);
  pinMode(JOYSTICKYPIN, INPUT);
  pinMode(POWERBUTTON, INPUT_PULLUP);

  pinMode(IRLEDS, OUTPUT);
  pinMode(VOLUPLED, OUTPUT);
  pinMode(VOLDOWNLED, OUTPUT);
  pinMode(CHANUPLED, OUTPUT);
  pinMode(CHANDOWNLED, OUTPUT);
  pinMode(POWERLED, OUTPUT);

  Serial.begin(9600);

}

void loop() {
  // put your main code here, to run repeatedly:
  int power = digitalRead(POWERBUTTON);
  int chanUp = isChanUp();
  int chanDown = isChanDown();
  int volDown = isVolDown();
  int volUp = isVolUp();


  //Serial.println("Y: %d \n X: %d", getJoystickY(), getJoystickX());
  Serial.print("Y: "); Serial.print(getJoystickY()); Serial.print(" X: "); Serial.println(getJoystickX());
  if (lastVolUpState != volUp) {
    lastVolUpDebounce = millis();
  }
  if (lastVolDownState != volDown) {
    lastVolDownDebounce = millis();
  }
  if (lastChanUpState != chanUp) {
    lastChanUpDebounce = millis();
  }
  if (lastChanDownState != chanDown) {
    lastChanDownDebounce = millis();
  }
  if (lastPowerState != power) {
    lastPowerDebounce = millis();
  }




  //check if button state changed and act accordingly
  unsigned long curTime = millis();
  if (curTime - lastVolUpDebounce > DEBOUNCEDELAY) {
    if (volUp != volUpState) {
      volUpState = volUp;
      if (!volUpState) {
        mySender.send(SONY, VOLUP, BITS);
        digitalWrite(VOLUPLED, HIGH);
        volUpLEDLimit = curTime + LEDTIME;
      }
    }
  }
  lastVolUpState = volUp;

  if (curTime - lastVolDownDebounce > DEBOUNCEDELAY) {
    if (volDown != volDownState) {
      volDownState = volDown;
      if (!volDownState) {
        mySender.send(SONY, VOLDOWN, BITS);
        digitalWrite(VOLDOWNLED, HIGH);
        volDownLEDLimit = curTime + LEDTIME;
      }
    }
  }
  lastVolDownState = volDown;

  if (curTime - lastChanUpDebounce > DEBOUNCEDELAY) {
    if (chanUp != chanUpState) {
      chanUpState = chanUp;
      if (!chanUpState) {
        mySender.send(SONY, CHANUP, BITS);
        digitalWrite(CHANUPLED, HIGH);
        chanUpLEDLimit = curTime + LEDTIME;
      }
    }
  }
  lastChanUpState = chanUp;

  if (curTime - lastChanDownDebounce > DEBOUNCEDELAY) {
    if (chanDown != chanDownState) {
      chanDownState = chanDown;
      if (!chanDownState) {
        mySender.send(SONY, CHANDOWN, BITS);
        digitalWrite(CHANDOWNLED, HIGH);
        chanDownLEDLimit = curTime + LEDTIME;
      }
    }
  }
  lastChanDownState = chanDown;

  if (curTime - lastPowerDebounce > DEBOUNCEDELAY) {
    if (power != powerState) {
      powerState = power;
      if (!powerState) {
        mySender.send(SONY, POWER, BITS);
        digitalWrite(POWERLED, HIGH);
        powerLEDLimit = curTime + LEDTIME;
      }
    }
  }
  lastPowerState = power;


  //turn off LED based on timer set
  if (curTime > chanUpLEDLimit) {
    digitalWrite(CHANUPLED, LOW);
  }
  if (curTime > chanDownLEDLimit) {
    digitalWrite(CHANDOWNLED, LOW);
  }
  if (curTime > volUpLEDLimit) {
    digitalWrite(VOLUPLED, LOW);
  }
  if (curTime > volDownLEDLimit) {
    digitalWrite(VOLDOWNLED, LOW);
  }
  if (curTime > powerLEDLimit) {
    digitalWrite(POWERLED, LOW);
  }
}

Solidworks Files

https://drive.google.com/drive/folders/1fAkqj0hCjyJDRPstvaO8WQSeJH48L45u?usp=sharing

Cutting Board by Maples: Final Documentation
https://courses.ideate.cmu.edu/60-223/s2021/work/cutting-board-by-maples-final-documentation/

For our final project, we interviewed someone with a disability and designed a device that could help make their life easier. Our client was Jen, who has limited use of both her arms but still wanted to help out with preparing food. However, without the help of her hands to orient and hold the produce in place, cutting was a very difficult task. As a result, we designed a cutting board that keeps her fruits and vegetables in place as she cuts and can also turn to a different angle to allow for dicing as well.

For more detail on the interview process and what we discussed, click this link!

The Maples Interview Documentation

What We Built

A cutting board that keeps produce in place while cutting and turns to ease the process of dicing fruits and vegetables.

Final Images

Overview of whole device

Device in use (due to the fact that we used a plastic knife, we are using our hand to activate the capacitive touch sensor)

Video of device working with knife tip

Video of how the braking system works with the solenoid

Detail Images

Using the device to cut fruit

The inside mechanism of the device

Pegs holding fruit in place

How the different pieces of the board come together. The black tray is there to catch excess food that fell through the holes.

Narrative Sketch

It's dinner time! Maple got a new cutting board, and for the first time in years, Maple can comfortably help with food prep. In the past it was a struggle to cut just one vegetable, since she can't use her hands to hold it down. It definitely didn't help that vegetables were naturally so round. She put a potato on the cutting board and positioned the pegs around it so that it was nice and secure. Then, with the knife in her mouth, she began to cut. *Slice!* She was able to cut the vegetable in a matter of seconds, when in the past it took her minutes. After she was done, she tapped her knife tip to the capacitive touch button, and the board turned 45 degrees. Today she wanted to dice her potato, so she tapped the button again, making a nice 90 degree angle, and began to cut again. Chop, chop, and then she was done, with enough time to dice another potato. This device made food prep so much easier for Maple, and she's really grateful for it.

Prototypes

Prototype #1

  • This prototype was made to address how all the different components of the device would come together, as well as to test out different peg types, and if the pegboard would actually be able to hold things still.
  • For my first prototype, I made a cardboard model and created a housing for the mechanics, a space for a tray to catch excess, a variety of acrylic pegboards, and a variety of 3D printed pegs.

    Overview photo of prototype

  • Explosion view of prototype

    Trying out the prototype


    Video of the pieces and how the prototype would turn


    The making of the box and the tray

    Trying out different pegboard layouts

    Testing out the pegs

  • From the prototyping process, I learned a couple of things. The first was that our initial pegboard layout was too compact, which would have resulted in really small pegs that would have been a choking hazard. As a result, we switched to wider spacing, but in this design we found the grid layout too constraining, so we opted for a scattered peg layout better suited to different food sizes. The same goes for the elliptical pegs: we started with 3 different peg designs and, through testing, found that the elliptical design was the most adaptable to differently sized objects. Our feedback was pretty positive, but one thing we improved was limiting the number of pieces. We merged the tray and the piece connecting the pegboard to the mechanism, leaving fewer pieces to keep track of. Some feedback we set aside was the concern about cleaning the pegboard; Jen herself didn't really see an issue with it and liked the peg design overall, so we continued with that design. One surprise I encountered during the prototyping process was just how difficult tolerancing is, especially with wood: the wood would expand when it met moisture, affecting how the pegs fit in the holes. Also, the direction you 3D print in matters. We printed our pegs vertically at first, which made them prone to snapping, and we had to reorient them to fix that issue.

Prototype #2

  • This prototype was made to address the rotation of the cutting board. Specifically, we were trying to answer the question of how we would rotate the cutting board in a reliable manner.
  • The mechanical prototype uses metal balls to allow the top portion to rotate independently from the bottom portion. The metal balls are held in place by a circular path constructed from acrylic. When the user or motor turns the top portion, the cutting board on top rotates while the bottom portion stands stationary.

Demonstration of turntable bearing and friction drive prototype

Ball bearing track to hold ball bearings in place

Slot for electronic wiring to be fed through to the outer shell


Initial sketch of rotation mechanism with friction drive

Gluing technique used to center the ball bearing tracks while gluing them in place

Image of turntable bearing by itself without stepper motor and friction drive


Our goal with the mechanical prototype was to determine how we would get the cutting board to rotate and allow the user to cut from many different angles. The scaled bearing design we came up with worked well, allowing us to rotate the outer part of the bearing independently of the inner part. The scaled version rotated with little friction and stood up to the downward pressure of cutting. The friction drive, however, did not work as intended. We had initially thought that the friction between electrical tape and acrylic would be enough to move the inner part of the bearing with little trouble. Surprisingly, we could not get enough friction, instead seeing a lot of slipping between the motor wheel and the inner part of the bearing. Based on this result, we decided to move forward with a gear design to improve the reliability of the force transfer from the stepper motor to the rotation of the cutting board.

When we demonstrated it to Jen for the first time, the rotation concept and implementation were enthusiastically approved. She loved the idea of being able to rotate the vegetable so that she did not have to keep unpinning and re-pinning it. She did, however, request a food bucket to sweep cut vegetables into in order to declutter the cutting board, but this feedback was not implemented in the final product because it was outside the scope of the project.

 

Prototype #3

  • This prototype was built to answer Arduino and electronics questions: What should we use as a brake for the rotation device? Do we need more torque from the motor that rotates the device?
  • This prototype consists of two parts. The first is a stepper motor that rotates a set amount when activated by a touch sensor. The second part is a small solenoid (a component with a small rod that can move back and forth), which is also activated by the touch sensor. When the touch sensor is touched, the stepper motor rotates 45 degrees and the solenoid activates (kind of like releasing a brake), and when the stepper motor finishes, the solenoid releases (kind of like engaging a brake).

Overall view of prototype. Includes touch sensor, small solenoid, and a small stepper motor.

Everything activated by the touch sensor. Solenoid activates and the stepper motor rotates.

Experimenting with multiple touch sensors and attempting to make the motor rotate both ways.


Using a larger solenoid for the larger device.

First version of solenoid wiring that worked. Later had to change the transistor.

Attempting to solder. Later not included because it mysteriously broke almost everything.

  • During the prototyping process, the big questions were what we wanted to use as a brake for the device and how much torque we would need to rotate a lazy Susan and a cutting board. We decided to use a solenoid because its rod provides a concrete way of engaging and releasing a brake. We also knew we wanted a stepper motor to rotate the entire device. Since Jen can't use her arms, we wanted to give her a convenient way to rotate the device, so the stepper motor and the solenoid in the prototype were controlled by a touch sensor. After critique, we knew we had to make everything bigger. We decided to go with a large solenoid, which needs 12V to power it. We also went with a larger stepper motor because the original was far too small and couldn't provide the torque we were looking for. The prototyping process was straightforward, but I had a lot of trouble wiring the larger solenoid. The 12V power supply fried multiple components, and I had to research and retry different wiring combinations. It finally worked after we switched the type of transistor.

Process

The first time we brainstormed the pegboard idea

The making of the final wooden pegboard

Demonstration of technique used for gluing base so that the gears align with each other

Close up of the solenoid braking mechanism used to lock the cutting board in place

Combining the electronics with the rotating device.

Putting the electronics, rotation device, and cutting board together.

One discovery made along the way came in determining the drive we would use to rotate the cutting board. Despite our confidence that friction would be enough to overcome the weight of the cutting board and the internal friction in the bearing, we found it extremely difficult to produce enough friction between the bearing and the motor to rotate the cutting board without slipping.

We also discovered that tolerances are everything when designing a braking mechanism. We initially thought that small holes fitting the diameter of the solenoid rod exactly would be more than enough to stop the rotation reliably, without accounting for the accuracy of the rotation drive itself. Instead, we found it extremely difficult to align the holes properly and ended up increasing the size of the holes so that the solenoid could stop the cutting board more reliably.

Speaking to the build process itself, we fell a little behind schedule because our expectations of build time were too optimistic. We initially planned to have everything done by the weekend before the project was due, but limitations in availability and materials meant that we had to spend a couple of extra days finishing the final product. We were, however, able to integrate everything on time, as we had planned to finish the project two days early in order to account for hiccups along the way.

 

Conclusions and Lessons Learned

Working remotely on this project was the most challenging part because of the limited manufacturing and client interaction opportunities. In terms of manufacturing, COVID-19 restrictions on building open times and capacity made it difficult to schedule meetings to build and debug the project together. It forced us to be a lot smarter with our time, designing and debugging as best we could outside of the lab open hours in order to make the most efficient use of time and materials in the lab. In addition to the limitations on our work schedule, the inability to meet Jen in person made it much more difficult to assess the extent of her capabilities and design specifically to her needs. While we were able to get video demonstrations of her capabilities, those were not nearly as clear as viewing her actions first hand and understanding them from a first-person point of view. Looking back, I think that more communication between us and Jen in the form of meetings, demonstrations, and emails would have given us more insight into things like movement limitations and space limitations, so that we could design a gadget that better fit her workflow and space constraints.

The experience of designing for someone with a disability forced us to think more critically about the use of the product. The initial problem-definition phase in particular was a huge moment of learning, as it forced us to think outside of our own experiences. Despite the challenges of doing this type of ideation remotely, we were surprised by the amount of personalization we could incorporate into the functionality of the device. When we demonstrated our initial prototype to Jen, she was surprised by our solution, as we had identified a problem she hadn't even thought about in a cutting board that rotates. Being able to identify problems that weren't on the radar of the client was a nice surprise.

Overall, we are extremely proud of the functionality of our end product. The stopping and turning mechanisms worked wonderfully and the peg board held vegetables much better than anticipated. Despite this success, future iterations on this design would include food safe and water safe components to increase longevity of the product. Cosmetic changes to the outside would also help improve the overall appeal of the device and would be something to consider more in the future.

Technical Details

Schematic and Block Diagram

Code

/*
   60-223, Jen's Cutting Board
   Nicole Yu, James Kyle, Shuyu Zhang

   Description:
   This code reads a touch sensor (input) and drives a stepper motor (output) and a
   solenoid (output). When the touch sensor reads HIGH, the stepper motor rotates a
   set number of steps and the solenoid is driven HIGH. The solenoid is driven back
   to LOW right before the stepper motor finishes its steps. There's a delay right
   after the motor finishes rotating so the touch sensor can't be continuously
   activated.
   
   Pin mapping:

   Arduino pin | type   | description
   ------------|--------|-------------
  12             input   Touch sensor signal
  7              output  Stepper motor enable pin
  9              output  Stepper motor step pin
  5              output  Stepper motor dir pin
  10             output  Solenoid pin

   Sources:
   1) Borrowed heavily from this Stepper motor library
   and code: https://www.arduino.cc/en/Tutorial/LibraryExamples/StepperSpeedControl
   2) Heavily referenced this site for the solenoid driver:
   https://circuitdigest.com/electronic-circuits/solenoid-driver-circuit-diagram
*/

const int TOUCH = 12;
const int ENABLE = 7;
const int XSTEP = 9;
const int XDIR = 5;
const int BREAK = 10;

int rev = 145; //Number of steps stepper motor takes

void stepperFWD() {
  digitalWrite(XDIR, HIGH);
}

void setup() {
  pinMode(TOUCH, INPUT);
  pinMode(XDIR, OUTPUT);
  pinMode(XSTEP, OUTPUT);
  pinMode(ENABLE, OUTPUT);
  pinMode(BREAK, OUTPUT);
  digitalWrite(ENABLE, HIGH);
}


void loop() {


  if (digitalRead(TOUCH) == HIGH){
    digitalWrite(ENABLE, LOW);
    digitalWrite(BREAK, HIGH);
    delay(100);
    stepperFWD(); //Set direction of motor
    for(int i = 0; i < rev; i++){
      
      if(i > 100){
        digitalWrite(BREAK, LOW); //After 100 steps, release solenoid
      }
      motorStep();
      delay(10);
    }
    digitalWrite(ENABLE, HIGH);
    delay(2000); //After finishing steps, delay everything 2 seconds
  
   }else{
    digitalWrite(ENABLE, HIGH);
    digitalWrite(BREAK, LOW);
   }

}

void motorStep(){
  digitalWrite(XSTEP, HIGH);
  delay(5);
  digitalWrite(XSTEP, LOW);
}

 

Acrylic Pour Painting Platform by Pines: final documentation
https://courses.ideate.cmu.edu/60-223/s2021/work/acrylic-pour-painting-platform-by-pines-final-documentation/

Title and introduction

For this project, small groups were connected with clients with physical disabilities to collaboratively create an assistive device that aids the client. Our client, Brenda, enjoys acrylic pour painting but has cerebral palsy that necessitates a lot of assistance to create the paintings. Our team (Daniela, Erica, and Tate) created a device that allows her to create these paintings by herself with ease. For more information about our initial meeting with Brenda, refer to: Initial client interview documentation.

What we built

Description

Our final device is a moving platform that lets Brenda move and tilt her canvases in space and manipulate them as she desires. Two motors raise and lower points of the canvas platform to create any desired combination of tilts. A joystick controls the movement and tilt of the canvas, and a washable two-layer plate-and-shroud setup allows her to clean the paint from the device.

Images

An overview image of the device with a painting we made using it.

 

Removable pegs make it easy to adjust for differing canvas sizes.

A metal universal joint allows the bed of the device to tilt at drastic angles to create desired effects with the paint.

3D printed arm extensions allow us to connect the servo arm movement to the top plate.

The joystick control box is fully detachable from the base, and the elongated joystick handle makes it very easy to manipulate with little force.

Narrative sketch

When Brenda wishes to make a painting, she first connects the wires of the joystick module to the platform, then plugs the power cord into the joystick module. After the platform levels itself to flat, she moves the pegs on the top surface to fit her desired canvas size. She then loads the canvas onto the pegs. After this is completed, she can begin painting. She manipulates the canvas using the joystick and adds paint as needed. Once she is satisfied with the painting, she removes it from the pegs to dry. Brenda then unplugs the device from power and disconnects the joystick module from the platform. She can either let the paint dry on the platform surface or remove the plate, shroud, and pegs and wash them in a sink. After these dry, she adds them back to the platform module. The device is now ready to be used again.

How we got here

Because our project relies on the interactive functionality of multiple sub-systems, we created three different prototyping areas that all integrate together for a working model of the whole system.

Prototype 1 (Tate)

This prototype was built to answer questions about how the canvas would attach to the device and whether we needed to implement a third degree of freedom for rotation of the device. The prototype was made with laser-cut cardboard and wooden dowels. The holes in the cardboard interface with the 3D printed pieces that connect to the center joint, so that the plate could be screwed directly onto the pieces. A slip ring was also tested at the base of the center joint connector, but after considering the complexity it adds to the system and Brenda's existing solution of a lazy Susan, we decided not to move forward with it.

Pegboard prototype

Slip ring and baseplate testing

Because of the integrated nature of our three prototypes, other images and video of this device in use are shown in the later prototyping sections.

The prototype answered many questions for us and posed some new ones. First, it confirmed that a laser-cut plate with an array of peg holes for variable canvas sizes would work. It also met the sizing requirements that we needed and didn't contact the surface when the pitch was at its highest angle. We discovered that the weight of the plate could be an issue, especially when it needs to be moved quickly by the servos; this was one of the motivating factors that led us to purchase higher-torque servos for the final device. Integrating the prototype into the larger system also let us check how it interacted with all of the other components and their movement.

We had previously not considered the negative effects of paint dripping onto the internal components. This prototype made the issue visible, and in response we designed the two-plate system used for the final device, which creates both a stopping surface for the pegs and a barrier that prevents paint from dripping below. We also explored a slip ring and a belt to allow for controlled rotation of the platform, but after discussing more with Brenda and doing loose fittings of the system, we found that it adds significant complexity without being much better than what Brenda currently uses, a lazy Susan. We received generally positive feedback for this prototype, so we moved forward with most of our original design for the final device.

Prototype 2 (Erica)

The questions that this prototype aimed to answer: How can we make a two-degree-of-freedom platform with two servo motors? Will the combination of the joystick and servo motors move the platform in a way that could be used for pour painting?

I used popsicle sticks with brass fasteners, two servo motors, and a joystick module to construct arms that would move up and down to tilt the platform. The joystick module would let the user tilt the platform to a specific angle so that the paint would drip to different parts of the canvas.

I used the servo motors and popsicle sticks with brass fasteners to create arms for the platform. Here, Dani’s prototype is also pictured to compare the heights.

I connected the servos to the Arduino. At this point, we connected all the prototypes and tested code to see if the servos would move the platform the way we wanted it to move.

Here, the servo motors and joystick module are all connected to the Arduino. We tested code to ensure that the joystick could be used to tilt the platform.

Prototype development/feedback process

This video shows the initial run of the code after the servos, platform, and U-joint (but not the joystick module) were all combined. The movement was initially very shaky, but the servos were able to tilt the platform in two degrees of freedom as we intended.

Here is a video of the 2DOF motion platform that we were heavily inspired by for this project (https://www.youtube.com/watch?v=mVDGjfTJ4C8&feature=emb_title). One of the key questions I answered was whether this 2DOF motion platform model worked.

One noticeable aspect of our prototype that needed to be addressed was the connection of the servos to the bottom platform. Since the servos would move quite a lot, I had to use my hand to hold down the servo motor or else it would lift up quite easily.

Through this prototype, we confirmed that the 2DOF model we found worked well and that overall we should continue using the servo motors and joystick as planned. To start, we found a 2DOF motion model online as inspiration for a simple but effective solution for our needs. We wanted a platform that could tilt at any angle with decent control so that Brenda could easily change how the canvas was tilted to drip the paint around. We found many solutions, some of which even used six servo motors, so we were unsure whether such a simplified model would work, but through testing with the popsicle stick arms, we found that this two-servo model was sufficient for our purposes. Furthermore, we found that tilting the platform with the joystick module was quite intuitive, since the platform simply mirrors the angle of the joystick. Thus, we decided to continue using the Arduino parts we had initially planned.

Furthermore, by testing the code with the flimsy parts we had, we concluded (and Brenda also noted) that the movement of the platform was very twitchy, which would not be ideal for her to use. We attempted to smooth the readings in code and filter out noise from the joystick module, but the movement still ended up more erratic than intended. We decided that at this point the twitchiness was probably caused mostly by the instability of the prototyped parts (i.e., the servo motors had to be manually held down and the popsicle sticks would wobble), so we kept the current version of the code and only revisited it after rebuilding the project with the more stable parts.
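For reference, here is a minimal sketch of one way that smoothing could be done on the Arduino side: an exponential moving average plus a small deadband around the joystick’s center. This is illustrative only and is not the code we shipped; the pin assignments mirror the final code in the Technical details section, but the SMOOTHING and DEADBAND constants are assumptions that would need tuning on the real hardware, and only the x axis is shown.

#include <Servo.h>

Servo xServo;                 // servo for the x tilt axis (the y axis works the same way)
const int joyXPin = A1;       // joystick x output, same pin as the final code
const float SMOOTHING = 0.2;  // 0..1: lower = smoother but slower response (assumed value)
const int DEADBAND = 20;      // ignore wiggle this close to the joystick center (assumed value)

float filteredX = 512;        // start at the joystick's resting reading

void setup() {
  xServo.attach(9);
  xServo.write(90);           // start with the bed level
}

void loop() {
  int raw = analogRead(joyXPin);

  // blend each new reading into a running average to suppress jitter
  filteredX = SMOOTHING * raw + (1.0 - SMOOTHING) * filteredX;

  // snap small deviations back to center so the bed sits still at rest
  int value = (int) filteredX;
  if (abs(value - 512) < DEADBAND) value = 512;

  // same mapping idea as the final code: joystick range to servo degrees
  xServo.write(map(value, 0, 1023, 0, 180));
  delay(20);
}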

Prototype 3 (Daniela)

The question that this prototype aimed to answer – What are the limitations of the motion of the canvas, and what range of angles are necessary?

My prototype consists of a universal joint (U-joint) made from popsicle sticks. This type of joint allows the canvas bed to move in two axes, which is what we were aiming for.

Here, the joint is embedded into the product and is in the neutral position.

Here, the U-joint is rotating about the bottom pivot. The angle of rotation is very drastic.

The U-joint is pivoting about the upper joint now along the secondary axis.

The new 3D printed U-joint to increase stability.

A close up of the 3D printed U-joint to show that both axes pivot about the same point which is different from the popsicle stick version. This was more robust than the popsicle sticks, however its range of motion was more limited.

The second 3D printed joint (created after the initial prototype) did not survive long due to too much applied force. This joint was created in an effort to increase the range of angles.

The final joint we ended up going with (after the initial prototype): the real U-joint from McMaster-Carr, which provided much-needed support while still achieving the desired angles.

This prototype answered many questions, both expected and unexpected. To begin with, we found that the U-joint made from popsicle sticks was not accurate enough because the two pivot points were too far away from each other. This led us to 3D print our own version of a U-joint that we found on McMaster-Carr. This new joint was much more robust and introduced less error than the popsicle stick version, which was expected because the popsicle stick version was never meant to be a long-term solution. However, even though the new joint was more stable, it did not have as much range as the popsicle stick joint, which was a downside. Our next steps after the initial prototype included modifying the CAD model of our U-joint to allow it to reach larger angles of rotation to fit our needs. The modification achieved the more drastic angles, but was still not robust enough. For our final design we chose to order the actual part from McMaster-Carr, which worked very well in terms of both structural stability and range of motion.

An unexpected discovery from the initial prototype was that the joint should be placed much closer to the bottom of the canvas bed to better mimic the actual motion of a human tilting the canvas. In our second iteration we made the top support shaft shorter to account for this change. The feedback we received from our client was generally very positive, and the few concerns she had did not have to do with the joint.

Process

First sketches of the concept.

Most of the ideas from the initial sketches were used in the final project. We used a pegboard to mount the canvas on the platform, used two servos and a joystick module to control the tilt, and used a U-joint at the center. The most notable divergence from the original sketches is that we removed a degree of motion (being able to mechanically rotate the whole platform), since we already had enough complexity in our project and Brenda noted that she already had a lazy Susan she could use to manually turn the whole platform.

Our third universal joint stress break

Laser-cut hole alignment fitment

Ball joint arm cracking

6 pin port alignment and soldering

Testing the more robust metal universal joint. The servos are meant to be horizontal to the table in this image, but we were accidentally not providing enough power to the motors, so they behaved erratically until we fixed the power issue.

We followed our original schedule as outlined on the Gantt chart relatively closely. Since we had many parts to our project, we knew that we needed to order parts early in order to bring everything together and test. We managed to keep to our original goals since we started off each class or meeting by outlining things that needed to be completed, and we communicated with each other well. We did encounter some setbacks with our 3D printed parts cracking; however, we were able to recover relatively quickly by reprinting the part, adding more support, or purchasing a more mechanically robust replacement.

Conclusions and lessons learned

Final critique

We had one comment that expressed “Slightly worried about the stress on the servos- could break the tip off easily.” Similarly, another person asked, “How strong are the linkages on the servos? I’m not sure how much of a problem the strength of them would be when working with larger paintings (12”x12”).” Our initial connection to the servo motors (the ball joint) did break, so we had already strengthened the joint by adding more connections. Although the connection looks like it is mostly 3D printed, the 3D printed part mostly serves to extend the standard servo arm, while the actual connection is strengthened by tight mechanical connections with the screws and mechanical ball joints. We understand the concern; if we could rebuild the project, we would customize the design so it doesn’t stick out as much, but the connection should be strong enough to hold a substantial amount of weight.

There were also concerns about the paint getting onto our device. Specifically, someone noted “Is paint dripping off the platform an onto the base/servos a concern?  Even with a plastic cover.  Maybe a rim around the platform to catch drips/spills?” If we had more time, building a base or rim to catch the paint would probably be very helpful for Brenda, since she currently uses puppy pads to protect her work surfaces.

Finally, we had some comments about the movement of the platform. “The rotation is a little fast, and it would only be an issue if it changed how the paintings looked. It might be a desirable effect, but the flexibility of rotation speed could be nice.” We focused on smoothing the motion in code rather than adding speed variability. Given the chance, though, it would be interesting to keep testing and add adjustable tilt speed to see how it affects the paintings.
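As a rough illustration of what that speed flexibility might look like, the sketch below steps the servo toward the mapped joystick angle by at most a fixed number of degrees per loop instead of writing the target angle directly. This is only a sketch under assumptions and was not part of the final device; one axis is shown, and maxStepPerLoop is a placeholder that would be tuned (or tied to a knob) in practice.

#include <Servo.h>

Servo xServo;
const int joyXPin = A1;       // joystick x output, same pin as the final code
int currentAngle = 90;        // where the servo actually is right now
int maxStepPerLoop = 2;       // degrees per 20 ms loop (~100 deg/s cap); placeholder value

void setup() {
  xServo.attach(9);
  xServo.write(currentAngle);
}

void loop() {
  // where the joystick is asking the bed to go
  int target = map(analogRead(joyXPin), 0, 1023, 0, 180);

  // move toward the target by at most maxStepPerLoop degrees each cycle
  if (target > currentAngle) {
    currentAngle = min(currentAngle + maxStepPerLoop, target);
  } else if (target < currentAngle) {
    currentAngle = max(currentAngle - maxStepPerLoop, target);
  }

  xServo.write(currentAngle);
  delay(20);
}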

Overall, the feedback was relatively positive and we were happy to see that people enjoyed the final result.

Working remotely

Working remotely with our client, Brenda, was a bit of a struggle because she was harder to reach at times; even so, it was a pleasant experience overall and we were very happy to work with her. Since Brenda was unfortunately sick when we presented our first prototype, we could not get feedback from her right away, but we still had to order parts or we would not have finished the project on time. We were concerned that Brenda might not approve of the prototype, but the time constraints left us no choice but to keep moving. Luckily, when we met with her about a week after the presentation, Brenda really loved the prototype, had minimal critiques, and had no comments that would drastically change any of the mechanical components.

Overall, we really enjoyed working with Brenda because she was very enthusiastic about the project from the start to the end. She gave very concise and helpful feedback that made it easy for us to understand her goals for the project and adjust accordingly. Also, seeing her reaction to the final project we made was very rewarding and we were very grateful to have the opportunity to make something for her.

Concluding thoughts

Overall, we really enjoyed working on this project. The prompt we received was a little more creative than most, which made testing the final product very exciting. Here are some of the pour paintings we were able to make while testing our device.

Since our backgrounds were quite varied (our majors were Industrial Design, Mechanical Engineering, and Information Systems), we were able to play to our strengths and combine our skills effectively throughout this project. The individual parts all took a while to construct, and sometimes we had to make parts twice to ensure they were durable, but in the end they all came together nicely into a very strong final product.

Technical details

Block diagram and schematic

Code

/* Acrylic Pour Painting Servo and Joystick Code
   By: Daniela, Tate, Erica 
   Last Updated: 05/13/2021

   Description:
   This code was developed
   for a school project to allow
   our client with cerebral palsy 
   to make acrylic pour paintings 
   without assistance from another person.

   Electrical Components:
   1 arduino uno
   2 high torque servo motors
   1 joystick

   How it works:
   The joystick x and y directions
   determine the motion of one servo each.
   

   Pin mapping:
   Arduino pin | type   | description
   ------------|--------|-------------
   9            OUTPUT   high torque servo (x direction)
   10           OUTPUT   high torque servo (y direction)
   A1           INPUT    x signal from joystick
   A2           INPUT    y signal from joystick
   2            INPUT    button press  from joystick (not used in project)

  Some of the Joystick code originally from:
  https://create.arduino.cc/projecthub/MisterBotBreak/how-to-use-a-joystick-with-serial-monitor-1f04f0
*/

#include <Servo.h>

// SERVO VARIABLES //
Servo xServo;  // servo controlled by x motion on joystick
Servo yServo;  // servo controlled by y motion on joystick

int xServoPin = 9;
int yServoPin = 10;

int maxServoAngle = 180; // ranges of servo motion
int minServoAngle = 0;

int minAngle = 5; // prevents the servos from physically going too far down


// JOYSTICK VARIABLES //
const int joyXPin = A1; // Analog input pin for X movement
const int joyYPin = A2; // Analog input pin for Y movement

// Variables to keep track of the current and previous positions
int joyXPos = 0;
int joyYPos = 0;
int prevX;
int prevY;

int joyMax = 1023; // maximum value from the joystick

int noiseLimit = 1; // denoising (increase value to increase steadiness when bed is flat)


void setup() {
  // start serial communications
  Serial.begin(9600);

  // Setup the joystick inputs
  pinMode(joyXPin, INPUT);
  pinMode(joyYPin, INPUT);

  // attach servo motors and set them to initial angles
  xServo.attach(xServoPin);
  yServo.attach(yServoPin);
  xServo.write(90);
  yServo.write(90);
}

void loop() {
  // Get the current joystick states
  joyXPos = analogRead(joyXPin);
  joyYPos = analogRead(joyYPin);

  // Send the data over serial for debugging
  Serial.print("X: ");
  Serial.print(joyXPos);
  Serial.print(" Y: ");
  Serial.println(joyYPos);  // println ends the debug line each loop

  // map the joystick input to servo output in degrees
  int xpos = map(joyXPos, 0, joyMax, minServoAngle, maxServoAngle);
  int ypos = map(joyYPos, 0, joyMax, maxServoAngle, minServoAngle);

  // move the servos to the position determined by the joystick
  // only move if the position change is greater than the noise limit
  if (abs(prevX - joyXPos) > noiseLimit && xpos > minAngle) {
    xServo.write(xpos);
  }
  if (abs(prevY - joyYPos) > noiseLimit && ypos > minAngle) {
    yServo.write(ypos);
  }

  // Store the previous joystick position
  prevX = joyXPos;
  prevY = joyYPos;

  // Delay to not send messages too fast and confuse the servos
  delay(50);
}
You’ve Got Mail! by the Oaks: Final Documentation https://courses.ideate.cmu.edu/60-223/s2021/work/youve-got-mail-by-the-oaks-final-documentation/ Fri, 14 May 2021 00:19:39 +0000 https://courses.ideate.cmu.edu/60-223/s2021/work/?p=13552 INTRODUCTION

As a team, we (Carlos Ortega and Amelia Lopez) worked with Annie Verchick  to develop a gadget suited for her unique needs. Checking mail was always a challenge for Annie, and at one point she had six weeks worth of mail piled up. Included in that mail was time-sensitive information and money which she missed due to not checking the mail on time. Clearly, this was a problem for her that desperately needed a solution.

We ended up designing a simple, elegant device so that Annie could see at a glance whether she had received mail in her mailbox without having to go outside. From inside her house, she can check the lights on a wall-mounted box or read an LCD display on the same box, which shows either “you’ve got mail” or “no mail.” Further on in the documentation we discuss the specifics of the design. For previous designs/problems we considered solving for Annie, please see our interview documentation linked here, where we explain why we settled on solving this problem for Annie and what our initial sketches looked like.

WHAT WE BUILT

Our device, “You’ve Got Mail”, checks if there is mail in our client’s mailbox. It does this through three pairs of break-beam sensors placed strategically inside the mailbox Annie has mounted to her outside wall. These sensors tell us if there is something inside the mailbox. Inside her house we’ve built another box which has two lights (one green and one red) and a screen display. The screen shows “you’ve got mail” when there is mail and “no mail” when there is none; the light turns green when there’s mail and stays red otherwise. The box inside is powered through a wall outlet so the display and lights are always on.

Images of the final device. The upper left is the mini-box that would go inside the house; the remaining images are a demo of her mailbox with the sensors inside. The lower left shows what’s inside the mailbox (3 sets of break-beam sensors positioned across from each other: one top-to-bottom and two left-to-right). The lower right shows an isometric view of the box with an engraving, and the upper right picture shows a front view of the mailbox with an envelope engraving and a slit for the mail to go through.

Video of the final product:

Narrative Sketch

It’s a cloudy, but bright-feeling day in the Sierra. The snow covers the seldom-used front porch and the wall-mounted mailbox next to the wooden door, which has also been touched by the snow. Meanwhile, Annie and her dog grab the leash and take their time out the side door for an exciting morning in the snow. 

Just as they leave, the mailman has been on the opposite side of the house, stuffing the mailbox with the long-awaited family letters and packages. At last, when Annie and her dog return through the old side door, the green light on Annie’s front wall catches her eye. She reads on the newly-installed screen the words “Mail status: You’ve got mail!”

As she eagerly opens the door to open her letters, she sees the mailman on his way back, and they exchange a sweet good morning.

HOW WE GOT HERE (PROTOTYPES AND PROCESS)

Prototype 1: Carlos Ortega

How can we activate the mail checker on command, and from any distance? This prototype uses a wireless remote and an infrared receiver; the mission was to learn how to program them.

How it works: pressing any button on the remote turns (both) lights on. They automatically turn off after 5 seconds. The user can turn the lights off sooner than 5 seconds with the power button on the remote. The final product would decide which single light to turn on (instead of both), but that would come later.

In action: the remote turns the lights on and off.

Circuit reports back at every remote control click.

Close-up of the circuit.

Prototype 1 in action

(GIF of prototype 1 in action. WordPress may fail to display…)

Intermediate steps
Left: programming the remote and remote receiver. Right: turning on lights

The lights wouldn’t stop flashing!

The code reports back which part of the program is currently executing.

The most significant feedback we received: don’t use the remote. The remote came out of our assumption that the mailbox was out in Annie’s yard, far away from her porch, and down a flight of stairs. We were surprised to find out that the mailbox was actually mounted on the wall. Instead, we decided to also mount our gadget on the same wall, on the interior side. The  remote, then, would have been overkill, so we decided to scrap that feature.

Based on some feedback, we were also going to add an arcade-style push button for the dog to check the mail with his paw. It was just for fun, but we scrapped that extra feature too before we even started implementing it.

Even so, prototype 1 really did answer the question of how a remote control and a receiver work. The remote sends pulses of infrared light of different durations; by combining pulses of different lengths, it encodes information, and each button press sends out a different pattern. The infrared receiver detects the pulses and passes them to the Arduino, which can then decode the information they carry.

We had to borrow some pre-existing code (called a library) in order to read these patterns from the remote. It was a surprise to see that the example code on our class website used a previous version of the library, and was therefore outdated. I had to look at the documentation for the newer version.
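For reference, reading the remote with the newer library looks roughly like the sketch below (assuming the IRremote 3.x-style API with the global IrReceiver object). The receive pin here matches the RECV_PIN declared in the final code further down, though the finished device ended up not using the remote at all; the printed command codes are simply whatever the particular remote sends, which is exactly what this prototype was used to discover.

#include <IRremote.hpp>   // 3.x-style API; older versions use IRrecv/decode_results instead

const int RECV_PIN = 11;  // IR receiver signal pin

void setup() {
  Serial.begin(9600);
  IrReceiver.begin(RECV_PIN, ENABLE_LED_FEEDBACK);  // start listening for IR pulses
}

void loop() {
  if (IrReceiver.decode()) {                        // a complete pulse pattern arrived
    Serial.print("Button code: 0x");
    Serial.println(IrReceiver.decodedIRData.command, HEX);
    IrReceiver.resume();                            // get ready for the next button press
  }
}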

Process

Using a screen for the first time!
Left: not enough cables! Right: plugged straight into a breadboard.

Left: LCD screen working. Right: LCD screen not working.

Mysteriously, the only cause was uploading the code to the Arduino again! With no changes whatsoever to the code! The solution was just turning the Arduino off and back on again.

Designing electrical box in CAD

(Process continued in Amelia’s process section)

Prototype 2: Amelia Lopez

This prototype was designed to help answer the design question: can we use break-beam sensors to ensure mail is accurately detected in the mailbox every time?

To answer this, I designed two boxes: one served as a stand-in for Annie’s mailbox where I could put the break-beam sensors, and the second represented the indoor box with the LCD display and two LEDs. We were confident the break-beam sensor was an ideal device for this project because of how it works: an emitter sends out a beam of IR light, and a receiver across from it picks the light up. When something interrupts the light, the beam is broken and we know that there is some object in the way, hopefully mail.
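The sensing side of this is just a digital read. As a stripped-down illustration (the full sketch is in the Technical Details section below), a single break beam wired to the Arduino’s internal pull-up reads LOW whenever the beam is blocked:

const int BEAM_PIN = 13;   // receiver signal line for one break-beam pair

void setup() {
  Serial.begin(9600);
  pinMode(BEAM_PIN, INPUT_PULLUP);  // receiver output is open-collector, so use the internal pull-up
}

void loop() {
  // the receiver pulls the line LOW when something blocks the beam
  if (digitalRead(BEAM_PIN) == LOW) {
    Serial.println("Beam broken: something is in the mailbox");
  } else {
    Serial.println("Beam intact: no mail");
  }
  delay(250);
}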

This is my mailbox prototype. I designed the box using AutoCAD and then laser cut it out of cardboard in order to get the design right. The left image shows me testing the output of the sensor by putting in a sheet of paper. The upper right image shows one set of break beams I positioned in the box (top-to-bottom). The bottom right shows a front view of the box.

This is my mini-box prototype. The left image shows the box sealed, the center image shows a top view of the box with the LCD screen and LEDs. The right image shows a view inside the box with the Arduino and wires attached.

There are also earlier iterations of my prototype which accomplish the same idea but look more “rough.”

This picture shows an early version of the prototype where I was testing the break beam. The upper 3 images show the LED/LCD status when the break beam is unbroken (no mail). The bottom 3 images show the LED/LCD status when the break beam is broken (mail).

 

In my prototyping process, I found that break beams are a great sensor for this particular problem, but I was surprised to discover that I needed more than one to ensure the device accurately detected every piece of mail. If I positioned the mail too far right or too far left, the break beam running top-to-bottom didn’t catch it. To solve this, I added two more sets of break beams (for a total of three), both running left-to-right but positioned at the far left and far right.

Another part of the prototyping process was designing the mini-box that would go inside Annie’s house. We were looking to simplify and clean up the design, which meant using calipers to measure the size of the LCD display and the diameter of the LEDs. To mount the LCD, we originally planned to use screws at the corners, but found that they took away from the cleanness of the design, so we settled on taping the bottom of the display in place instead so that the front view would only consist of the LEDs and the LCD.

In terms of feedback from our client, she was very happy with the design and pushed us to integrate a wall-power adapter instead of batteries to power the device. She has many outlets in her home, and this was a simpler solution for her. This ultimately made the design of the mini-box smaller because we didn’t have to worry about storing a battery, and it led us to add a square hole to the final design of the box for the adapter to fit through.

Process (continued)

Project management for final build

We stuck to the plan surprisingly well. Making a Gantt chart is immensely useful for project management, as it gives you a super concise way of gauging where you are relative to everything you’ve done and still have to do.

CONCLUSIONS AND LESSONS LEARNED

From the final critique, we took away several key findings from the feedback process. One was that our simple approach to the mini-box was very effective: one reviewer stated “The concept is simple and easy to understand. It also seems to work consistently well.”, which we felt was fair. Another positive reviewer stated “the red/green was very simple but effective, really liked the final product!”, which we enjoyed seeing. Overall, it seemed that the clean design fulfilled its purpose and wasn’t overly complicated.

In terms of what could’ve made the design more robust, reviewers stated “maybe send a notification to Annie’s phone when she gets mail”, “how easy would it be to read the red / green notification LED from a  distance? Maybe an audible alert could help with that” and “it looks like the feedback display is connected to the device by wires so maybe making it connect over Bluetooth or Wifi might improve the flexibility of the device”, which are all points we would consider in a future iteration of the device. Sending a notification to Annie’s phone would certainly help adapt the device and would probably eliminate the need for the mini-box inside her home, since she could just see the status of her mail on her phone.

A favorite piece of feedback from the critique session (it was given verbally, so it’s hard to quote but this was the idea):

It looks too much like a security system. It could be much more user-friendly; less austere.

I completely agree. We could solve this by making the electronics box rounder, maybe even out of some other material, in another color. The idea would be to make the gadget more huggable.

Working remotely is really difficult, especially working completely off campus, and even more so in different time zones! It was difficult to work without the amazing resources CMU makes available. It makes me so much more grateful for being at CMU!

Having one WordPress editor made it really hard to write the documentation. After working on two separate builds, we had each developed our own unique take on the project, with our own process. Sometimes it was difficult to progress as a team rather than as two individuals, especially while making the documentation, and WordPress’s limitations exacerbated that.

In terms of lessons learned, I believe we were both pleasantly surprised by how such a small and simple solution could impact Annie’s daily routine. Not having a disability makes you unaware of all that you take for granted, and simply walking up to your mailbox every day is a convenience not everyone is afforded with ease. When we were showing our design to other clients of the class, they related to this hardship and stated that our solution was simple yet effective. Annie described her life as being incredibly DIY; she constantly has to find her own solutions to challenges that professional therapists can’t solve. This is why it was so important for Annie to work with a group of very creative young students who are able to solve these problems. We are privileged to have been able to design a solution to help Annie with this task and hope it improves her life.

For next time, we could use a smaller Arduino, like the Nano, for example. For such a simple task, the Arduino Uno feels like overkill. However, the box is already a good size for user interface purposes, so the box itself might have to stay about the same size in order to keep it “huggable.”

Last important lesson: keep it simple.

TECHNICAL DETAILS

Mailbox and gadget concept art

Schematic and Block diagram

Code

/*

  60-223, Final Project
  Amelia Lopez (aelopez)
  Updated by Carlos Ortega on May 2

  Description:
     Reads from 2 breakbeams;
     if at least 1 of them is broken, turns on a green LED
     and updates an LCD screen.
     Turns on the red LED otherwise.

  Challenges:
     Incorporating the IR receiver code would've been challenging,
     but fortunately we didn't use that code at all.
     Deciding how many break beams to incorporate

  Next time:
     Use enumeration type for lcdStatus instead of strings.
     Another version of the code uses 3 breakbeams instead of 2.

  Pin mapping:

   Arduino pin | type   | description
   ------------|--------|-------------
   13            input     BREAK BEAM SENSOR 1
   12            input     BREAK BEAM SENSOR 2
   ~11           input     INFRARED REC. PIN 
   ~10           output    GREEN LED 
   ~9            output    RED LED 
   7             output    LCD RS
   6             output    LCD E
   ~5            output    LCD D4
   4             output    LCD D5
   ~3            output    LCD D6
   2             output    LCD D7

   (digital PWM~)


*/
#include <LiquidCrystal.h> 


//DECLARING ARDUINO PINS
const int SENSOR_PIN1 = 13;
const int SENSOR_PIN2 = 12;
const int GREEN_PIN = 10;
const int RECV_PIN = 11;
const int RED_PIN = 9;
const int LCD_RS = 7;
const int LCD_E = 6;
const int LCD_D4 = 5;
const int LCD_D5 = 4;
const int LCD_D6 = 3;
const int LCD_D7 = 2;

//VARIABLE FOR BREAK BEAM
int sensorState1 = 0;
int sensorState2 = 0;
String lcdStatus = "waiting"; 

//DECLARING INTS FOR LCD WAITING TIMES
const int WAIT_TIME = 300;
unsigned long timeVariable = 0;

// initialize the library by associating any needed LCD interface pin
// with the arduino pin number it is connected to
LiquidCrystal lcd(LCD_RS, LCD_E, LCD_D4, LCD_D5, LCD_D6, LCD_D7);



void setup() {
  //output pins
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);

  //input pins
  pinMode(SENSOR_PIN1, INPUT_PULLUP);
  pinMode(SENSOR_PIN2, INPUT_PULLUP);
  digitalWrite(SENSOR_PIN1, HIGH); // turn on the pullup
  digitalWrite(SENSOR_PIN2, HIGH); // turn on the pullup

  // set up the LCD's number of columns and rows:
  lcd.begin(16, 2);

  //print constant letters on screen
  lcd.setCursor(2, 0);
  lcd.print("Mail status: ");

  Serial.begin(9600);
}

void loop() {
  sensorState1 = digitalRead(SENSOR_PIN1);
  sensorState2 = digitalRead(SENSOR_PIN2);

  // turn green LED on if beam is broken (there's mail):
  if (sensorState1 == LOW || sensorState2 == LOW) {
    Serial.println("Broken");
    digitalWrite(GREEN_PIN, HIGH);
    digitalWrite(RED_PIN, LOW);
    lcdStatus = "You've got mail!";
    Serial.println("sensorState1: ");
    Serial.println(sensorState1);
    Serial.println("sensorState2: ");
    Serial.println(sensorState2);
  }
  //turn red LED on (there's no mail)
  else {
    Serial.println("Unbroken");
    digitalWrite(RED_PIN, HIGH);
    digitalWrite(GREEN_PIN, LOW);
    lcdStatus = "    No mail     ";
  }

  //UPDATING VALUES ON LCD
  if (millis() - timeVariable > WAIT_TIME) {
    //updating input sensor value
    lcd.setCursor(0, 1);
    lcd.print(lcdStatus);

    timeVariable = millis();
  }
}

fast forward symbol from: 410160-middle.png

Button Activated Blinds Operator by Team Laurels: Final Documentation https://courses.ideate.cmu.edu/60-223/s2021/work/button-activated-blinds-operator-by-team-laurels-final-documentation/ Wed, 12 May 2021 16:54:40 +0000 https://courses.ideate.cmu.edu/60-223/s2021/work/?p=13392 This particular assistive device was developed as the final project for physical computing, where three students worked closely with our client, Elaine Houston, to create something that would improve her ability to complete daily tasks. Starting off this process with an interview, we were able to get a sense of what types of devices would suit Elaine’s needs best. More information on this interview process can be found in our interview documentation page. After narrowing down our list of devices, we developed a prototype that we then presented to Elaine for initial feedback. With this feedback, as well as continual input from Elaine throughout the process, we arrived at this final button-activated blinds operator design.

What We Built

This device allows Elaine to raise and lower her blinds at the push of a button. We incorporated Elaine’s dog Oak into the final design: once Oak pushes the button, the blinds operator rotates the blind chain in the appropriate direction to raise or lower the blinds until they are fully open or closed. The direction of rotation depends on whether the blinds need to be raised or lowered. Finally, while the blinds are in the process of raising or lowering, the button is disabled.

Here’s a gif of the blinds operator working. While this video is faster than real time, if you look closely, you can see the motor arm’s slow-start and slow-stop feature.

Details

The inner compartment of the blinds operator. Pictured are the primary tensioning mechanism (the threaded rod in the center of the motor mount), the acrylic motor mount itself, and the secondary tensioning mechanism (the spring connected to the link). The secondary tensioning mechanism keeps tension on the ball chain once the motor mount is adjusted to the correct height with the primary tensioning mechanism.

The laser-cut gear mechanism plus the motor mount which will interface with the ball-chain and turn it.

Outside view of the power jack adapter for the power supply and the barrel jack adapter for the button.

A side-angle-view of the entire device. The knob at the top (not pictured) controls the position of the inner motor mount by turning the threaded rod.

Narrative Use Description

Elaine is relaxing at home and doing some work at her desk when she realizes that the blinds are open. Seeking to reduce the glare from the sunlight, Elaine would like to close the blinds. Since her aide is not there at the moment, she calls for Oak, her service dog who is always nearby. Elaine gives Oak the command to press the large button by her desk, and Oak complies quickly, understanding very easily where he should press. This button press triggers the motor attached to the ball chain on the blinds, slowly lowering them to block the harsh sunlight.

The next morning is a beautiful day, and Elaine would like her blinds open again. Giving Oak the command again, Oak walks over to press the button as Elaine sits back and watches her blinds open up again.

How we got here (prototype and process)

Eric

Button Switch

I started my portion of the blinds opener by prototyping the question:

  • How would the opener best be operated by Oak?

We settled on a push button as the switch that Oak would press to activate the blinds opener. The physical computing lab had small arcade pushbuttons, which I decided to use as a base for a larger 3D printed adapter to make the button bigger, and therefore easier for Oak to press.

CAD Model of the Button Housing.

Although this prototype was physically the size we wanted, certain dimensions were too small, causing the button to be permanently pressed down. The print for the button part also failed, and since it would take several days to a week to print another, I decided to take a different route.

The 3D printed housing was supposed to make an arcade pushbutton larger but proved difficult to design quickly.

Our client Elaine suggested that we use a manufactured answer button as the basis of our button switch and “hack” it into a powerless switch that interfaced with the opener through a 3.5mm aux cable. We ended up using this button switch because its premade, proven-to-work mechanism was reliable while serving the same function as anything we could 3D print.

Building the answer button switch.

The final working pushbutton switch.

Secondary Tensioner Mechanism

Since we would not have access to Elaine’s blinds due to COVID restrictions, getting exact dimensions of the blinds chain was very difficult, so we built multiple tensioning mechanisms as a safeguard for the opener to work regardless of measurement precision. Our first tensioner was made of K’nex, plywood, and a few hardware bits.

Jud and I redesigned the tensioner later on to make it more compact and interface with the motor sprocket more easily. Here’s a mockup of how the tensioner would attach to Jud’s motor mount box.

Dominique

This prototype was designed to help answer the design questions:

  • What mechanism is sufficient for gripping and moving a stiff chain on a set of blinds?
  • What might the motor encasing look like?

I created some models of the encasing and the meshing mechanism and iterated on them over a period of time, slowly focusing more on just the meshing system. The meshing system interfaces with the ball chain on Elaine’s blinds and had to be adjusted to fit those measurements. The encasing took into account the measurements of Elaine’s blinds and how it would attach to the wall.

sketch of overall plan for prototype

Meshing Sprocket System

first sketch of meshing gear

Fusion Model of version 1 of meshing system

3D printed prototype of meshing system

second sketch of meshing gear, this time with threads because it would mount better with the motor that my teammate found

version 2 of the 3D CAD design. The biggest problem with this is that the threads are too close to the edges and the measurements in general are off. You can see here that I got rid of the pegs protruding around the gear circumference and designed a sort of alternating plate cut-out. This design was inspired by factory-made blind rollers you might find at IKEA, and is less likely to make the chain get stuck.

finalized 3D CAD design. It fits the specified motor, and the threads were taken out and replaced with holes for bolts because we were laser cutting, not 3D printing, so threads would only be cut at the smallest circumference of their spiral, making it impossible to put our bolts in. I also realized the diameter of the inner pattern needed to be smaller than the outer plates in order to hold the chain better, which I discovered after looking at the IKEA example again. In this particular gif, the design is cut open so I could extract the inner face for laser cutting purposes, which is why you’re seeing double.

laser-cut meshing system bolted to mounting hub

Encasing

finger-jointed encasing: an early 3D CAD design to hold the Arduino, protoboard, and motor while also fitting on a window sill, with openings for power and the chain and dimensions set up as adjustable parameters. We did not end up pursuing my design much further than this.

For my first prototyping question, I discovered that it would be best to work from an existing reference, since automated blinds operators have been invented before and I could simply customize one to Elaine’s specific circumstances. It was a bit of a shot in the dark because we did not know her exact measurements (distance between the balls on the chain, ball diameter, etc.), but we settled on an agreed-upon measurement of a 4.5 mm ball diameter, a common size. For my second question, even though my design was not used, the critique of it helped us answer what dimensions to consider for the encasing, how far up on the sill it would need to be, and similar details.

The crits did not really focus on my meshing system; however, further research on motors and mounts by my teammate, Jud, helped me figure out how to iterate. The pegs did not work, but the final design did. I did not ignore any feedback. During my prototyping process, the biggest surprise was just how many times I had to adjust the size of the threads in the meshing system, due to miscalculations, changes in the motor, and so on. I don’t know why, but changing thread size in Fusion is not as straightforward as I would imagine, and it always took me a long time. I just taught myself Fusion this semester, so I was probably also grossly overcomplicating it.

Jud (Motor Behavior and Torque Requirements)

This prototype was built to answer the questions of what the final behavior of the mechanism would look like and what components should be used.

This prototype demonstrates the raising and lowering action at the press of the button. As shown in the video below, the curtains are opened by pressing the button once and then closed by pressing it again, as indicated by the different rotation directions. As the curtains move, the distance they have travelled is measured, ensuring the curtains are completely raised or lowered before stopping.

Overall

Overall image of general movement prototype. Shown is the motor attached to the H-bridge motor driver for speed control as well as the wiring to the Arduino Mega.

Motor Driver Wiring

Close up of the wiring for the prototype. Here, the L293D H-Bridge is being used to drive the attached motor with a 12V power supply.

Speed Control Calculations

Screenshot of the calculations controlling the speed of the motor. As shown, the speed of the motor slowly ramps up until the max speed is attained and then slowly ramps down once it comes within a certain threshold distance of the final distance.

Video

Improvements

3D model of the motor mount used for the final design. As shown, the motor mount has more adjustability to allow for more accurate positioning of the motor with respect to the ball chain.

Image of the power supply connected to the outer housing. As suggested by Elaine, this style of power adapter was used because of its ease of use. The barrel jack style adapters allow for easier plugging and unplugging as well as more customization in the future with regards to cable lengths.

Screenshot of the specification sheet for the final motor. As shown, the motor has more than enough torque to raise the curtains ensuring the reliability of the device.

After this initial prototyping, we found that the best motion for the motor would be a slow starting and stopping motion in order to reduce wear on the motor. This type of motion would also result in a smoother-looking operation, improving the overall appearance of the device. From our calculations of the torque required to lift curtains similar to Elaine’s, we also found that the motor would need to lift around 13 lb. With this number and the dimensions of our final gear mechanism, the motor we ordered for the final device needed to output around 20 kg·cm of torque, which is higher than the minimum required in order to ensure that the device would work reliably.
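As a rough sanity check on those numbers (using an assumed sprocket radius, since the exact dimension is not listed here): 13 lb is about 5.9 kg of force, so a sprocket with a radius of roughly 1.5 cm would need about 5.9 kg × 1.5 cm ≈ 9 kg·cm of torque just to hold the load. A 20 kg·cm motor therefore leaves roughly a 2× margin for friction, acceleration, and any misalignment in the chain path.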

After the prototyping critique, we received lots of feedback on different ways to improve the device including different button pressing actions that would allow the curtains to stop midway through if the button was pressed again. While this type of behavior is something that would’ve been useful to have, we decided to leave this feature out of the final device because we ran out of time to incorporate it. In addition, we also received lots of feedback on ways we could make the device easier to use by reducing the number of electrical connections necessary to power the device. Since the final device needed a 12V power supply for the motor and a 5V power supply for the rest of the electronics, we decided to incorporate a 5V regulator into the final device that would be able to take the 12V from the motor power supply and convert it to 5V for the rest of the circuitry.

Process

preliminary sketch of project for scoping

We started out with three ideas after our first interview with Elaine, but of course, this is the one we chose, sketch courtesy of  Eric.

Our Gantt chart

We tried to organize our plan according to our abilities and with the idea in mind that our building would be delayed until our motor came. We figured it would show up the week we were planning everything out, but that was not the case, and we had to adjust our scheduling to fit this delay.

Amazon sprocket example screenshot

The design that inspired Dominique’s CAD designs. These measurements are also what we originally went with, but we adjusted the spacing provided between the balls a bit.

tensioning sketch

We also had to take time to consider what tensioning the chain and motor might look like so that the movement would be smooth. This drawing by Jud is for the secondary tensioner, which Eric later prototyped.

the motor that changed our sprocket design

Jud found this motor, which he determined had a better mounting system for our purposes, and so Dominique redesigned the model of the sprocket with this in mind, although we did not end up going with threads.

unfortunate email

Not long after, we did get word that our motor delivery would be delayed until after the final crit, and so we had to improvise with a different motor from IDeATe Lending. Everything worked as planned, but it would be good to switch the motor out for Elaine.

Conclusions and lessons learned

Crit Feedback & Response

  1. “It was difficult to understand exactly how it would interface with the blinds from the presentation just cause there was no chain to demo with.”
    1. This is a fair point; it would definitely have been clearer to have a chain on there. The hard part is getting the chain for a set of blinds in a remote setting. For clarity’s sake, we could at least have used a rope or string to demonstrate how everything would be placed, including how the sprocket interfaces with it, so people would not just be looking at a spinning motor arm. In addition, having the actual motor would’ve been very useful in conveying the real functionality of the device, which was unfortunately not possible for the critique.
  2. “Having all four rods be threaded seems like you’re asking for trouble? Why not one threaded rod and three smooth ones? This seems like it could get locked up by accident really easily.”
    1. This is a very good point; however, the four threaded rods used in the four corners of the motor mount were only used as aligning rods. In our original design, springs were supposed to help keep the motor mount in place on all four rods, and since we didn’t have enough springs to place the motor mount high enough, we needed a way to raise the first set of springs slightly. Threaded rods with nuts attached were supposed to do this, but that mechanism was eventually replaced with nylon couplers. In the future, it would probably be better to use smooth rods for all four rods to reduce the risk of the motor mount catching during tensioning, since the threads ended up not being utilized in the final design.
  3. “The 3D-printed sprocket gear could use some fillets at the base of the teeth? They look like they are at risk of snapping off. Forces will build up otherwise.”
    1. The final sprocket design is actually quite thick (at least one centimeter), is made out of acrylic, and is reinforced by a metal mounting hub. Compared to the thinner IKEA version, which is just plastic, our sprocket is much stronger, so assuming the factory version is sturdy enough to interface with a chain, we believe our build would be more than fine as well. Also, the direction of the force will never push directly on the protruding alternating plates. Even still, this is a great note for our first version of the sprocket.
  4. “Have you considered the fire safety of your sprocket? The constant rubbing might make it catch fire easily.”
    1. That’s also a good note; we had not actually considered this. Acrylic can in fact burn, though frictional heating would more likely soften or melt it than ignite it. Either way, it would be good to look into for safety purposes.

 

Working remotely

It was challenging to communicate in a timely manner. Elaine herself was going through a lot and was telling us about how Oak was quite sick and in the ER. She was also very busy in general and was not able to get back to us very often, making it hard to have a smooth design process. On the other hand, working remotely made it super easy to hop on a call and get specifics about measurements, rather than having to go all the way to her home to see what her setup looked like.

If there is one thing we might have done differently, it would be getting our laser cutting done much earlier; it was not fun rushing to cut our sprocket before the lab closed. In addition, it would’ve been useful to assess the components necessary for the final build a lot sooner so the product shown on critique day would have been fully finished.

Working with a disabled client

Working with Elaine was such a pleasure. She has limited mobility but is otherwise very independent and has a lot of great ideas. It helped that she is an expert in robotics and engineering topics and had a lot of great suggestions for simplifying our project. One major takeaway we learned is that it’s super important to consider how processes can be made easier for different stakeholders, in this case not only Elaine but also her dog; assistive devices do not have to be operated by humans all the time. It was also a very rewarding experience thinking about solutions to problems in different ways than we would otherwise. This new way of thinking led to many new and creative ideas that we will be able to take forward into future projects and life in general.

Concluding thoughts

Next time, it would be great to have a backup plan for parts arriving late. Perhaps we could have designed two sprockets: one that fits the motor we have now, and one that fits the one still on its way. Overall, this was a great learning experience and a fun project, and we hope that Elaine can get some use out of her new system!

Technical information

Block Diagram

Functional block diagram of the dog activated blinds opener.

Electrical Schematic

Electrical schematic of the dog activated blinds operator.

Code submission

/*
  60-223, Final Project
  Jud Kyle (jkyle), Eric Zhao (ezhao), Dominique Aruede (daruede)
  time spent: 24 hours

  Collaboration and sources:
  - Encoder_Interrupts functions copied from 24-354 Gadgetry lab code

  Description:
  - This sketch drives an automatic blinds opener. When the button is pressed,
    the blinds will begin rotating in one direction for the desired travel distance,
    entered as curtainHeight (in meters, matching the constants below). When the motor starts rotating and stops
    rotating, the speed is slowly increased and decreased to reduce the negative
    impacts on the motor. Each time the button is pressed, the motor will not stop
    rotating until the full distance has been travelled. The direction the motor spins
    changes upon each button press to go up if it had previously gone down and vice
    versa. 

  Pin mapping:
  Arduino pin | type   | description
  ------------|--------|-------------
  2             input    Encoder A input pin
  3             input    Encoder B input pin
  4             input    Button input pin
  8             output   Motor direction pin 1
  9             output   Motor direction pin 2
  10            output   Motor PWM pin for speed
*/

// Pin Variables
const int BUTTONPIN = 4;
const int encoderPinA = 2, encoderPinB = 3;
const int motorPin1 = 9, motorPin2 = 8, motorPWM = 10;

int encoderCount = 12 * 98; //12 encoder counts per revolution with 1:98 gear ratio

bool motorSpinning = false;

// Motor distance measuring variables
volatile long currentEncoder_pos_f = 0, prevEncoder_pos = 0, revs = 0;
float curtainHeight = 0.5, motor_D = 0.02, thresholdDist = 0.0255; // meters (travel distance, drive diameter, ramp threshold)
float motorSpeed = 0, motor_pos_f = 0;


// determines which direction the motor spins upon button-press
String motorDirection = "CW";

void setup() {
  pinMode(BUTTONPIN, INPUT_PULLUP);

  pinMode(motorPin1, OUTPUT); pinMode(motorPin2, OUTPUT);
  pinMode(motorPWM, OUTPUT);

  // initialize the serial port:
  Serial.begin(9600);

  //Initialize interrupts
  attachInterrupt(0, encoderA, CHANGE); // interrupt 0 = pin 2 (encoder A)
  attachInterrupt(1, encoderB, CHANGE); // interrupt 1 = pin 3 (encoder B)

}

void loop() {
  int buttonVal = digitalRead(BUTTONPIN); //Read pressed state of the button

  float motor_pos_f = (((float) currentEncoder_pos_f) / (4 * ((float) encoderCount))) * 3.14159 * motor_D; // Convert encoder counts to distance travelled
  if (motor_pos_f < 0) { // Ensure distance is always positive
    motor_pos_f = motor_pos_f*(-1.0);
  }

  Serial.print("Motor Position: ");
  Serial.print(motor_pos_f, 7);

  //Update spinning condition when button is pressed
  if (buttonVal == LOW && motorSpinning == false) {
    motorSpinning = true;
    currentEncoder_pos_f = 0;
    //Update motor direction
    if (motorDirection == "CCW") { //Spin motor CW
      digitalWrite(motorPin1, HIGH); //Spin CW
      digitalWrite(motorPin2, LOW); //Spin CW
      motorDirection = "CW";
    } else if (motorDirection == "CW") {    //Spin motor CCW
      digitalWrite(motorPin1, LOW); //Spin CCW
      digitalWrite(motorPin2, HIGH); //Spin CCW
      motorDirection = "CCW";
    }
  }
  
  Serial.print("\t Toggle: ");
  Serial.print(motorDirection);
  
  //Spin motor in corresponding direction
  if (motorSpinning == true) {
    if (motorSpeed < 50) { //Minimum PWM signal for starting motor
      motorSpeed = 50;
    }
    else if (motor_pos_f > curtainHeight) { //Stop motor when full distance is reached
      motorSpinning = false;
      motorSpeed = 0;
    }
    else if (motor_pos_f < thresholdDist) { //Ramp up speed at start
      motorSpeed = (255 / thresholdDist) * motor_pos_f;
    }
    else if (motor_pos_f > curtainHeight - thresholdDist) {
      motorSpeed = 255 - (255 / thresholdDist) * (motor_pos_f - (curtainHeight - thresholdDist)); //Ramp down speed at end
    }
    else {
      motorSpeed = 255;
    }
  }
  analogWrite(motorPWM, (int) motorSpeed); //Set motor speed equal to integer value of motor speed

  Serial.print("\t Motor Speed: ");
  Serial.println((int) motorSpeed);


  //Track position of motor using encoder (count number of revolutions and multiply by a distance)
  //Update speed according to position (If within a certain distance, slow speed down, else speed is HIGH)
}

void encoderA(){

  // look for a low-to-high on channel A
  if (digitalRead(encoderPinA) == HIGH) { 
    // check channel B to see which way encoder is turning
    if (digitalRead(encoderPinB) == LOW) {  
      
        currentEncoder_pos_f = currentEncoder_pos_f + 1;         // CW
    } 
    else {
        currentEncoder_pos_f = currentEncoder_pos_f - 1;        // CCW
    }
  }
  else   // must be a high-to-low edge on channel A                                       
  { 
    // check channel B to see which way encoder is turning  
    if (digitalRead(encoderPinB) == HIGH) {   
      
        currentEncoder_pos_f = currentEncoder_pos_f + 1;          // CW
    } 
    else {
        currentEncoder_pos_f = currentEncoder_pos_f - 1;          // CCW
    }
  }
}

void encoderB(){

  // look for a low-to-high on channel B
  if (digitalRead(encoderPinB) == HIGH) {   
   // check channel A to see which way encoder is turning
    if (digitalRead(encoderPinA) == HIGH) {  
       
        currentEncoder_pos_f = currentEncoder_pos_f + 1;         // CW
    } 
    else {
        currentEncoder_pos_f = currentEncoder_pos_f - 1;         // CCW
    }
  }
  // Look for a high-to-low on channel B
  else { 
    // check channel A to see which way encoder is turning
    if (digitalRead(encoderPinA) == LOW) {   
        currentEncoder_pos_f = currentEncoder_pos_f + 1;          // CW
    } 
    else {
        currentEncoder_pos_f = currentEncoder_pos_f - 1;         // CCW
    }
  }
}

 

Button-Controlled Adjustable Platform by The Yews: Final Documentation https://courses.ideate.cmu.edu/60-223/s2021/work/button-controlled-device-holder-by-the-yews-final-documentation/ Sun, 09 May 2021 20:01:14 +0000 https://courses.ideate.cmu.edu/60-223/s2021/work/?p=13405 Introduction:

For the final project, we met with our client Haleigh Sommers, who lives life in a wheelchair and must place her common devices (phone, Kindle, laptop) on an attached tray in front of her. However, she often has trouble craning her neck down to interact with devices, and also feels uncomfortable using a small detachable stool that her mother can place for her; it is not very adjustable. Haleigh also emphasized that she would like to be able to adjust the position of her device herself, rather than having her mother constantly make adjustments.

In order to gauge the scope of this problem, we engaged in a comprehensive Zoom chat with Haleigh, not only getting a sense of her daily challenges but also getting to know her as a person. The three of us brainstormed solutions over a whiteboard while simultaneously receiving feedback from Haleigh, until we decided upon the idea for our device: an adjustable viewing platform. Through this interview, we built a clear picture of our client and the types of solutions she prefers.

 

What we built:

We built a contraption that can adjust the distance and orientation of Haleigh’s chosen device. Specifically, the contraption adjusts the distance from Haleigh through a sliding mechanism, moving back and forth along two rails on either side of her desk. The orientation of the device can also be adjusted. Both the sliding and tilting mechanisms are controlled through pressing and holding one of four total buttons (two for sliding, two for tilting).

 

Featured Image:

In Action:

Sliding mechanism:

 

Tilting mechanism:

 

Detail Images (4):

Button view: Red/White = Sliding, Black/White = Tilting

 

Internal view of circuitry

 

Top view of sliding mechanism: rotating screw

 

Comprehensive view of circuitry connecting to hardware

Narrative Sketch:

Haleigh is a busy student, and she needs to get some homework done before the deadline tomorrow. However, she finds it tough to see and interact with the small text on her phone and computer screen. She presses a button to bring the computer holder towards her; now she can read the instructions for the homework assignment. After she submits the assignment, she hears a text notification on her phone, but it’s hard to read without craning her neck. So she presses another button, which angles the surface, bringing the phone to a comfortable position so she can type out a response.

 

How we got here:

Prototypes:

Vishnu

Rotating mechanism

Questions we wanted answered:

  1. How much force can the device handle?
  2. What pieces would fit together to produce the rotating axle?
  3. How technically complicated would controlling the degree of rotation be?

The first prototype was to test the rotating section of the device. Essentially, Haleigh can press one of two buttons, and the surface will rotate clockwise or counterclockwise.

To achieve this, we coupled a DC motor to a potentiometer. The potentiometer reads the current angle and is used to keep the motor from turning past a certain limit (range: 0 to 70º). The platform is mounted on a rod between the motor and the potentiometer.
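The control logic behind this bound turned out to be short. Below is a minimal sketch of the bounded-rotation idea (an illustration, not the final code); it assumes the same motor-driver, potentiometer, and button wiring as the full program further down this page, with the pushbuttons wired so that a press reads HIGH.

// Minimal sketch of the prototype's bounded-rotation behavior (illustration only).
// Hold one button to tilt toward 70º, the other to tilt back toward 0º; the
// potentiometer reading acts as the feedback that stops the motor at the limits.
const int motorA  = 10;   // rotational motor terminal A (via motor driver)
const int motorB  = 8;    // rotational motor terminal B (via motor driver)
const int potPin  = A0;   // potentiometer coupled to the platform shaft
const int upBtn   = 7;    // tilt-toward-vertical pushbutton
const int downBtn = 6;    // tilt-toward-horizontal pushbutton

void setup() {
  pinMode(motorA, OUTPUT);
  pinMode(motorB, OUTPUT);
  pinMode(upBtn, INPUT);    // assumed external pull-downs: pressed = HIGH
  pinMode(downBtn, INPUT);
}

void loop() {
  int angle = map(analogRead(potPin), 0, 1023, 0, 360);    // pot reading -> degrees

  if (digitalRead(upBtn) == HIGH && angle < 70) {          // drive toward the 70º limit
    digitalWrite(motorA, HIGH);
    digitalWrite(motorB, LOW);
  } else if (digitalRead(downBtn) == HIGH && angle > 0) {  // drive back toward 0º
    digitalWrite(motorA, LOW);
    digitalWrite(motorB, HIGH);
  } else {                                                 // at a limit or released: stop
    digitalWrite(motorA, LOW);
    digitalWrite(motorB, LOW);
  }
}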

Close-up of rotator (note placement of DC motor and potentiometer)

 

Scale Image of rotational portion

 

 

Sliding mechanism

Questions we wanted answered:

  1. Can the screw securely rotate, and will the nut rotate or stay put?
  2. How complicated is it, coding wise, to implement the forwards/backwards sliding?

We also used a DC motor for the sliding mechanism. The motor was connected to a long screw, which, when rotated, would cause a nut to slide forward and backward along the screw. The contraption was bolted onto this nut, so the entire thing could slide back and forth with just the press of a button.
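As for the coding question, holding a button to slide turned out to need only a few lines of logic. The sketch below is a minimal illustration rather than the final code; the pin numbers are borrowed from the full program further down this page, and the buttons are again assumed to read HIGH when pressed.

// Minimal hold-to-slide sketch (illustration only, not the final code).
// Holding one button spins the lead screw one way, holding the other reverses it;
// releasing both (or pressing both) stops the motor, and the nut carries the platform.
const int slideMotorA = 11;  // translational motor terminal A (via motor driver)
const int slideMotorB = 9;   // translational motor terminal B (via motor driver)
const int fwdBtn      = 5;   // slide-forward pushbutton
const int backBtn     = 3;   // slide-backward pushbutton

void setup() {
  pinMode(slideMotorA, OUTPUT);
  pinMode(slideMotorB, OUTPUT);
  pinMode(fwdBtn, INPUT);    // assumed external pull-downs: pressed = HIGH
  pinMode(backBtn, INPUT);
}

void loop() {
  bool forward  = digitalRead(fwdBtn) == HIGH;
  bool backward = digitalRead(backBtn) == HIGH;

  if (forward && !backward) {         // screw turns one way: nut slides forward
    digitalWrite(slideMotorA, LOW);
    digitalWrite(slideMotorB, HIGH);
  } else if (backward && !forward) {  // screw reverses: nut slides backward
    digitalWrite(slideMotorA, HIGH);
    digitalWrite(slideMotorB, LOW);
  } else {                            // neither (or both) pressed: stop
    digitalWrite(slideMotorA, LOW);
    digitalWrite(slideMotorB, LOW);
  }
}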

Horizontal view of slider

 

Aerial view of slider (i.e. the actual sliding portion the breadboard was attached to)

 

Process Images:

First draft plan of slider and rotation mechanism, visualized

Mechanics of the slider (rectilinear motion of nut produced by rotation of screw). Src: http://507movements.com/mm_103.html

 

Slider mechanism in its early stages. To stabilize the screw, we thought we had to glue two screw holders together, but soon realized each holder could be secured with small screws.

 

Sketch of Initial Plan for rotation portion

 

The rotational mechanism in its early stages. I wanted to understand how a DC motor worked, how fast it could spin, and how to change the direction of the motor.

 

For rotation, we also experimented with using a stepper motor, under the assumption that it was more powerful and had more torque; however, we scrapped this because it was hard to control the position of the motor.

Discussion:

By the time I started on these prototypes, we had a general sense of the mechanical design of the project. I was more concerned with how the motor would interact with the other parts of the project. While I discovered that the code itself was not very complicated, it was imperative that we had all the parts of the project available, so we could see how certain sections fit together. For the rotational portion, especially, I was able to answer the question of what electronic components (i.e. DC Motor, Potentiometer) I would need to implement the controlled angling. 

We also had to grapple with questions like how much force the device could handle. This was especially evident with the rotational component. While I was able to connect up the motor to the potentiometer, I did not realize how powerful the motor was, until it yanked the potentiometer out of its wiring. As a result, I had to be very careful when scrutinizing the allowable range of rotation, especially when we had constructed the full structure around the rotational component.

The sliding mechanism was easier, but we still had to ensure that the screw could securely rotate; if it was loose, the nut would not be able to move back and forth smoothly. This had less to do with the code and more with making sure we tightly screwed in the rod holder. With my prototype, I was also able to do a quick demonstration of how the nut would stay in the same orientation while the screw turned and slide along the screw, which meant we could attach any object to it that we wanted to slide (as shown by the "sliding" red breadboard). Overall, my prototype certainly relied on a clear picture of the final product, as we had to make sure powerful components such as the DC motor did not clash with the more sensitive laser-cut wood structures in the project.

Sunjana and Julia

We were trying to answer these questions with our model. 

  1. What mechanical components would Haleigh like in her adjustable platform? Does she want to be able to slide the platform back and forth? Does she want to control the angle of the platform’s tilt? 
  2. Confirmation of what kind of devices would Haleigh be placing on the platform/their sizes?
  3. How would Haleigh like to control the movement of the device? Would she prefer pressing down on buttons, or would she like to use a joystick? 

Around the time we were meeting with Haleigh, we were aiming for our platform to be able to both slide back and forth as well as angle horizontally and vertically. However, we realized that mechanically implementing this would be extremely complicated, so we wanted to make sure that Haleigh actually wanted these capabilities before we spent weeks putting together the mechanisms for them. For our prototyping meeting with Haleigh, we decided to build a “feels-like” prototype with which we could manually simulate both sliding and angling, and then let Haleigh decide, over video call, which capabilities she wanted, if any. 

Below are images of our prototype. On the call, we manually angled the platform, which was attached with tape to a pivoting pole that acted as a hinge. We also manually pushed the platform back and forth along some 80/20 aluminum extrusions. The pieces that slid through the extrusions were popsicle sticks attached to the platform. Haleigh then confirmed that she would prefer the platform to have both angling and sliding capabilities. As for how to control the platform, she emphasized that she would prefer to press down on pushbuttons, as she would have a hard time making a fist to grip a joystick, and she made it clear she would want to control both sliding and angling with pushbuttons. She also confirmed that she would want the buttons to be placed to the side of the device, and that the top of the platform should be about 14 to 15 inches off the desk when it is fully vertical. Finally, she told us that she would only be placing small items (e.g. Kindle, phone) on the platform, and she requested a ledge so that her items wouldn't slip off.

 

Vertical angling view of prototype platform

 

Axle the platform rotated on; we attached the two components with tape for the demo

 

Horizontal angling of platform/scale photo

We ended up using all of Haleigh's recommendations in our final version of the project. We placed the buttons on Haleigh's right-hand side after later confirming she has little mobility in her left hand and mainly uses her right, and we designed the platform to angle and slide, to only be able to handle small devices, and to have a ledge. During the prototyping process, we were most surprised by how mechanically complicated implementing the motions was. The three of us have little to no experience with mechanical engineering, and thus we initially only took into account the difficulties of programming and designing the look of the device. It was during the prototyping phase that we learned that we needed a rotating screw with a sliding nut that didn't rotate, as well as a means of attaching the platform to the two rods we needed, one for sliding and one for angling. We fleshed out the latter portion in the weeks after the prototype meeting.

Process:

The Final Design of our Product

Process Photos

Sunjana and Vishnu working to put together electronics and physical components

 

Assembling the platform

 

Vishnu finalizing the electrical components

 

Cardboard prototype of the encased nut mechanism

 

Our chaotic work bench!

 

The laser-cut cardboard physical housing let us test that everything fit as intended before we cut more expensive materials.

Retrospective Process Reflection

In retrospect, the decision to pursue such a complex mechanical device dominated the course that this project took, essentially steering this assignment to take a more mechanically focused route rather than a computationally driven one. As the project developed, we chose to continue our pursuit of both angling and forward and backwards motion within our device. While in the end we technically got both of these functions working, it could have potentially been more robust and equipped to solve the problem if we had chosen to focus on just angling the platform. 

 

It is hard to say what a "mistake" is in this context, but I suppose allowing laser cutting to happen so late in the process prevented us from making refinements to the physical form, which led to the final model being a bit clunkier than might be practical. That being said, one could argue that within the scope of this project, we only had time to make one physical iteration. Other more technical mistakes in this process included some motors being wired backwards, some wooden parts being glued in the wrong place, and the selected motor not being able to hold the final platform. The most resistance we faced in this process was a result of not having a concise and centrally located plan, so when problems arose, it took a lot of energy to figure out what exactly was wrong and what else it impacted.

 

Due to the mistakes mentioned, our group learned the importance of central diagrams and common reference materials that allow each member to work on the same page. We discovered that it was difficult to refer to particular components of our complex system since none of them were formally named. Moving forward, I think we will make efforts to prioritize system schematics and diagrams with concrete labels to allow for easier problem solving throughout the process. 

 

Another important discovery is an acute appreciation for mechanical engineers. We all came to appreciate the difficulty of designing functioning mechanisms that efficiently result in an intended motion. Without a mechanical engineer on our team, we had a very hard time anticipating the issues with our mechanical solutions before we built them, and we had a looming fear that there were more efficient solutions. Ultimately, we learned that mechanisms are very, very hard to execute well.

 

Scheduling

The process of managing this work collaboratively was a challenge. Vishnu was essentially in charge of the electronic and computational execution, Sunjana was the project manager and main mechanical engineer, and Jubbies spent most of her time on physical fabrication and engineering the mechanisms. Although we took the time to flesh out our Gantt chart and update it as we started out the process, it was not entirely representative of our final work timeline since several components took longer than anticipated. 

 

Our divergence from the schedule was in response to designing the system taking longer than planned and to unforeseen dependencies within the project development. For example, we realized that we couldn’t produce the final laser cut box design until the inner mechanisms and components were essentially finalized, which was not the case until near the end of the project. This meant that the box design and laser cutting happened later than initially planned which caused minor chaos. 

Conclusions and Lessons Learned:

Salient findings from the final critique:

The feedback we received tended to be favorable towards the physical layout of our project, the mechanical design, and the sliding motion of the platform. 

As for points of improvement, one major verbal critique we received was that our platform should use clamps to hold the item in place while it tilts. The critiquer, who was one of the clients, showed a similar device they had at home, which was a clamp that could grip around multiple objects. We found this to be useful because we hadn’t considered a case where the platform would be holding something thicker than the ledges designed to “hold” it. Additionally, we hadn’t considered cases where the item’s height was greater than the height of the platform, causing it to potentially slip off if the platform was suddenly angled. 

Another critique we found useful was that “different shapes of buttons would be nice to differentiate between functions, since a row of circular buttons can be confusing to memorize”. During the weeks of project building, we focused on choosing buttons that could be easily pressed by Haleigh, and were not too concerned with how Haleigh would be able to differentiate which buttons belonged to which move. However, if we were to expand this project further, we would design arrow buttons for “up” (sliding away from Haleigh), “down” (sliding towards Haleigh), as well as a long rectangle for “horizontal” (platform going horizontal), as well as a tall rectangle for “vertical” (platform going vertical). 

Furthermore, we agreed with the critique that our current product was too bulky, specifically that we "could likely make it smaller especially if it needs to fit on a desk". Throughout the work period of this project, we definitely focused more on getting the mechanical components and the code to work together, as well as creating a design that could work with Haleigh's requirements (i.e. having buttons on her right side, having it fit her tray). In the future, we could limit the range of sliding to a range Haleigh is likelier to use (in order to reduce the width of the setup from 13''), and we would not make the box that slid along the 80/20 aluminum extrusions so thick, since it didn't hold anything.

Finally, we received critiques concerning the exposed portions of the device. Specifically, these critiques were "It's problematic to have a 'tray' with exposed moving parts. Stuff will fall into the tray and get things gunked up." and "Also having the exposed motor for the linear positioning system might be problematic." Since we only put in the final pieces of the device and connected the Arduino/wires to the mechanisms on the last day, we left several parts of the device "exposed" so we could make last-minute changes to the wiring, and we didn't have time to create extra housing for the motor that rotated the "sliding" rod. In the future, we would definitely create housing for this piece so it isn't exposed, and we would place the motor on the side opposite to the user. Additionally, we would seal the sides and the tops of the two boxes connected by the angling rod.

 

Thoughts on the experience of working remotely from our teammates as well as our client:

Admittedly, working on this project in a pandemic proved to be challenging. Given that the class is focused on tangible creation, which requires in-person attention and care, designing and implementing a project of large scope was no small feat. Compounding this challenge was the fact that we could not meet our client, Haleigh, in person. This proved to be a crucial hurdle to clear, as our project was very mechanical, as opposed to mostly electrical (as had been our previous projects). 

For instance, we had to interview Haleigh over Zoom, and in order to get a clear picture of her problem, we had her mother move the camera around to show us her tray, upon which we would place our device. However, we also had to note down Haleigh’s physical constraints, including how far she could reach her hands, and also the length and width of her desk. All these physical measurements would have been much easier to take had we been able to meet with Haleigh in person. 

In retrospect, this project was not well-suited to remote work. The only aspect that ran fairly smoothly was the coding portion; however, even that section required at least a partial physical mockup in order to see results, and that relied on meeting to laser cut and assemble the relevant pieces. Finally, we could have greatly streamlined communication in this remote environment. We communicated via iMessage, and at times not everyone was responsive with their updates. In the future, we need to enforce strict times when we send updates on our project progress, and do a better job of holding each other accountable if someone falls short.

 

Major takeaways from the experience of working with a person with a disability:

 

Technical Details:

Schematic and Block Diagram:

Block Diagram:

 

 

Schematic:

 

Code:

/*Final Project: Motorized Device Holder
 * Team Yews: Vishnu, Julia, Sunjana
 * 
 * Description: This code controls both the sliding and rotating mechanisms of the contraption.
 * 
 * The sliding mechanism is driven by a DC motor, which controls a rotating screw. 
 * As the screw rotates, it moves a plastic piece along, forwards or backwards.
 * 
 * The rotating mechanism is also driven by a DC motor, albeit attached to a potentiometer. 
 * The DC-potentiometer setup acts as a feedback loop; as the motor turns, it also controls the potentiometer
 * dial, which outputs a mapped value in software (0 to 360). When the value hits a certain threshold, 
 * the DC motor stops rotating, thus serving as an upper and lower bound on the range. The clockwise and 
 * counterclockwise rotation of the DC motor ultimately controls the device holder's orientation.
 * 
 * Pin Mapping:
 * 
 * Arduino Pin:     Type:           Description:
 * ---------------------------------------------
 * 10               output          motor A (rotational)
 * 8                output          motor B (rotational)
 * 12               output          motor driver (rotational)
 * 11               output          motor A (translational)
 * 9                output          motor B (translational)
 * 13               output          motor driver (translational)
 * A0               input           potentiometer
 * 7                input           pushbutton (tilt clockwise / toward vertical)
 * 6                input           pushbutton (tilt counterclockwise / toward horizontal)
 * 5                input           pushbutton (slide forward)
 * 3                input           pushbutton (slide backward)
 */


//Set up pin mappings and initial variables

//Rotational motors
int motorAr = 10;
int motorBr = 8;
int motorDriveR = 12;

//Translational motors
int motorAs = 11;
int motorBs = 9;
int motorDriveS = 13;

//Potentiometer (for restricting rotation of platform)
int potPin = A0;

//Clockwise and counterclockwise buttons
int buttonPinAr = 7;
int buttonPinBr = 6;

//Forward and backward buttons
int buttonPinAs = 5;
int buttonPinBs = 3;

//Variables to track button presses (rotational, translational)
bool prevAr = false;
bool currAr = false;
bool prevBr = false;
bool currBr = false;

bool prevAs = false;
bool currAs = false;
bool prevBs = false;
bool currBs = false;



void setup() {

  //Set up initial pin modes
  pinMode(motorAr, OUTPUT);
  pinMode(motorBr, OUTPUT);
  pinMode(motorDriveR, OUTPUT);
  pinMode(potPin, INPUT);

  pinMode(motorAs, OUTPUT);
  pinMode(motorBs, OUTPUT);
  pinMode(motorDriveS, OUTPUT);

  //Pushbutton pins are read as digital inputs (wired so a press reads HIGH)
  pinMode(buttonPinAr, INPUT);
  pinMode(buttonPinBr, INPUT);
  pinMode(buttonPinAs, INPUT);
  pinMode(buttonPinBs, INPUT);

  //Set up drivers
  digitalWrite(motorDriveR, HIGH);
  digitalWrite(motorDriveS, HIGH);
  Serial.begin(9600); //print feedback

}


void loop() {
  int tmp_potVal = analogRead(potPin);
  
  //Map the raw potentiometer reading (0-1023) onto degrees (0-360)
  int potVal = map(tmp_potVal, 0, 1023, 0, 360); 

  int buttonValAs = digitalRead(buttonPinAs);
  int buttonValBs = digitalRead(buttonPinBs);

  int buttonValAr = digitalRead(buttonPinAr);
  int buttonValBr = digitalRead(buttonPinBr);

//If button Ar is pressed, tilt motor into "vertical" position
  if (buttonValAr == HIGH and buttonValBr == LOW) {
    if (prevAr == false) {
      currAr = true;
    }
    //Upper limit of potentiometer = 70º
    if (potVal < 70) {
      digitalWrite(motorAr, HIGH);
      digitalWrite(motorBr, LOW);
    }
    //At or past 70º, stop (also covers any overshoot past the limit)
    else {
      digitalWrite(motorAr, LOW);
      digitalWrite(motorBr, LOW);
    }
  }

//If button Br is pressed, tilt motor into "horizontal" position
  if (buttonValBr == HIGH and buttonValAr == LOW) {
    if (prevBr == false) {
      currBr = true;
    }
    //Lower limit of potentiometer = 0º
    if (potVal > 0) {
      digitalWrite(motorAr, LOW);
      digitalWrite(motorBr, HIGH);
    }
    //At or below 0º, stop
    else {
      digitalWrite(motorAr, LOW);
      digitalWrite(motorBr, LOW);
    }
  }

  //If neither rotational button is pressed, motor is stopped
  if (buttonValAr == LOW and buttonValBr == LOW) {
    if (prevAr == true) {
      currAr = false;
    }
    digitalWrite(motorAr, LOW);
    digitalWrite(motorBr, LOW);
  }

  //If button As is pressed, contraption moves forward
  if (buttonValAs == HIGH and buttonValBs == LOW) {
    if (prevAs == false) {
      currAs = true;
    }
    digitalWrite(motorAs, LOW);
    digitalWrite(motorBs, HIGH);
  }

  //If button Bs is pressed, contraption moves backward
  if (buttonValBs == HIGH and buttonValAs == LOW) {
    if (prevBs == false) {
      currBs = true;
    }
    digitalWrite(motorAs, HIGH);
    digitalWrite(motorBs, LOW);
  }

  //If neither translational button is pressed, the contraption does not slide
  if (buttonValAs == LOW and buttonValBs == LOW) {
    if (prevAs == true) {
      currAs = false;
    }
    digitalWrite(motorAs, LOW);
    digitalWrite(motorBs, LOW);
  }


  //Update variables before next iteration of loop

  prevAr = currAr;
  prevBr = currBr;

  prevAs = currAs;
  prevBs = currBs;

}

 

 

Hydration Reminder https://courses.ideate.cmu.edu/60-223/s2021/work/hydration-reminder/ Thu, 15 Apr 2021 04:12:10 +0000 https://courses.ideate.cmu.edu/60-223/s2021/work/?p=13323  

Purpose: A light that attaches to a water bottle and reminds you to drink water

The ring light (neo pixel) is attached to a Velcro belt and lights up in four colors: green, yellow, red, and blue

 

Video of the water bottle initially starting in the green state. For the purposes of the video, I changed the time limit for the drink-water alert from 15 minutes to 10 seconds. The ring LED then blinks yellow until the user picks the bottle up, at which point a blue animation plays until the bottle is set back down; the color then changes back to green because a drink of water has been taken.

  ***Near the end of the video I said “if you go longer than 10 seconds without drinking water red light flashes”; I meant longer than 20 seconds (twice the amount of time for the yellow light) before the red light flashes.***

Process Images and Review

Throughout the process, one significant choice I made was to use a neo pixel ring instead of soldering a bunch of LEDs in a circle to a protoboard and then sawing off the corners of the protoboard. I got this idea from looking at all the parts in my kit and seeing a circular ring of LEDs, which wasn't a part of the Arduino kit but something I could probably learn about and test in a reasonable amount of time. This decision definitely saved me a lot of pain and time, and allowed me to learn how to use a neo pixel, which was easier than it sounds. To attach it to the belt, I initially bent paper clips to make two different slots, painted them black, and hot glued them to the back of the ring. Since the paper clips were made of metal, they caused some noise disturbance and potentially unwanted connections, which led me to instead attach the back of the ring to the belt with cut-up Command Strips, which also worked.

The left photo is a rough idea of how I would've implemented the ring LED without the neo pixel, and the right is me using the neo pixel and playing around with the different colors and intensities I could program it to display.

These are process images for the paper clip attachment I designed for the neo pixel. First I took a paper clip, bent it to the right dimensions, and painted it black. Next I hot glued both clips to either ends of the neo pixel and waited for the glue to cool. This allowed me to slip the neo pixel on and off the Velcro belt which is attached on the water bottle

After deciding the paper clip attachments weren’t the best idea, I took a command strip and cut a piece of it to attach to the water bottle itself and the back of the neo pixel. Then I attached both ends and the neo pixel stuck in place

Another significant choice I made was to hard code all the water capacity angles into my code. This meant I had to gather some data on how far I would have to tilt my water bottle to get it at 90% capacity, 80% capacity, 70% capacity, etc within the dimensions of my water bottle. I found that the bounds between each of the water capacities (especially around 50%-60%) were very similar and often overlapped, which was a bit of a surprise but something I could manage in my code. It took around an hour to collect this data and two minutes to “hard code” so it wasn’t too bad.
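The data collection itself only required watching the computed tilt angle while holding the bottle at each fill level. A minimal calibration helper along those lines is sketched below; it reuses the zero-g offset and sensitivity constants from the full program in the technical section (assumptions specific to this accelerometer breakout) and prints the same Y/Z-axis angle that calculateAngle() computes.

// Minimal calibration helper (not part of the final program): prints the tilt
// angle over serial so a threshold can be written down for each fill level.
const int ACCE_Y = A2;          // accelerometer Y axis
const int ACCE_Z = A3;          // accelerometer Z axis
const float ADC_REF = 5.0;      // ADC reference voltage
const float ZERO_G  = 1.799;    // accelerometer output at 0 g, in volts (assumed)
const float SENS    = 0.4;      // volts per g (assumed)

void setup() {
  Serial.begin(9600);
}

void loop() {
  float yv = (analogRead(ACCE_Y) / 1024.0 * ADC_REF - ZERO_G) / SENS;
  float zv = (analogRead(ACCE_Z) / 1024.0 * ADC_REF - ZERO_G) / SENS;
  float angle = atan2(-yv, -zv) * 57.2957795 + 180.0;  // same formula as calculateAngle()

  Serial.println(angle);  // tilt the bottle to a known fill level and record this value
  delay(250);
}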

The left picture is my "hard code" function that finds the capacity level, and to the right is the data I gathered

The last significant choice I made was to add a blue light animation that would begin once the water bottle was picked up and end once the bottle was back on a flat surface. I speak more about where this decision came from and the ultimate outcome of this idea in my discussion section of documentation.

Video of blue light animation

Video of water bottle moving with blue light animation playing 

 

Discussion

Before the project was finalized, I had the opportunity to come into class and show my prototype to my classmates to get some constructive feedback on the model. Feedback from classmate (A) was "you know what you could add? An animation with the neo pixel ring that starts as you're lifting the bottle and drinking out of it." Classmate (B) stated "you could de-solder the accelerometer and have it flush with the water bottle to make the design cleaner." I was motivated by both responses and added a function to my code which begins an animation with blue lights once a user picks up the bottle and ends when the bottle is put down on a flat surface. I attempted to implement the second suggestion, but I found it extremely hard to de-solder the accelerometer while taking out pins because the solder cooled as soon as I melted it and there wasn't much free room between the pins to move my soldering iron around, which ultimately resulted in me abandoning this idea.

Abandoning the latter idea wasn't all too bad; I just hid the "bulky" accelerometer by placing it on the belt strap opposite the neo pixel ring. Overall, this managed to do the trick, but one aspect of the project I was still unsatisfied with was how visible the wires were. I couldn't think of a way around this because the object was made to move. In another iteration of the project, I probably would've taken the time to learn more about the mini-Arduino, since my electrical schematic only involved the accelerometer and neo pixel ring (3 analog pins, 1 digital pin, power, and ground). The slimness of the mini-Arduino would've allowed me to store the wires in a small box that I could attach onto the belt strap.

Aside from the visible wires, I'm happy that I got to learn how to use a neo pixel ring and how to write functions in Arduino, and that I got to a semi-functioning project. It's buggy due to my code: I found that detecting whether the water bottle was picked up while the angle was constantly changing was harder than I initially thought. In my code, I implemented a function to sense if the angle was changing, and it always evaluated to true, even when the bottle had been on a flat surface for some time. I used this function to control the blue light animation in order to show that the user was drinking water, and since the function always evaluated to true, the animation was always playing. This was definitely not intentional, and I got hung up on trying to fix it to no avail. On the next iteration, finding out how to fix this would probably be the priority, because the animation was one feature I really liked about the project and would like to keep in the future.
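One possible fix, untested on the actual device, would be to compare consecutive angle readings against a small tolerance instead of comparing the current angle to a single hard-coded "flat" value. In the sketch below, calculateAngle() and TOLERANCE refer to the existing function and constant in the code that follows, while lastAngle is a new variable introduced just for this illustration.

// Possible replacement for angleIsChanging(): report motion only when the tilt
// actually changes between readings, rather than whenever it differs from "flat".
float lastAngle = 0;   // previous reading, updated on every check (new variable)

bool angleIsChanging() {
  float current = calculateAngle();
  bool moving = abs(current - lastAngle) > TOLERANCE;  // did the tilt really change?
  lastAngle = current;
  return moving;
}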

 

Technical information

/*  Title: HYDRATION REMINDER 
 *  Owner: Amelia Lopez
 *  60223, project 2
 *  
 *  Description: Code that alerts a user when to drink water 
 *  through the color of a neo pixel (imported library on line #)
 *  green corresponds to a healthy level of water intake every 15 min
 *  yellow reminds a user to drink water based on water capacity for the past 15 min
 *  red alerts a user they haven't been drinking water for more than 30 minutes.
 *  Water capacity is measured through the accelerometer which measures angle. 
 *  The steeper the angle, the less water is in the water bottle. 
 * 
 *  Borrowed heavily from: 
 *  https://how2electronics.com/arduino-tilt-angle-distance-meter/
 *  in order to find angle from accelerometer
 * 
  Pin mapping:

   Arduino pin | type   | description
   ------------|--------|-------------
   A1            input     X-pin Accelerometer
   A2            input     Y-pin Accelerometer
   A3            input     Z-pin Accelerometer
   ~11           output    neo pixel ring (LED)
  (digital PWM~)

*/
#include <PololuLedStrip.h>

// Create an ledStrip object and specify the pin it will use.
PololuLedStrip<11> ledStrip;

// Create a buffer for holding the colors (3 bytes per color).
#define LED_COUNT 12
rgb_color colors[LED_COUNT];
//YELLOW= 25, 25, 0
//GREEN= 0, 20, 0
//RED = 35, 0, 0
//blue = 0, 14, 34

//DECLARING ARDUINO PINS
const int ACCE_X = A1;
const int ACCE_Y = A2;
const int ACCE_Z = A3;

//Note: these pin constants are not used in this sketch
const int RED_PIN = 11;
const int GREEN_PIN = 10;
const int BLUE_PIN = 9;

//declaring variables useful for angle measurement
#define ADC_ref 5 // ADC reference Voltage
#define zero_x 1.799
#define zero_y 1.799
#define zero_z 1.799
#define echoPin 8
#define trigPin 9
#define selectSwitch 1
#define sensitivity_x 0.4
#define sensitivity_y 0.4
#define sensitivity_z 0.4
unsigned int value_x;
unsigned int value_y;
unsigned int value_z;
float xv;
float yv;
float zv;
float currAngle;
float angleY;
float angleZ;
float min_angle;
String colorName;

//DECLARING INTS FOR LCD WAITING TIMES
const int WAIT_TIME = 1000;
unsigned long timeVariable = 0;

//DECLARING INTS FOR WATER LEVEL
const int WATER_WAIT_TIME = 15000;
const int TOLERANCE = 2;
const int angleFlat = 230;
int water_capacity = 100; // unit: %
int prev_capacity = 100; //unit: %
unsigned long waterTimeVariable = 0;
unsigned long yellowTimer = 0;
float lastMinAngle;
bool isDrinking = false;  // not currently used
bool yellowLightState = LOW;

void setup() {
  analogReference(ADC_ref);

  //INPUT PINS
  pinMode(ACCE_X, INPUT);
  pinMode(ACCE_Y, INPUT);
  pinMode(ACCE_Z, INPUT);
}

void loop() {
  setGreen();
  //set color to yellow to remind user to drink
  if (millis() - waterTimeVariable > WATER_WAIT_TIME) {
    //SET COLOR TO BLINKING YELLOW UNTIL ANGLE CHANGES
    prev_capacity = calculateCapacity(prev_capacity, calculateAngle());
    currAngle = calculateAngle();
    while (angleFlat - TOLERANCE <= currAngle)//blink yellow; reminder to drink
    {
      setYellow();
      yellowTimer = millis();
      yellowLightState = HIGH;
      currAngle = calculateAngle();
      delay(500);
      setOff();
      delay(500); 
      
      if (millis() - yellowTimer >= 5000 and yellowLightState == HIGH) {
        yellowLightState = LOW;
        setOff();
        yellowTimer = millis();
      }
      currAngle = calculateAngle();
      if (millis() - yellowTimer >= 5000 and yellowLightState == LOW) {
        yellowLightState = HIGH;
        setYellow();
        yellowTimer = millis();
      }
      
      currAngle = calculateAngle();
      if (millis() - waterTimeVariable > 2 * WATER_WAIT_TIME) //set LED red; missed a drinking cycle
      {
        setRed();
      }
    }
    while (angleIsChanging()) {
      blueAnimation();
      water_capacity = calculateCapacity(prev_capacity, calculateAngle());
      angleIsChanging();
    }
    setGreen();
    
    //check that the new capacity is lower than the previous. If not, light continues to blink yellow
    while (water_capacity == prev_capacity) {
      setYellow();
      yellowTimer = millis();
      yellowLightState = HIGH;
      currAngle = calculateAngle();
      if (millis() - yellowTimer >= 500 and yellowLightState == HIGH) {
        yellowLightState = LOW;
        setOff();
        yellowTimer = millis();
      }
      water_capacity = calculateCapacity(prev_capacity, calculateAngle());
      if (millis() - yellowTimer >= 500 and yellowLightState == LOW) {
        yellowLightState = HIGH;
        setYellow();
        yellowTimer = millis();
      }
      water_capacity = calculateCapacity(prev_capacity, calculateAngle());
    }
    waterTimeVariable = millis();
    
  }

  //taken a sip
  setGreen();

  //while angle is changing, display the blue animation
  while (angleIsChanging()) {
    blueAnimation();
    water_capacity = calculateCapacity(prev_capacity, calculateAngle());
    angleIsChanging();
  }
  timeVariable = millis();
}


int calculateAngle() {
  //Taking in Accelerometer readings and converting it into an angle
  value_x = analogRead(ACCE_X);
  value_y = analogRead(ACCE_Y);
  value_z = analogRead(ACCE_Z);
  xv = (value_x / 1024.0 * ADC_ref - zero_x) / sensitivity_x;
  yv = (value_y / 1024.0 * ADC_ref - zero_y) / sensitivity_y;
  zv = (value_z / 1024.0 * ADC_ref - zero_z) / sensitivity_z;
  currAngle = atan2(-yv, -zv) * 57.2957795 + 180; //x angle
  angleY = atan2(-xv, -zv) * 57.2957795 + 180; //y angle
  angleZ = atan2(-yv, -xv) * 57.2957795 + 180; //z angle
  return currAngle;
}

int calculateCapacity(int prev_capacity, float min_angle) {
  if (247 < min_angle) return 100;
  if (225 < min_angle) return 90;
  if (208 < min_angle) return 80;
  if (204 < min_angle) return 70;
  if (203 < min_angle) return 60;
  if (200 < min_angle && prev_capacity == 60) return 50;
  if (200 < min_angle) return 40;
  if (198 < min_angle) return 30;
  if (196 < min_angle) return 20;
  if (193 < min_angle) return 10;
  return 0;
}

//set all pixels to yellow
void setYellow() {
  for (uint16_t i = 0; i < LED_COUNT; i++)
  {
    colors[i] = rgb_color(25, 25, 0); //yel
  }
  ledStrip.write(colors, LED_COUNT);
}

//set all pixels off
void setOff() {
  for (uint16_t i = 0; i < LED_COUNT; i++)
  {
    colors[i] = rgb_color(0, 0, 0); // off
  }
  ledStrip.write(colors, LED_COUNT);
}

//set all pixels to red
void setRed() {
  for (uint16_t i = 0; i < LED_COUNT; i++)
  {
    colors[i] = rgb_color(35, 0, 0);
  }
  ledStrip.write(colors, LED_COUNT);
}

//set all pixels to green
void setGreen() {
  for (uint16_t i = 0; i < LED_COUNT; i++)
  {
    colors[i] = rgb_color(0, 20, 0);
  }
  ledStrip.write(colors, LED_COUNT);
}

//cascade of blue lights turning on
void animationUp() {
  const int startingPoint = 4;
  for (int i = 0; i < (LED_COUNT / 2) + 1; i++)
  {
    colors[(startingPoint - i) % LED_COUNT] = rgb_color(0, 14, 34);
    colors[startingPoint + i] = rgb_color(0, 14, 34);
    if ((startingPoint - i) % LED_COUNT == -1) {
      colors[LED_COUNT - 1] = rgb_color(0, 14, 34);
    }
    ledStrip.write(colors, LED_COUNT);
    delay(150);
  }
}

//cascade of blue lights turning off
void animationDown() {
  const int startingPoint = 4;
  for (int i = LED_COUNT / 2; i >= 0; i--)
  {
    colors[(startingPoint - i) % LED_COUNT] = rgb_color(0, 0, 0);
    colors[startingPoint + i] = rgb_color(0, 0, 0);
    if ((startingPoint - i) % LED_COUNT == -1) {
      colors[LED_COUNT - 1] = rgb_color(0, 0, 0);
    }
    ledStrip.write(colors, LED_COUNT);
    delay(150);
  }
}

//determine if angle is changing 
bool angleIsChanging() {
  return angleFlat - TOLERANCE > calculateAngle();
  
}

//starts the blue animation 
void blueAnimation() {
  animationUp();
  delay(100); //maybe 50ms instead
  animationDown();
  delay(100);
}

//blinks yellow light 
void blinkYellow(){
  setYellow();
  delay(200);
  setOff();
  delay(200); 
}

Interview documentation https://courses.ideate.cmu.edu/60-223/s2021/work/interview-documentation/ Wed, 14 Apr 2021 13:08:41 +0000 https://courses.ideate.cmu.edu/60-223/s2021/work/?p=12848 Introduction

We are a student team working to design, prototype, and build a helpful assistive device for a client with limited mobility. Our client's name is Elaine. Our design team consists of Dominique Aruede, Jud Kyle, and Eric Zhao. We plan for this to be about a six-week iterative process, including Elaine's input every step of the way. Here, we cover the details of our first interview with Elaine from 04-06-21, where we introduce ourselves, learn more about her daily activities, and hear what her daily frustrations look like.

Agenda

The following was our planned course of discussion for the interview:

Intros

  • Acknowledge circumstances and ask if anyone has anything they want to talk about before getting started (help warm everyone up and prepare for the interview)
  • Name, background, fun fact
    • Elaine
    • Eric
    • Jud
    • Dom
  • Ask for permission to record the interview

Explanation

  • We are trying to prototype a useful device for you using your feedback to iterate
  • We are not professional technologists, but we do want to get as close as possible to something you can use long term that is truly helpful
  • Make sure to ask if they have any questions about the process

Questions

  1. What are some of the daily activities you enjoy?
    1. Why do you enjoy them?
  2. Is there a task or activity you like/have to do on a daily basis that is difficult or frustrating to carry out?/What is something that frustrates you when it comes to enjoying your daily activities?
    1. Is there anything that has become harder to do over time that you used to enjoy doing?
    2. Try to get to a story that they have about a particular task or daily activity if possible (Either by letting it happen naturally or just by asking)
  3. If possible, could you physically demonstrate a task you normally do?
  4. Would it be possible for you to draw your daily routine/life on a piece of paper or narrate it so that we can draw it?/Can you draw out your daily life from waking up to sleep, or you can describe it and we’ll draw.
    1. Trying to get a better understanding of things throughout the day that we could design something for
    2. Again try to get stories from Elaine
  5. Let's brainstorm convenience gadgets (e.g. I would love a device that tells me when my water boils).
    1. Try to narrow down to a few ideas that could be pursued

Conclusion

Summary & Takeaways

The entire meeting was very fruitful, but the main takeaway was that Elaine views her service dog, Oak, as more dependable than human assistants, and therefore would love to incorporate Oak into the device in some way. We brainstormed several devices along those lines: a dog-activated door opener, a dog-activated RFID card scanner, a dog-activated Life Alert system (granted, this idea is a bit questionable for safety reasons), and finally an idea addressing what seemed to be one of Elaine's biggest concerns, a dog-activated blinds operator. Elaine explained that when someone leaves the blinds drawn and the room is too sunny, she can't do anything about it on her own because she would have to pull a heavy chain, which her range of motion and dexterity do not permit. So, inspired by Elaine's suggestion that "when in doubt, think of how a squirrel might solve your issue" (only in this case we swapped out squirrels for dogs), we quite like the idea of a dog-activated blinds operator.

Team Thoughts

The meeting went a bit off the rails in terms of sticking to our actual agenda, but we got lots of pertinent information from Elaine's comments. We definitely realized after the fact that there were things we should have clarified, like what exactly her blinds look like. A picture would have been great for figuring out specifics: Is it floor-to-ceiling? Is it slats that rotate, or more like a canvas that rolls up? Next time, I think we'll focus on sticking more closely to the agenda if the discussion starts to drift, although that wasn't too much of a problem for this interview. We also would like to ask for images, videos, and other specifics if we can get them.

Talking through our ideas the next day, some of them started to make less sense for the scope of this project. For example, the Life Alert system posed both technical and moral problems in the sense that it would be a lifeline device built by three inexperienced students, and an emergency-grade wireless connection would be difficult for us to implement. It was great to have the blinds idea stick out as the most doable assistive device, though. We are still open to pivot points, but are leaning strongly toward that one.

Firs (Team Amy) https://courses.ideate.cmu.edu/60-223/s2021/work/firs-team-amy/ Wed, 14 Apr 2021 07:55:39 +0000 https://courses.ideate.cmu.edu/60-223/s2021/work/?p=12922 Daniel, Nish, and Kevin are students at Carnegie Mellon University working on a project to build an assistive device for Amy. Amy has a spinal cord injury which leaves her mostly paralyzed from the shoulders down and depends on assistive devices such as her trackball, hospital bed, and wheelchair in her daily life. The purpose of this meeting was to introduce our team to Amy and become more familiar with her specific needs as well as brainstorm devices and functionality that could assist her in daily life.

Meeting Agenda

We discussed our meeting plan during class the day before our interview. The meeting outline agreed on is below. We thought it would be important to try to understand what Amy’s life looks like and how her disability affects her before she gave any ideas for the project so we could ideate while learning about her better.

Introduction

  • Introduce ourselves, allow Amy to introduce herself to us (5-7 minutes)
  • Ask her about hobbies/work (5 minutes)
  • Why was she interested in working with this class again? (2 minutes)

Explain scope of project

  • Limited in time/mechanical ability (3 minutes)

Ask about her life/needs (Remainder of time)

  • What her disability is and how it limits her
  • Ask her to run through her daily life/tasks she struggles with
    • Take note of where utility is lacking vs. could be improved
  • Assistive devices/tricks she currently uses
    • How can we model our solution after familiar solutions?
  • Ask her for any issues she came into the project wanting addressed
    • Ask follow-up questions to clarify any of her ideas
  • Discuss our ideas for devices thought of during the meeting
    • For each device, is it addressing something important?
    • Would it make another aspect more difficult? 

Meeting Summary and Takeaways

Unfortunately our meeting recording had an error so we do not have media from the meeting itself. 

To start the meeting, we introduced ourselves and asked Amy what she does for hobbies and work. She is interested in reading, shopping, decorating, writing, watching TV, and all sorts of art forms. She said that in the past she painted with a mouthstick. She previously worked as a stylist for a fashion/shopping website, but had a bad pressure sore that forced her into bedrest over the past 2 years. We found out that Amy has a spinal cord injury that left her paralyzed from the shoulders down, but still has a small amount of motor control in her hands/arms that allows her to use her computer and perform some daily tasks. Pressure sores are common in people with spinal cord injuries like Amy as they are unable to feel anything in certain areas of their body and struggle to shift their body weight to prevent them. Amy's wheelchair can tilt to shift her weight in order to help with her recovery and prevent future pressure sores. One of our initial ideas was to make a device to remind Amy to shift her weight, but we decided against pursuing it after hearing more about the assistive devices she currently uses. After finding out about her hobbies and a small amount about her spinal cord injury, we asked her to run us through her daily life. She cannot get up very fast and needs an attendant to help her out of bed using a lift device such as the one shown below.

Example bed lift from https://101mobility.com/products/patient-lifts/portable-patient-lifts/. This is similar to what Amy showed us on the Zoom call. 

 Every day she needs to complete range of motion exercises which help to stretch and strengthen her muscles to prevent them from contracting into the fetal position due to a lack of use. Her attendant leaves around 2, but throughout the day she likes to read, watch TV,  go on Facebook,  possibly exercise again, run errands, or go on a walk. She used to use a standing aid to assist her exercising before bed, but has been unable to recently due to her pressure sore. After speaking about her routine, we asked about devices and tricks she uses. The two standout things were how she interacts with her computer and the environmental control unit she had in her previous condo. She interacts with her computer by using a trackball that she is able to control with the limited mobility in her hands as well as voice recognition software as a replacement for typing.

Trackball similar to the one Amy uses https://www.amazon.com/Kensington-Expert-Wireless-Trackball-K72359WW/dp/B01936N73I

The trackball was the most important find of the meeting to us, as it gives us an idea of how we could make a device that Amy will be able to interact with by basing our input design off of her current device. The environmental control unit in her old condo was an older device, but she informed us that current models are able to control lights, thermostats, the TV, and certain types of hospital beds like the one she has. From this discussion, we found that she currently needs someone else to control the TV for her as well as adjust the angle of her bed. We saw two distinct ideas to help her here, and based two of our ideas for the ideation portion of this project on these topics.

Having already been a part of this project last semester, Amy also came with an idea of an assistive device she would like. She likes to sleep with the blanket completely over her head and chest, but has difficulty moving the blanket up and down since her arms have limited mobility and her hands are in an essentially fixed position near her hips. She wanted a device that would help move the blanket up and down by about a foot in each direction. While we obviously want to try and fulfill her wishes, we had some safety concerns about the project, so we tried to gather as much information as we could about how she slept and the typical blankets that she uses, particularly about the weight and material. One interesting thing to note is that when we asked if her bed had railings, Amy commented that she hates them but has one on the left side, and would be willing to add in the right railing if that meant she could move the blanket up and down. Although it seems like an important need for Amy, it may be outside the scope of our project at this current moment. It would be interesting, however, to design a device that could address the safety concerns and execute moving a soft material, even if it's not for this project.

 

Amy's main type of blanket that she showed on the Zoom call. This would be hard to attach a device to in order to move it in the first place, and it is also a heavy blanket, which adds to the difficulty considering our current tools. She also stated she sometimes uses a duvet, which might have been easier to work with if we had proceeded with this idea.

Amy also mentioned the "Environmental Control Unit." Essentially, this was a device that enabled her to switch her lights on and off, control the temperature, and turn other devices around the house on and off from a central system. However, since moving back to her parents' home, no unit has been hooked up to the house, and she has lost many of the functionalities that were previously available to her. There is a wide range of possibilities that we could address through a physical device when looking at what the Environmental Control Unit used to allow her to do.

On another note, Amy also talked vividly about her past hobbies and things she wishes to go back to. For example, she used to work at StitchFix and enjoyed crafts, and wants to get back into writing. When we asked about the tools she uses to transcribe words and partake in these hobbies, we learned that the applications she uses aren't very compatible with surfing the web. Addressing this issue would require more of a software solution, but there could potentially be an assistive device in this realm as well. However, our group decided to move forward with finding an assistive device for her daily life rather than her hobbies.

 

Our notes from asking about Amy's daily routine. Her attendant helps her with exercises since she doesn't have the strength to do them all herself, but keeping her muscles mobile aids in keeping them flexible enough to move when getting dressed or lifted.

 

Our Thoughts on the Meeting

  • Reflecting on the interview as a team, we consolidated the ideas we had discussed with Amy into the three we thought were most viable:
    • The Blanket Adjustment System: This was the idea that Amy seemed to have the greatest need for during the interview. However, we also recognized that although it could be very helpful, it would be mechanically hard to implement and would pose a major safety hazard. In our discussions of different methods to implement this idea, we had to balance efficacy against safety. While we came up with a few ideas for an emergency stop function (an analog button or voice activation), we ultimately decided there was no guarantee that either method could be truly foolproof. If we were to implement a solution, we would have to put in multiple types of emergency stops, and acknowledge that having the emergency stop go off too often is better than having it not go off at all. However, in a short project with relatively inexperienced students, there's always a possibility that it malfunctions in a time of need. In a problem where a mistake in design or implementation could mean loss of life, the risk might be higher than we could accept. However, at the same time, if implemented correctly, the benefit to Amy's comfort would be a huge boost to her quality of life.
    • An Environmental Control Unit (TV): The second idea we discussed was a controller for Amy's TV that was optimized for her specific needs and use. Using her trackball mouse as a starting point, we started brainstorming different ways of providing a controller that was both easy to use and portable. Building off of the trackball idea, one idea we talked about was a joystick-like or scroll-wheel-based mechanism. The main idea was to find a medium that required less fine motor skill but still allowed a high level of control, as not being able to accurately control the volume or channel selection can prove to be frustrating. Sketching out both the joystick and scroll wheel ideas, we also explored making the form more ergonomic to fit her use, whether that means playing with the grip of the joystick or providing a wrist rest that takes the burden off of her arms and shoulders. Further progress for this idea would require working closely with Amy to find out what the limits of her mobility and comfort are and how we could make something she finds easy to move and use.
    • A Device Connected to the Lift on her bed: The final idea we had was to create a method of interfacing with the lift in her bed. Currently, the bed is able to raise her head and upper body up and down; however, Amy is unable to interface with this system due to her limited mobility. The primary challenge could come from creating a system that she is able to control at a granular level and that would interface with the existing system. The primary idea we discussed was potentially including a voice control element that would allow the interaction to be hands-free and easier to use. An important factor we had to consider, however, was the ability for the original lift to still be operational through the existing interface, without our new system preventing Amy's helpers from accessing it.
  • Other thoughts: 
    • As stated previously, our team seemed to collectively agree on addressing aspects of the Environmental Control Unit Amy mentioned, rather than going the hobbies route. Though we never really discussed it explicitly, this route involves fewer open-ended questions and more thought about addressing Amy's specific restrictions. Initially, one group member thought that certain ideas, such as a remote to turn the TV on and off, might be too simple, but after we started the ideation process, the physical restrictions imposed on the design proved to be significant and posed an interesting challenge to overcome.
    • Another one of Amy's problems is that in order to even use her laptop, her attendant must help Amy turn on her headphones, set the trackball at the right location for Amy to easily use, and set the laptop in front of Amy, among other intermediate steps. We considered making a device to allow Amy to set up her laptop by herself, but physically moving items (especially expensive, heavy ones) would prove difficult, and each step of the process to open her laptop would essentially require its own device. Just thinking about how to execute this one simple act that we do mindlessly made us further understand Amy's limitations, and it emphasized the importance/utility of making something that may seem simple initially, such as a control for the lights or an adapted TV remote.

The Maples Interview Documentation https://courses.ideate.cmu.edu/60-223/s2021/work/the-maples-interview-documentation/ Wed, 14 Apr 2021 05:25:46 +0000 https://courses.ideate.cmu.edu/60-223/s2021/work/?p=13282 Introduction

The goal for this project is to create an assistive device for a person living with a disability that can hopefully make their daily life easier. James, Nicole, and Shuyu (The Maples!) interviewed Jen on April 4th, 2021, in order to identify a task they could improve for her. Jen has limited use of her hands and arms, which can make certain tasks like putting on makeup extremely difficult. However, Jen can still use her mouth to perform certain tasks. After conducting an hour-long interview, The Maples set out to create a device that could improve an aspect of Jen's daily life.

Meeting Agenda

  • Ice Breaker:
    • How are you doing? (If you are comfortable sharing)
    • Introductions (names, brief introduction)
      • Us → Personal value, interest, or relevant factoid to make personal connection to client and group
        • What you made for project 2
        • How long have you lived in Pittsburgh and what do you like about living here?
      • Client → interest in participating in this activity
        • Why they wanted to participate
        • Interests in the project
  • Clarification Expectation and Project Goals:
    • What we are doing
      • Making prototype devices
      • Solving real problems with relatively complex solutions
      • Participating in the design and documentation process over 6 weeks
    • What we aren’t trying to do
      • Create polished final product
      • Making commercially available products
      • Discover something revolutionary in the world of assistive devices
    • Schedule
      • April 21 prototype critique
      • May 5 final presentation
  • Understanding Needs of Client:
    • Are there any daily activities that you have a hard time performing or would like to be able to perform more easily?
      • Could you demonstrate one or multiple of these tasks? (completely fine if not but it would help us get a better understanding of possible solutions)
    • Is there anything that you maybe used to enjoy doing but cannot anymore?
    • What does your daily routine look like?
      • Could you demonstrate? If not, could you maybe draw a cartoon of it?
    • Why is this solution important to you?
  • Conclusion:
    • Thank them for their time
    • Get client contact information
    • Iterate schedule of process again just to be clear

Meeting Summary and Major Takeaways

Sketch made during the interview as Jen described what kind of device she needed.

Over the course of the meeting we discussed different types of actions that Jen would like to be able to do better. Specifically, we talked about her difficulty with activities like food prep and self-care. From both of these topics, we learned that Jen has the most trouble with actions meant for hands. She currently has no real method of applying things like make-up to her face without a fair amount of effort, if at all. In addition, her current method for cutting vegetables and other foods involves holding a knife in her mouth and leaning over to cut the vegetables with head motions. While she is able to provide enough force to cut through most vegetables, she cannot cut and hold the vegetable at the same time. Despite the relative success that Jen might have at modifying tasks meant for hands, she expressed much interest in a device that could make these modifications much more efficient. From the conversation topics, it was also clear that helping with food prep is very important to her as it is the only part of the cooking process in which she can safely be involved.

Our Thoughts

The meeting went pretty smoothly. We got along well, and our discussions got pretty deep. When the discussions stagnated, we used the agenda more like a reference, effectively keeping the conversation going. Something that we should've delved into further during our conversation was the details of Jen's range of motion. We had assumed after our discussion that she could use at least one hand, until we watched the video she sent, which showed that she instead uses her mouth to do tasks. We wish we had asked more questions about how much dexterity she has when maneuvering things with her mouth.
