Intro to Physical Computing: Student Work, Fall 2020
https://courses.ideate.cmu.edu/60-223/f2020/work

Art Enabler by Team Amy: Final Project Documentation
https://courses.ideate.cmu.edu/60-223/f2020/work/art-enabler-by-team-amy-final-project-documentation/
Fri, 18 Dec 2020

INTRODUCTION


For the final project in 60-223: Introduction to Physical Computing, each group was tasked with developing a project to assist the life of a client living with a disability. Our client, Amy Shannon, is severely paralyzed in numerous parts of her body, including her hands. An avid creative at heart, Amy expressed to us during our interviews that she would love to be able to participate in creative activities, including tasks that require extensive manual dexterity like drawing and painting. In the past, she has tried a painting device called a mouth stick, but found it rather uncomfortable and inhibiting. Our group settled on developing a system that could provide an experience as close to actual hand painting as possible: a painting rig plus a control mechanism that capitalizes on our client's available ranges of motion. Given the restrictions of remote learning, we were unable to produce a final physical product, but this would certainly be an exciting next step!

WHAT WE BUILT

Our Art Enabler project allows the user to paint using a pair of wireless controllers and a mechanical painting rig that can be mounted onto a painting easel. To best explain how this painting experience is possible, we will explore each of the three Art Enabler components: the controllers, the painting rig, and the interface code.

THE CONTROLLERS 
There are two wireless controllers within our project system that act as the primary user interface: the Pressure Controller and the Primary Controller. The Pressure Controller has a simple handle attached to a slider. Moving this slider controls the Z-axis movement of the drawing utensil, bringing it closer to or further from the canvas to give the user a wider range of mark-making. The Primary Controller has a joystick that can be shifted to move the drawing utensil around the canvas. It also features a clamp that attaches the controller to an average wheelchair armrest and a telescoping neck that lets the user raise or lower the controller to better accommodate their arm.
Both controllers also feature a power switch that the user can toggle to turn the device on or off to save battery life!

THE PAINTING RIG
The mechanical painting rig can be attached to the user's painting easel and holds the desired drawing utensil. It has two belts that control the X- and Y-axis position of the utensil, allowing the user to draw whatever they would like! When they move the controller, the motors on the rig move the utensil holder up, down, left, and right accordingly. The utensil holder itself also features a spring whose pressure changes based on the input of the Pressure Controller, allowing the drawing utensil to move closer to or further from the canvas.

INTERFACE CODE
This code is what makes the magic happen! Within each of the physical components there is an Arduino and a radio module, which allows all of them to communicate with each other to send and receive inputs. The code takes the positions of the joystick and the slider and sends them to the rig, which adjusts its position accordingly.

By combining all three of these components, the user can paint or draw without needing to pick up a paintbrush or pen themselves!

A quick animation to show how moving the joystick on the right controller moves the painting rig

All three components of the Art Enabler

The painting rig with two belts to control the X and Y-axis movement of the drawing utensil

How the painting rig might look when in use on an easel

The two controllers: On the left, the Pressure Controller can manipulate the Z-axis position. On the right, the Primary Controller can change the X- and Y-axis position. The Primary Controller can clamp onto a standard wheelchair arm.

As we were unable to create this project physically, we cannot show a video of the interactions that take place within the final project scope, so we will describe the overall interactions here to better explain the concept in context. The system requires some assisted setup before it can be operated, but once it is set up, Amy can use it almost totally independently from there on out. To complete this setup, someone (most likely Amy's aide or a parent) must attach the painting rig to the intended drawing easel, place the drawing utensil in the holder, and place the controllers by Amy's wheelchair, or on it, using the integrated clamps. After the setup is complete and Amy decides that she would like to draw, she can position herself in front of the easel (however close or far away she would like) and turn on both controllers by flipping their power switches. To start painting, she pushes the Pressure Controller's slider until the utensil touches the canvas at the desired pressure. Then she can move the joystick to move the drawing utensil and begin to make her marks.

The movement of the rig is mapped to the velocity of the joystick's movement (a common practice in many video games, so hopefully a fairly familiar interaction): if Amy moves the joystick a little to the left, for example, the rig will move slowly towards the left until the joystick returns to center or changes direction. If she moves the joystick far to the left, the rig will instead move quickly in that direction. The joystick has a spring attached so that if she lets go of it at any time, it snaps back to center and stops the rig from moving further. When she is happy with a stroke and wants to make a new mark, Amy can use the Pressure Controller to lift the drawing utensil off of the canvas and then the Primary Controller to move it to a new position.
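
To make the velocity mapping concrete, here is a minimal sketch of the joystick-to-velocity logic described above, assuming a two-axis analog joystick on pins A0 and A1 (hypothetical wiring) and a normalized 0-to-1 canvas position; in the real system these positions would be transmitted over radio rather than used directly.

const int JOY_X = A0;          // assumed joystick X axis
const int JOY_Y = A1;          // assumed joystick Y axis
const float DEADZONE = 0.05;   // ignore tiny offsets so the rig holds still
const float MAX_SPEED = 0.02;  // canvas units per loop at full deflection

float posX = 0.5, posY = 0.5;  // normalized canvas position, start centered

// Convert a raw 0-1023 reading into a -1 to 1 offset from center
float axisOffset(int pin) {
  float v = (analogRead(pin) - 511.5) / 511.5;
  return (abs(v) < DEADZONE) ? 0.0 : v;
}

void setup() {}

void loop() {
  // Integrate velocity into position, clamped to the canvas edges
  posX = constrain(posX + MAX_SPEED * axisOffset(JOY_X), 0.0, 1.0);
  posY = constrain(posY + MAX_SPEED * axisOffset(JOY_Y), 0.0, 1.0);
  // posX / posY would then be sent to the rig (not shown here)
  delay(20);
}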

This process of lifting and moving with the two controllers should allow Amy to draw what she would like, when she would like, after a little practice to familiarize herself with the controls and movements. Hopefully, after creating a few practice pieces, these interactions will become more fluid and natural, allowing for greater artistic expression without any need for a mouth stick!

HOW WE GOT HERE

While we came up with the idea for the Art Enabler fairly quickly after our first interview with Amy, its details and interactions evolved tremendously as we continued to develop the project. At the start, we were aiming for a single arm-mounted controller: Amy would move her arm, and that motion would be the input controlling a mechanical rig able to move along the X- and Y-axes.

Initial concept sketch

An example of our first interaction concept

However, after talking to Amy again, we realized that it would be tiring for her to hold her arm up long enough to create a painting. Furthermore, as we continued to flesh out our concept and began to explore different types of sensors that could let a user input wirelessly control the painting rig, we realized that an arm-mounted controller wouldn't provide the smooth drawing experience we were aiming for (that is, even if we could get such a device to work using IR sensors). Amy also expressed concern about being able to control the pressure of her mark-making the way she could with a mouth stick.
In light of these new insights, we pivoted our concept, deciding we needed to rethink the controllers and also the rig’s range of movement.

Sketch page brainstorming new types of controllers and interactions

Ideation of the prototype and detailed planning for each mechanical section

For the controllers, we decided to use potentiometer values to control the axis positions. Accordingly, we had to devise our own version of a joystick, one that would resemble a paintbrush, and think about how to design the form of the controllers to best suit Amy's range of motion (we would make changes in this area if we were to continue this project, which is touched upon in the Reflections and Lessons Learned section of this documentation page). To make the wireless communication work, we employed three Arduinos, one in each project component, and three RF transceivers that let the Arduinos "talk" to each other.
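
As a rough sketch of the transmitting side of one of those radio links, here is a minimal example using the VirtualWire library that our preliminary code references; the data pin and packet layout are assumptions, not our finalized design.

#include <VirtualWire.h>

// Positions packed into one packet, sent controller -> rig
struct Poses {
  float x;
  float y;
  float z;
};

void setup() {
  vw_set_tx_pin(2);  // data pin of the 433 MHz transmitter (assumed)
  vw_setup(2000);    // transmission rate in bits per second
}

void loop() {
  Poses p = {0.5, 0.5, 0.0};        // would come from the joystick and slider
  vw_send((byte *) &p, sizeof(p));  // broadcast the whole struct
  vw_wait_tx();                     // block until the packet is fully sent
  delay(50);
}

On the rig's Arduino, the matching VirtualWire calls would be vw_rx_start() and vw_get_message() to unpack the same struct.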

Working on the controller’s CAD to ensure that all components could fit into the exterior frame and that the joystick could move the two slide potentiometers

Working on the controller’s CAD to ensure that all components could fit into the exterior frame and that the joystick could move the two slide potentiometers – clear view of the controller

For the mechanical rig, we decided that we needed to accommodate some type of pressure sensitivity. As we were still working through the problem of how to lift the drawing utensil off of and onto the canvas to better replicate a typical drawing experience (which isn't only one continuous line), we decided to integrate these two issues into one solution: the Pressure Controller and an adjustable spring within the rig's drawing utensil holder. These allow the user to move the utensil as close to or as far away from the canvas as they would like, letting them "lift" the utensil off the canvas or draw fainter strokes.

The rig’s new spring

We also worked closely with our professor, Zach, to iterate on the mechanical rig's CAD to ensure smooth, independent movement along each axis. With our first rig design, which featured only one belt, the rig would move more smoothly and quickly along one axis than the other when Amy moved the controller. After talking with Zach about this issue, we pivoted to our final two-belt design, with each belt driven by its own motor. One belt moves the rig along two threaded rods to go left and right; the other drives the up-and-down motion of the device.
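
Here is a sketch of how the two belt motors might be driven, assuming stepper drivers on hypothetical STEP/DIR pins and the widely used AccelStepper library; the speed values are placeholders for what would be decoded from the radio packets.

#include <AccelStepper.h>

AccelStepper beltX(AccelStepper::DRIVER, 3, 4); // STEP = 3, DIR = 4 (assumed)
AccelStepper beltY(AccelStepper::DRIVER, 5, 6); // STEP = 5, DIR = 6 (assumed)

void setup() {
  beltX.setMaxSpeed(800.0); // steps per second; tuning values assumed
  beltY.setMaxSpeed(800.0);
}

void loop() {
  // In the real system these speeds would come from the controllers
  beltX.setSpeed(200.0);   // constant-rate stepping; the sign sets direction
  beltY.setSpeed(-100.0);
  beltX.runSpeed();        // step each motor if its interval has elapsed
  beltY.runSpeed();
}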

First Rig Design

Final Rig Design

Once we figured out our components and sensors, it was time to write the code to bring it all together. However, we discovered that it is hard to write code without being able to physically test its viability. Using TinkerCAD, we were able to make some interaction mock-ups to test our ideas, but in the end this part of our project remains the most theoretical.

A document detailing the different electronic components and what each should do to ensure proper alignment with the physical components and the coded components

REFLECTIONS and LESSONS LEARNED

After the final presentation, we received a lot of positive feedback. One commenter said this would be a great idea and the most marketable product among the presented projects. However, to become a market-ready product, the rig would need work: it seemed a little unstable compared to the controllers, likely because there is relatively little supporting structure connecting the rig to the easel. If we had a little more time with this final project, we would have designed and built out that supporting structure.

Also, working remotely created a lot of communication issues among group members. We mostly used Zoom and text messages to communicate about the project's progress, and we used class time to divide the work and comment on each other's contributions. However, we did not combine all of our work until the day before the actual presentation, and it sometimes took several hours to receive a response to a text message. Fortunately, we had enough time to discuss our individual pieces and share ideas, but communication remained our biggest team challenge.

Lastly, working with Amy gave us more insight into different types of clients. Through the interviews, we learned what kinds of hardships paralyzed people face in daily life. For instance, Amy has a hard time holding some items due to her spinal injuries. To draw pictures, she used a special mouth stick to hold brushes, but this tired her jaw and limited her drawing hobby. Through this project, we have suggested some possibilities that might help resolve this uncomfortable situation.

TECHNICAL DETAILS

Code

/* Art Enabler Prelim Code */

// Preface to the code shown here:
/* First and foremost, this code is intended to drive a system
 * that can't realistically be assembled in TinkerCAD.
 * As a result, there is a lot of 'hacking' going on
 * to at least produce the requisite result for the project.
 *
 * The reason that it can't be assembled is rather simple -
 * none of the required components exist in TinkerCAD,
 * and the closest available replacements aren't nearly wieldy
 * enough in the TinkerCAD environment.
 * 
 * All single lines of code in multiline style comments are
 * for operations not supported in TinkerCAD
 *
 * | PINS |
 * POSPIN_X : Reads sensory data to set internal X position
 * POSPIN_Y : Reads sensory data to set internal Y position
 * POSPIN_Z : Reads sensory data to set internal Z position
 *
 * PX : Workaround pin to directly send X pos info to motor
 * PY : Workaround pin to directly send Y pos info to motor
 * PZ : Workaround pin to directly send Z pos info to motor
 */

// Radio control code adapted from:
// https://create.arduino.cc/projecthub/MisterBotBreak/how-to-communicate-using-433mhz-modules-9c20ed

// Smoothing code adapted from:
// https://courses.ideate.cmu.edu/16-223/f2020/text/code/FilterDemos.html

// Libraries
/* #include "VirtualWire.h" */

// Pin assignment
const int POSPIN_X = A0;
const int POSPIN_Y = A1;
const int POSPIN_Z = A2;

/* const int TRANPIN = 2; */

// The following pins are used for sending data to the "stepper" motor
// as a TCAD workaround
const int PX = A3;
const int PY = A4;
const int PZ = A5;

// Global settings
/* In a world where this is being built, it would be worthwhile
 * to add some additional hardware components to adjust settings
 */
static float resolution = 100; // Mapping resolution; works around map()'s integer-only arguments

static float scale_z = .1; // Scales motion off canvas
static float scale_x = .001; // Scales planar motion
static float scale_y = .001; 

static float min_x = 0.0;
static float max_x = 1.0;

static float min_y = 0.0;
static float max_y = 1.0;

static float min_z = 0.0;
static float max_z = 1.0;

static float pos_x;
static float pos_y;
static float pos_z = 0; // Corresponds to off canvas

// These variables were shoddily taped on after realizing TinkerCAD
// is insufficient:
static float prev_x = 0;
static float prev_y = 0;
static float prev_z = 0;

void setup() 
{
  // Default to center
  pos_x = max_x / 2;
  pos_y = max_y / 2;
  
  pinMode(A0, INPUT);
  pinMode(A1, INPUT);
  pinMode(A2, INPUT);

  pinMode(2, OUTPUT);

  // Setup radio transmission
  /* vw_setup(2000); */
  
  // Patch for TCAD
  pinMode(PX, OUTPUT);
  pinMode(PY, OUTPUT);
  pinMode(PZ, OUTPUT);
}


// Smooths input based on a previous value
float smoothing(float input, float prev, float coeff=0.1) 
{
  float difference = input - prev;  // compute the error
  prev += coeff * difference;       // apply a constant coefficient to move the smoothed value toward the input
  return prev;
}


// Used to clamp ranges
float clamp(float input, float rangemin, float rangemax) {
  float value = max(input, rangemin);
  value = min(value, rangemax);
  return value;
}


// Obtains and maps a given pin, assuming the input is a position
float map_pin_read(const int PIN) 
{
  int reading = analogRead(PIN);
  float map_val = map(reading, 0, 1023, (int) -resolution, (int) resolution) / resolution;
  return map_val;
}


float update_pos(float pos_v, float min_v, float max_v, float scale_v, const int POSPIN_V) 
{
  float dv = scale_v * map_pin_read(POSPIN_V); // Get pin input and map to a val
  pos_v = smoothing(pos_v + dv, pos_v);
  pos_v = clamp(pos_v, min_v, max_v);
  return pos_v;
}


void update_poses() 
{
  pos_x = update_pos(pos_x, min_x, max_x, scale_x, POSPIN_X);
  pos_y = update_pos(pos_y, min_y, max_y, scale_y, POSPIN_Y);
  pos_z = update_pos(pos_z, min_z, max_z, scale_z, POSPIN_Z);
}


void transmit_pos() {
  float poses[3] = {pos_x, pos_y, pos_z};
  /* vw_send((byte *) poses, 3 * sizeof(float)); */
  /* vw_wait_tx(); */
}

// In all honesty this function could absolutely be broken,
// but it exists only as a compatibility measure for TinkerCAD
// after having written other code intended for a real system.
int SPEED_COMP(float old_val, float new_val, float v_max) {
  float diff = new_val - old_val;
  // Scale the float difference directly to analogWrite's 0-255 range;
  // the original map() call truncated the tiny float diff to zero
  long scaled = (long) (diff / v_max * 255.0);
  return (int) constrain(scaled, 0L, 255L);
}

// Compatibility function to circumvent the lack of RF receivers
void TCAD_COMP() {
  // Calculate difference between new val and old for speed calc
  int X_SPD = SPEED_COMP(prev_x, pos_x, scale_x);
  int Y_SPD = SPEED_COMP(prev_y, pos_y, scale_y);
  int Z_SPD = SPEED_COMP(prev_z, pos_z, scale_z);
  
  // Store previous values
  prev_x = pos_x;
  prev_y = pos_y;
  prev_z = pos_z;
  
  analogWrite(PX, X_SPD);
  analogWrite(PY, Y_SPD);
  analogWrite(PZ, Z_SPD);
}

void loop() {
  delay(20);
  update_poses();
  /* transmit_pos(); */
  
  /* BELOW INCLUDED FOR TCAD COMPATIBILITY */
  TCAD_COMP();
}

 

Schematic and design files

 

Controller CAD Google Drive Link

https://drive.google.com/drive/folders/19qEuezqMGOownMQQUnk6_TxUlWhi9VCs?usp=sharing

Rig CAD Google Drive Link

https://drive.google.com/drive/folders/1j7hFiTiPuqBnq-hJt8R1rr6JDi5l3Oel?usp=sharing

Danger Sensor by Brenda Team 2: Final Documentation
https://courses.ideate.cmu.edu/60-223/f2020/work/brenda-team-2-danger-sensor/
Fri, 18 Dec 2020

1. Introduction

For the final project of our Physical Computing class, we were tasked with building, in partner groups, an assistive device catered specifically to one person with disabilities. Our group was partnered with a client named Brenda, and upon learning the circumstances of our assignment, we interviewed her to get a better sense of her disabilities and to brainstorm potential assistive technologies that could help her in her daily life.

More details about our interview results can be found here: https://courses.ideate.cmu.edu/60-223/f2020/work/interview-with-brenda/

After ideating, revising, and finally settling on an idea, we individually created prototypes (due to the remote nature of our class) to test various functionalities of the concept; more details can be found here: https://courses.ideate.cmu.edu/60-223/f2020/work/team-brenda-2-prototype-documentation/

 

2. What We Built

Our final assistive device is a danger sensor, akin to the backup cameras that cars have, meant to help Brenda detect obstacles or fallen objects behind her that are out of sight because of her chair. More specifically, our device uses distance sensors mounted behind Brenda's assistive chair to detect objects behind her and alerts her through vibration and a light visualizer that convey the position and distance of whatever is sensed.

IR Option (perspective)

IR Option (Side)

IR Option (Top)

Ultrasonic Option (Perspective)

Ultrasonic Option (Side)

Ultrasonic Option (Top)

Communication Module (Vibration Motor + Three LEDs)

Left: Mounting for IR sensors Right: Mounting for US sensors

 

3. How We Got Here

Our project development can be broken down into five parts following the initial interview with Brenda: Initial Design Idea, Pivoting, Prototype, Research, and Design Development.

  1. Initial Design Idea

We held the first interview with Brenda to gauge what kinds of trouble she has in her daily life and to brainstorm ideas for solutions. She presented us with a multitude of problems that we could try to address. The only problem was that most of them seemed addressable only through mechanical means rather than electrical. (Full interview documentation can be found here)

Out of the several different ideas that she gave us, we decided to address her problem with her current foot holders. These footplates are hard for her to reach down to, and she is also unable to fold them up herself. She always needs an assistant to do so, which made her wish the process could be automated so that she wouldn't have to lift her feet every time.

As an initial design, we decided to develop an automated foot-holder system using an IR remote control and a telescoping mechanism that would expand and contract at her wish.

However, we soon realized that this solution would require a mechanical system that was too hard for us to design, and we would have had to prove that it could hold the weight of her feet. These things were well beyond the scope of our capabilities, so we made the choice to abandon this idea and pursue a different one.

Sketch describing our initial design idea – an automatic foot holder!

Telescoping Mechanism (front, folded)

Telescoping Mechanism (side, folded)

Telescoping Mechanism (side, unfolded)

  2. Pivoting

After realizing that the initial idea wouldn't work, we decided to hold a second interview with Brenda. Fortunately, after speaking with her and letting her know that we needed a solution built on electrical components rather than mechanical ones, we got quite a few new ideas that fit the scope of this project. (Full second interview documentation can be found here) Out of the options we had, we decided to address her difficulty seeing behind her, caused by her fixed headrest and the immobility of the left side of her body; she had simply learned to be careful when looking back or moving backwards. We thought of making something like a car's backup camera that would let her know what's behind her, but instead of a camera, we decided to use a set of ultrasonic sensors to detect the objects behind her. As an output system, we decided on a set of LEDs and a speaker to tell the user how close those objects are.

  3. Prototype

Following the general description in our initial design sketch, we developed the prototype illustrated below. The ultrasonic sensors would be attached to the back of the wheelchair. The speaker-plus-visualizer module (made of three LEDs with a diffusing material on top of them) would be mounted at the end of the armrest and would communicate the proximity of objects to the user. The visualizer would be in the shape of a semicircle divided into thirds, with each third representing one ultrasonic sensor. Each third would have three states: white (no objects close by), dim red/pink (objects somewhat close), and red (objects very close). The speaker would have a corresponding output: no sound, slow beeping, and fast, high-pitched beeping, respectively.
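
A minimal sketch of that three-state speaker logic, with assumed distance thresholds (in centimeters) and an assumed speaker pin; the LED states would be handled the same way, keyed off the same thresholds.

const int SPEAKER_PIN = 8;            // assumed piezo/speaker pin
const int FAR_CM = 100, NEAR_CM = 40; // hypothetical state thresholds

void respondTo(float distanceCm) {
  if (distanceCm > FAR_CM) {
    noTone(SPEAKER_PIN);          // state 1: nothing close by, stay quiet
  } else if (distanceCm > NEAR_CM) {
    tone(SPEAKER_PIN, 440, 100);  // state 2: slow, lower-pitched beeping
    delay(900);
  } else {
    tone(SPEAKER_PIN, 2000, 50);  // state 3: fast, high-pitched beeping
    delay(100);
  }
}

void setup() {}

void loop() {
  respondTo(120.0); // would be fed by the ultrasonic readings
}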

Division of Work for Prototype

Close-up of the single ultrasonic sensor I had, modeling the part of the device meant to hold the arrangement of three ultrasonic sensors.

Close-up of the LED arrangement and speaker, modeling the alert/visualization system that responds to the ultrasonic sensor feedback.

Overall scale of my prototype with all the essential electrical components mounted together.

Visualizer display

Visualizer (deactivated)

Placement of the sensor

Visualizer (activated)

After presenting our prototype and receiving feedback, we outlined the general direction we wanted to take for the next step of the project. We also received more photos from Brenda showing her wheelchair's blind spots, which our new device could address. The following are the changes we decided to implement in our design moving forward.

  • Change to vibration instead of auditory beeping feedback (see the sketch after this list)
  • Lower detection range of ultrasonic sensors to detect pets & fallen objects
  • Keep all ultrasonic sensors active; the visualizer will show which direction detects the closest object, with light intensity corresponding to distance
  • Keep in mind that the armrest is prone to damage; mount the module inside a pocket attached to the armrest instead
    • Will get image of armrest area and blind spot on back of chair
  • Dogs might be able to hear the ultrasonic ping and not like it – find a higher-frequency ultrasonic device.
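
Here is the vibration sketch referenced in the first bullet: a guess at how the pancake motor's intensity could scale with proximity, assuming a PWM-capable pin and hypothetical thresholds (the final code below uses fixed intensity steps instead).

const int MOTOR_PIN = 9; // assumed PWM-capable pin for the pancake motor

void vibrateFor(float distanceCm) {
  if (distanceCm > 80) {
    analogWrite(MOTOR_PIN, 0);  // nothing nearby: motor off
  } else {
    // 80 cm -> gentle buzz, 0 cm -> full strength
    int strength = map((int) distanceCm, 80, 0, 60, 255);
    analogWrite(MOTOR_PIN, constrain(strength, 0, 255));
  }
}

void setup() {}

void loop() {
  vibrateFor(50.0); // would be fed by the distance sensors
}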
  4. Research

The Research and Design Development parts were done simultaneously, following the Gantt chart we made after finishing the prototype presentation.

Gantt Chart

The research was primarily about finding a component to replace the 40 kHz ultrasonic sensor, since dogs can hear ultrasound up to roughly 60 kHz and might be irritated by it. We were able to find a good alternative for this project: IR sensors! They use infrared light to measure proximity to objects. However, IR sensors have a much smaller cone of range than ultrasonic sensors, as illustrated in the diagram below.

Range for US Sensor vs IR Sensor

(The full document on alternative sensors can be found here)

However, we figured that the smaller cone of range could be compensated for by putting several of these sensors in a row.
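
Below is a sketch of that row-of-sensors idea, assuming three Sharp GP2Y0A21-style analog IR rangers on A0 through A2 and a commonly cited empirical voltage-to-distance fit; the exact constants vary per sensor and should come from its datasheet.

const int NUM_IR = 3;
const int IR_PINS[NUM_IR] = {A0, A1, A2}; // hypothetical wiring

// Approximate distance in cm for a GP2Y0A21; empirical fit, an assumption
float irToCm(int raw) {
  if (raw < 10) return 999.0; // guard against near-zero readings
  float volts = raw * 5.0 / 1023.0;
  return 27.86 * pow(volts, -1.15);
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  float closest = 1000.0;
  for (int i = 0; i < NUM_IR; i++) {
    float d = irToCm(analogRead(IR_PINS[i]));
    if (d < closest) closest = d; // keep the nearest obstacle in the row
  }
  Serial.println(closest);
  delay(50);
}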

  5. Design Development

In the Design Development part, we made changes to the design appropriate for our new concept – a detecting system that would sense pets or fallen objects that may get stuck in the space below her wheelchair or would be extremely difficult for Brenda to see. 

Instead of placing the sensors in the middle of the back of her wheelchair, we decided to place them lower. Following the feedback that the speaker would irritate both the user and animal friends, we decided to use a pancake vibration motor instead of the speaker. We kept the three-LED module to preserve a bigger range of communication, but instead of each LED representing one sensor, we changed the interaction so that each LED represents a level of proximity. In the end, we decided to include options for both the ultrasonic sensor and the infrared sensor, as you can see in the renders. The ultrasonic option has the advantage of scanning a wider range (180 degrees with no blind spot), while the IR option can sense not just a fallen object but also a grade change and alert the user to it.

 

4. Conclusion and Lessons Learned

After the final presentation, we had a wonderful experience receiving insightful and thoughtful feedback from everyone, many of whom praised our idea and process, making us overjoyed to have our strenuous efforts recognized. In two separate comments, our extensive research process was complimented, being called "really thoughtful and well-researched" and "loved seeing all the considerations". Concerning the more specific feedback about our design process, one comment on our auditory-versus-tactile feedback considerations said they "really enjoy the super high pitched sound"; we think the writer may have slightly misunderstood our presentation and taken the speaker to be our final decision, when we actually settled on the vibration motor. In another comment, someone noted of our sensor research that "it was really interesting to think about the pets' state," which definitely made us feel all our efforts were worth it.

On the topic of process, due to the nature of the semester our project was done entirely through remote collaboration, which was an interesting but challenging experience. What worked most effectively for us were our Zoom collaboration sessions, where we stayed on a call together and finished our allocated sections of assignments. While these were very productive, continually arranging a meeting time purely over text was frustrating, and in the end it pushed us toward splitting up work and finishing on our own time, which slowed productivity but still got the work done. Towards the end, we had fewer and fewer collaboration sessions due to the busyness of our other classes, so as a future consideration, we could have put in more effort to maintain our prime productivity rate.

Not only was collaborating remotely somewhat new, it was also our first time working with a disabled person. The most interesting aspect was seeing just how different our lives and mindsets were: our partner, Brenda, has severely limited mobility and thus a completely different experience of even trivial everyday tasks. It was noticeable how her reliance on assistive technologies shaped which things she considered problems, problems we would never have thought of from our perspective, giving us a lot of insight into what life looks like for people in circumstances different from ours. This experience, and the final project as a whole, has broadened our horizons and taught us to look at things in a different way while practicing thinking from other viewpoints.

 

5. Technical Details

TinkerCAD Screenshot

Schematics

Code:

/*
 * Final Project
 * Arleen Liu (arleenl), Claire Koh (juyeonk)
 * 2 hours
 * 
 * Collaboration: None
 * 
 * Challenge: Figuring out best and most accurate feedback
 * mechanisms from the sensors for the visualizer made of 
 * LEDs and the vibration motor for an intuitive understanding.
 * 
 * Next Time: Research and experiment even further with 
 * different feedback mechanism for ease of the user.
 * 
 * Description: An assistive device meant to detect the 
 * presence of any obstacles behind a person and provide 
 * vibration/visual feedback to the user about the proximity
 * of potential objects blocking the way.
 * 
 * Pin mapping: 
 * 
 * pin |     mode     | description
 * ----|--------------|------------
 * 2     OUTPUT          Ultrasonic 1 Trig
 * 3     INPUT           Ultrasonic 1 Echo
 * 4     OUTPUT          Ultrasonic 2 Trig
 * 5     INPUT           Ultrasonic 2 Echo
 * 6     OUTPUT          Ultrasonic 3 Trig
 * 7     INPUT           Ultrasonic 3 Echo
 * 8     OUTPUT          Ultrasonic 4 Trig
 * 9     INPUT           Ultrasonic 4 Echo
 * 10    OUTPUT          Ultrasonic 5 Trig
 * 11    INPUT           Ultrasonic 5 Echo
 * A2    OUTPUT          Red LED 1
 * A3    OUTPUT          Red LED 2
 * A4    OUTPUT          Red LED 3
 * A5    OUTPUT          Vibration Motor
*/ 


const int MOTOR_PIN = A5;
const int LED1_PIN = A4;
const int LED2_PIN = A3;
const int LED3_PIN = A2;

const int TRIG_PIN1 = 2;
const int ECHO_PIN1 = 3;
const int TRIG_PIN2 = 4;
const int ECHO_PIN2 = 5;
const int TRIG_PIN3 = 6;
const int ECHO_PIN3 = 7;
const int TRIG_PIN4 = 8;
const int ECHO_PIN4 = 9;
const int TRIG_PIN5 = 10;
const int ECHO_PIN5 = 11;

const int SONAR_NUM = 5;      // Number of sensors.

int TRIG_PINS[SONAR_NUM] = {
  TRIG_PIN1,
  TRIG_PIN2,
  TRIG_PIN3,
  TRIG_PIN4,
  TRIG_PIN5
};

int ECHO_PINS[SONAR_NUM] = {
  ECHO_PIN1,
  ECHO_PIN2,
  ECHO_PIN3,
  ECHO_PIN4,
  ECHO_PIN5
};

void setup() {
  Serial.begin(115200); // Open serial monitor at 115200 baud to see ping results.
  pinMode(MOTOR_PIN, OUTPUT);
  pinMode(LED1_PIN, OUTPUT);
  pinMode(LED2_PIN, OUTPUT);
  pinMode(LED3_PIN, OUTPUT);
  
  pinMode(TRIG_PIN1, OUTPUT);
  pinMode(ECHO_PIN1, INPUT);
  pinMode(TRIG_PIN2, OUTPUT);
  pinMode(ECHO_PIN2, INPUT);
  pinMode(TRIG_PIN3, OUTPUT);
  pinMode(ECHO_PIN3, INPUT);
  pinMode(TRIG_PIN4, OUTPUT);
  pinMode(ECHO_PIN4, INPUT);
  pinMode(TRIG_PIN5, OUTPUT);
  pinMode(ECHO_PIN5, INPUT);
}

unsigned long currentmillis = 0;

void loop() {
  float minDist = 100000000.0;
  for (uint8_t i = 0; i < SONAR_NUM; i++) {

    // Clears the trigPin
     digitalWrite(TRIG_PINS[i], LOW);
     delayMicroseconds(2);
     // Sets the trigPin on HIGH state for 10 micro seconds
     digitalWrite(TRIG_PINS[i], HIGH);
     delayMicroseconds(10);
     digitalWrite(TRIG_PINS[i], LOW);
     // Reads the echoPin, returns the sound wave travel time in microseconds
     float duration = pulseIn(ECHO_PINS[i], HIGH);
     // Calculating the distance
     float distance = duration*0.034/2;
    //Serial.println(distance);
    if (distance < minDist) {
      minDist = distance;
    }
  }
  Serial.println(minDist);
  if (minDist >= 80) {
      analogWrite(MOTOR_PIN, 0);  // nothing in range: make sure the motor is off
      digitalWrite(LED1_PIN, LOW);
      digitalWrite(LED2_PIN, LOW);
      digitalWrite(LED3_PIN, LOW);
      Serial.println("No LED's");
  }
  if (minDist <= 80 && minDist > 60) {
    analogWrite(MOTOR_PIN, 255);
    digitalWrite(LED1_PIN, HIGH);
    Serial.println("1 LED lit up");
  }
  if (minDist <= 60 && minDist > 40) {
    analogWrite(MOTOR_PIN, 255 * 3 / 4);  // note: (3 / 4) alone is integer division and equals 0
    digitalWrite(LED1_PIN, HIGH);
    digitalWrite(LED2_PIN, HIGH);
    Serial.println("2 LEDs lit up");
  }
  if (minDist <= 40 && minDist > 20) {
    analogWrite(MOTOR_PIN, 255 * 1 / 2);  // likewise, (1 / 2) alone would equal 0
    digitalWrite(LED1_PIN, HIGH);
    digitalWrite(LED2_PIN, HIGH);
    digitalWrite(LED3_PIN, HIGH);
    Serial.println("3 LEDs lit up");
  }
  if (minDist <= 20) {
    analogWrite(MOTOR_PIN, 255);  // closest band: vibrate at full strength
    Serial.println("LEDs blinking");
    if (millis() - currentmillis > 500) {
      digitalWrite(LED1_PIN, LOW);
      digitalWrite(LED2_PIN, LOW);
      digitalWrite(LED3_PIN, LOW);
      currentmillis = millis();
    } else {
      digitalWrite(LED1_PIN, HIGH);
      digitalWrite(LED2_PIN, HIGH);
      digitalWrite(LED3_PIN, HIGH);
    }
  }
  
  delay(10);
  //Serial.println();
}

 

Design File (DXF) Download here

ToneFlex by Team Elaine: Final Documentation
https://courses.ideate.cmu.edu/60-223/f2020/work/toneflex-by-team-elaine-final-documentation/
Fri, 18 Dec 2020

What is ToneFlex? Our product is a musical device that caters to those who struggle with grip force and strength. Before we could come up with our final concept, there was an extensive process: interviewing our client to get a better sense of how we could design a device catered to her, then brainstorming potential ideas. Unfortunately, we were stuck for a while on what direction to take. A project that is useful for an important task, or a fun activity? Because of all these questions, we had no clue where to start. The confusion came not only from how many potential directions we could take, but also from our client's extensive experience in this field: with her education in biomedical engineering and rehabilitation science, we were unsure what the expectations were. The suggestions given by our client were too complex for our expertise, so we had to pivot and find a different way to work with Elaine. As we talked and got to know Elaine better, we learned more about her personal life, hobbies, and interests. Overall, we learned that Elaine enjoys music but has always struggled to find instruments that were accessible to her. That's how ToneFlex started! Her descriptions of how inconvenient and difficult instruments were to use shaped our device's goal. From there, we prototyped and did user research to finalize our product. Our device's goal is to aid Elaine, who has difficulty with devices that require grip force and strength, in playing music as if she were playing the actual instruments.

Device Summary

ToneFlex is a musical device that uses both proximity sensing and touch to manipulate pitch and volume. Its goal is to give users who struggle with grip force and strength an easier and more efficient way to create music. There are two parts to the product. The left side uses a sensor to change the pitch based on how far or close the user's hand is. The other side has two pushbuttons and a potentiometer to play, stop, and change the volume. The two parts can be clipped onto the arm of a chair or any other surface.

This is our final device design!

Before getting to our final product, we went through various types of research to help us conclude what we needed. As a team, we found that the extra feedback meetings with Elaine were extremely helpful, as they got rid of assumptions we would otherwise have made while designing. Our biggest design takeaway was making sure the device uses both hands, since Elaine has different mechanical capabilities in each hand. She emphasized that it is much harder for her to use only one hand because of the constraints she faces with her hands, arms, and overall extension. From those discussions and that collaboration, we were able to start prototyping our idea.

A big part of our process was choosing the specific components for the device. On the left side, we used an Adafruit sensor to get the input for the pitch, because it is much more precise than a typical proximity sensor. We purposely used only one main interaction component on the left device, since that was one of Elaine's requests.

This is our map of where the component is on the left side of the device.

Then, on the right side, we used two pushbuttons and a potentiometer. The buttons play and pause the pitches on the digital synthesizer so our user can make their own music. The potentiometer adjusts the volume of the pitches, which, together with the manipulation of pitch, can produce fun music. A minimal sketch of reading these inputs appears after the component map below.

This is the map of the right side of our product.
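
Here is the minimal input sketch referenced above, with assumed wiring: an analog reading on A1 stands in for the Adafruit distance sensor, the volume potentiometer sits on A0, and the two pushbuttons use internal pull-ups on pins 2 and 3 (all pin choices are hypothetical).

const int PLAY_PIN = 2;      // pushbutton: play (assumed pin)
const int STOP_PIN = 3;      // pushbutton: stop (assumed pin)
const int VOLUME_PIN = A0;   // potentiometer (assumed pin)
const int DISTANCE_PIN = A1; // stand-in for the distance sensor

void setup() {
  pinMode(PLAY_PIN, INPUT_PULLUP);
  pinMode(STOP_PIN, INPUT_PULLUP);
}

void loop() {
  bool playPressed = (digitalRead(PLAY_PIN) == LOW); // pressed = LOW with pull-ups
  bool stopPressed = (digitalRead(STOP_PIN) == LOW);
  int volume = map(analogRead(VOLUME_PIN), 0, 1023, 0, 127);   // MIDI volume range
  int note   = map(analogRead(DISTANCE_PIN), 0, 1023, 48, 84); // roughly three octaves
  // These values would be packed into the MIDI messages sketched further below
  delay(20);
}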

To give a better representation of how our device works, we made a quick animation as a demo. Due to video size limits, we had to cut the clips into parts. The first clip shows how the input sensor interacts with the user: a hand comes in and waves within the device's sensing space. We made the left device curved. This design choice was intentional: in user testing we realized that a flat surface was much less comfortable, because the user would have to raise their arm slightly to reach across the distance. Curving the surface allows the user to comfortably rest their arm on the device while pivoting side to side through the space to change the pitch.

This is an animation showing the physical interactions with our device

For our final model, we rendered it in SketchUp. That platform was extremely useful for working remotely, because it allows several people to work on the same file together.

Here is a gif of our device from various angles. The quality suffered in conversion to this format, but it is a good representation of our design.

Here is a front view of our device.

Here is the backside of our final device design!

Here is the left device’s side profile.

Here is the right side’s profile view.

For our TinkerCAD version, we were unable to show the device exactly as designed, so we used slightly different components to convey the overall interactions. In the real device, the readings received from the sensors are converted to MIDI format and sent over the serial port to a digital synthesizer, which plays the music. However, TinkerCAD does not offer serial-port communication to external applications, so instead we connected a second Arduino by serial to the main one; it receives the signal in MIDI format and displays it on an LCD to indicate what was sent.

This is a diagram of components in our TinkerCAD!
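
As a sketch of the serial-to-MIDI path described above, here is a minimal example assuming the Arduino streams raw MIDI bytes over USB serial to a serial-to-MIDI bridge feeding the synthesizer; the baud rate and the bridge itself are assumptions about the setup.

const byte NOTE_ON  = 0x90; // MIDI note-on status byte, channel 1
const byte NOTE_OFF = 0x80; // MIDI note-off status byte, channel 1

void sendNoteOn(byte note, byte velocity) {
  Serial.write(NOTE_ON);
  Serial.write(note);     // 0-127, e.g. derived from the distance sensor
  Serial.write(velocity); // 0-127, e.g. derived from the potentiometer
}

void sendNoteOff(byte note) {
  Serial.write(NOTE_OFF);
  Serial.write(note);
  Serial.write((byte) 0);
}

void setup() {
  Serial.begin(115200); // must match the bridge's configured rate
}

void loop() {
  sendNoteOn(60, 100);  // middle C, fixed velocity, for illustration only
  delay(250);
  sendNoteOff(60);
  delay(250);
}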

In our demo, you are able to see how the components are supposed to interact, to give a better understanding of our device's goals. In the bottom left corner, you can see what is specifically going on with our Arduinos.

This is our TinkerCAD demo!

Lastly, because we used an outside synthesizer, we were unable to produce the sounds in TinkerCAD, but we can show a quick demo of the actual software used with our device. The options in this software let users create actual instrument sounds, from wind and brass to even strings!

This is our software sound demo! Please lower your volume, because the sound of this video is quite loud.

This is our storyboard of how we imagine Elaine using our device!

Our short story gives a small glimpse of how we would want Elaine to interact with our device. While Elaine listens to music, she gets inspired and wants to create her own! All her life she has been unable to play most instruments due to her limited movement, but she has always wanted to at least try one day. That is when she uses ToneFlex: attaching it to her chair, she can produce her own music. Once she is done, she can share it with anyone and continue her day.

Process

Getting started with this project, we were given the opportunity to interview our client, Elaine Houston, so that we could collaborate on designing an assistive device catered to her. Before our meeting, we sat down as a team and outlined our intent, goals, learning objectives, and secondary research. Because we had already learned a little about Elaine through that secondary research, our conversation was extremely comfortable, and we were able to relate to her and get things started right away. This stage took longer than we expected, but in the end it was extremely beneficial, as the extra preparation helped us when a problem arose later in the process.

Here is the brief we created before starting the project.

This is the document where we put all of our secondary research about Elaine.
From there, we were able to learn more about Elaine. Our interview with her took many turns, covering subjects we could not have foreseen. Elaine shared a lot of knowledge and wisdom from past projects she has been a part of in the world of inclusive design. She kept returning to an emphasis on making experiences that aren't just accessible to one group of marginalized people, but rather accessible to all, including those who happen to be marginalized. Between anecdotes, we discussed her experiences with a service dog, her difficulty with fine motor objects, and how awkwardly others may feel the need to handle social interactions with her.

If we were to go back and interview Elaine again for the first time, we would approach our structure differently. The questions we asked to nudge the conversation were very open-ended, which can be good until the conversation wanders out of the scope of the project. Our structure would have benefited from picking a tight set of objectives we needed to walk away with and building guiding questions around them. It would be best to state these objectives at the top of the conversation, almost like a brief outline; that way, everybody involved has a sense of what we want to hit and the pace at which we are hitting those points.

After our interview, we were somewhat stuck. Many of the issues we discussed with Elaine fell closer to the realm of mechanical engineering. These were still valid problems, but likely too big for us to chew on as undergraduates in an intro-level course. Elaine is always tinkering, and she has been proactive in finding solutions to many of the small needs she has encountered in her life. Having a lack of problems in Elaine's life for us to tackle was a strange problem itself, and we were frankly stuck.

We spent time taking a step back. Maybe, if we couldn't pinpoint a need to take on, we could find other ways to enrich Elaine's life through her interests. Visiting her personal website, we found a page detailing what Elaine considers her favorite "theme" songs. We knew that at least listening to music was an important part of her life, and we began to think about what the experience of making it would be like for her. With the theremin as a starting point of inspiration, we began to prototype different aspects of the experience.

We divided our explorations into ergonomics, interaction, and software, knowing all three would come together to inform our final product. In the ergonomic testing, we spent time understanding Elaine's range of motion firsthand. Connor built a quick device out of an exercise band and rope that would limit the wearer's range of motion. We knew that Elaine had a maximum extension of 90 degrees, but this activity revealed there was no need to push that vertical range of motion to its max.

This is a gif of our interaction.

Here is a diagram of our ergonomic exploration.

Jina worked on interaction prototyping, which focused on manipulating conductivity on electrical tape. Different graphic arrangements of tape laid out maps for different journeys of motion and ways to manipulate sound. The activity raised useful questions about where this device would be located, which complemented the ergonomic prototyping. In addition, we looked at another way of getting input: using sensors. Before diving into the design elements, we sat together as a team, wrote down elements that we thought would work, and then iterated.

Our first design meeting notes about what direction to take.

This is a diagram of our first design prototype, which uses the function of touch!

These were the sketches made before actually physically making the prototype.

This is our second prototype that uses sensors.

This is a birds-eye view of the second prototype.

These are our sketches for our second prototype design.

Sruti worked on software prototyping to understand how we could translate our physical interactions into manipulating MIDI sounds. This was fundamental to giving the instrument a voice that someone would want to interact with. The work revealed how we might manipulate volume and tone as flexible variables.

Here is a screen grab of the software that was being tested for the synthesizer.

From those prototypes, we were able to user test and get feedback on our direction. It was extremely effective, as there was a co-creation element where the users helped by sketching their feedback and potential next steps. On the left, you can see some of the notes we took while observing our users; on the right, the suggestions we received from the testers. This exercise was extremely helpful in showing our team the gaps we had failed to see while working.

Here were the overall notes from one of the user tests.

We were anxious to present our progress to Elaine, as we had taken liberties with our direction since the interview and were hoping it would be a pleasant and engaging surprise. Luckily for everyone, our direction resonated with her. Elaine shared stories about her time as a young kid in her school's music class learning to play the recorder. The instrument requires ten fingers to play all of its notes, so Elaine was frustrated that she couldn't participate. She showed us a recently designed adapter that she was able to 3D print, which lets her play the recorder today. It uses a mechanical solution similar to that of a clarinet or saxophone, where certain tabs can be pressed to cover multiple holes on the instrument at the same time. Our project interested her in that it differs from the start: there is no need for an adapter to make it different from anybody else's instrument.

Elaine still had useful feedback for us after we shared our prototypes. In terms of ergonomics, Elaine shared valuable insights to add to our knowledge of her range of motion. She talked about how gravity is difficult to constantly work against, so when her arm is at a certain elevation it is easiest for her to continue working with it at that elevation. However, she noted that it is not comfortable to be interacting with interfaces directly in front of her as it requires muscle strength she does not have. As for interaction, she was interested in the theremin as a starting point and gave us the green light on proximity sensors as a primary interaction.

We kept all of this feedback in mind as we began to bring our prototyping together. In developing the form it would take, we knew it would work best attached to the arms of Elaine's chair, where her arms already naturally rest. This ruled out earlier prototypes with interfaces on the side of the chair, as well as ones we had teased that were set up like a centered control board. Attaching the instrument to the chair had to be easy yet structurally sound, and the controls had to embody the tone and volume manipulations Sruti had been working on. Drawings were initially done with a rough concept in mind, then iterated many times digitally in SketchUp to arrive at our final product.

This is our sketch for our final design direction!

This was the physical model we quickly made to test out if the interaction made sense.

This is a gif to quickly explain the movement we were planning for our device.

Here are all the types of models we tried out in sketches before getting to our final!

After making these mock-ups, we were thankful to run one more round of user testing with the people from earlier in our process. From there, we got feedback that keeping the left device perfectly straight constrained testers' arms, since they had to lift them slightly when moving farther from the sensor; this led us to iterate further and curve our final left device for more comfort.

Reflection

After our final presentation, we were able to have wonderful discussions and get lots of feedback on our device. It was extremely helpful to get extra time to talk with new people and hear new perspectives on what we did. First and foremost, Elaine's feedback was our top priority! When we talked with her at our final demo presentation, she emphasized how much she appreciated the interesting concept of the device's changing components. In addition, she mentioned that we met all the specific constraints she had told us about! She suggested a bigger knob for the volume, or even a slider, to make that component demand less dexterity. As of right now our knob works, but it would be much easier for her with those newest suggestions.

Slides were super clean (and gorgeous!!) and easy to understand which allowed for your communication of concept to really shine. I enjoyed all of the prototype iteration you did to work towards an intuitive and comfortable experience.

Adding on, we got a lot of positive feedback from the other classmates and clients, to the point where they asked how this device could be made accessible to them. Those questions were eye-opening, helping us understand how different everyone's abilities are and how everyone prefers different types of interactions and constraints. Though our focus for this project was solely on Elaine, talking with other people made us realize that others face different disabilities and might find Elaine's constraints to be their main way of functioning. For example, two people we talked to mentioned that they would prefer using only one hand with the device rather than two. Our decision to use two separate devices was due to Elaine's request, because it is easier for her to make simple movements with her left hand and slightly more complex, stronger ones with her right. Hearing about different people's abilities was extremely interesting and caused us to reflect on how difficult accessibility is to implement overall.

Later on, we were able to reach out to Elaine to talk about furthering the accessibility of our device, given the feedback other clients had offered based on their constraints. When we brought up the one-handed device suggestion, she emphasized that one-handed use is much more difficult for her, and that such decisions are very dependent on the nature of the disability: "Some people will prefer both hands, which requires finger control that some may not have, while others may prefer only one hand." From that discussion, we realized how important it is to find a sweet spot in our overall mechanism that lets various types of users participate, perhaps through eye tracking or head movement, to give the device a wider range of users; that is something we would love to look into and potentially add.

I think it was a very advanced project but with an intuitive mechanism. Loved how you guys connected it to another audio software.

The overall project experience went much better than expected! A big concern for our team was our varying levels of expertise in software, electronics, and fabrication, but we were all transparent and truly used each other's strengths to better the project. Though we were remote, we communicated continuously and did not face any problems as a team; if someone needed help, it was always comfortable to reach out and get advice from others. The biggest takeaway from this experience was learning the various co-creative platforms that support remote work. For example, SketchUp is flexible enough for all of us to collaborate on the final design, letting us make sure this device's concept and design were created as a whole team. There isn't anything, from a teamwork standpoint, where we struggled, and we would work the same way if we were to work together again.

Moving on, as a team we appreciated Elaine's high level of expertise, which pushed our thinking and our overall project. At first this was concerning, as we worried we would not be able to meet her expectations, since she has so much experience developing devices, especially in rehabilitation science. However, once we started working together, Elaine brought so much insightful feedback and new knowledge that it really took our device to another level. One challenge was staying in communication with Elaine given her busy schedule, which delayed our timeline during the process. In the future, it would be better to schedule a weekly meeting time with our client so that we can keep each other informed, rather than waiting for a response. We also hope that in future projects the guidelines and overall goals are aligned with the client from the start, because in the beginning there seemed to be some misunderstanding about the level of project we were meant to make; the problems Elaine initially brought were well above our skill level, causing us to pivot and be stuck for part of our process. In the end, we really appreciated Elaine's time and contribution to our collaboration, as we were able to apply lessons from the whole semester.

Not only did this project help us utilize and learn new skills, it has made us more aware of inclusiveness for those with disabilities. All of us were aware before, but not as knowledgeable as we are now about the struggles that people with disabilities face. Hearing about Elaine's tough experiences with accessibility, it was extremely sad to learn how many other people face those hardships as well. It is unfortunate that most designs today neglect the needs of the disabled. This experience helped us truly focus on and understand what it means to work toward a universal design. When designing, especially in the beginning stages of our process, there were times when mobility issues slipped out of our minds, forcing us to take steps back to fix them; those moments were great learning lessons overall. While user testing, it was interesting to see how designing for inclusion can produce better products for all. For example, in our first round of user testing, we asked our testers to simply use the device without telling them the constraints of the primary user. Their critique turned out to be very similar to what Elaine later told us, reinforcing the idea that inclusive design does not just help those with disabilities but makes anyone's experience better.

In the end, our team was able to reflect on how the objects and interactions around us shape our ability to participate in a society that should give equal opportunities to everyone. The people who design those interactions often decide, based on ability and other factors, who can and cannot participate. As students, we should raise awareness of this issue so that future products truly include all abilities. If we were to do this project again, one big thing we would love to do is more research on inclusiveness and accessibility to enhance the device's experience even further, because that knowledge is something we still have gaps in, since we cannot truly understand the experience of those with disabilities.

Next Steps

Based on the feedback from our final demo and our team discussion, there are definitely aspects of the device we would like to keep iterating on and researching. Due to our remote situation, there were parts we could not execute, including physically building the device, material research, and in-person user testing. In the future, we plan to make the physical design more flexible for different settings and users, something we would revise through physical user testing to find gaps that are not visible remotely. We would also love to explore materials that would make the product comfortable to rest an arm on; the final rendering currently looks like plastic, but we would prefer a more comfortable material or added cushion padding. Another aspect we have not yet approached is error prevention: understanding what problems users will face while using the device and how we might create simple fixes they can carry out on their own. Lastly, the software can only use one instrument at a time, changed manually in the digital synthesizer; we would like to look into other software that is more flexible, so instruments could be switched through the device's own components rather than inside the music software. In the end, the constraints we faced on this project were genuinely helpful in pushing us to think in new ways. As a team, we agreed that this device is definitely something we would love to refine further and get to Elaine once the pandemic gets better!

Technical details

For our technical details, we present our TinkerCAD project, our code (a TinkerCAD version and a music-software version), and our schematic drawings (a TinkerCAD version and a map of the actual device). There are still parts of the technical work we would like to refine further, as there are other ways to solve the problems we are looking at.

Link to the TinkerCAD project: https://www.tinkercad.com/things/5noXteuGj3d

Below is our main TinkerCAD setup, which gives a better picture of our breadboard and electronics. Please keep in mind that the TinkerCAD project is just a way to express the overall goal of the device: the platform is missing certain parts and cannot perform certain functions, so it does not behave exactly the way we want.

This is our main TinkerCAD setup.

Code

/*
* Project: Tone Flex
* By Jina Lee, Connor Mcgaffin, Sruti Srinidhi
* 
* The code below takes an ultrasonic distance sensor 
* and a potentiometer as input which are mapped to 
* appropriate ranges to get the pitch and volume
 * of the note being played, respectively. The pitch
 * and volume are then sent in MIDI format over the
* serial port to the other Arduino which then displays
* the signal it receives on the LCD display.
*
* Pin mapping:
* 
* pin   | mode        | description
* ------|-------------|------------
* 2       input         start button
* 3       input         stop button
* 7       input/output  ultrasonic distance sensor pin  
* A0      input         potentiometer
* A1      input         potentiometer for volume
*      
* Code for the ultrasonic distance sensor was taken and 
* modified from David A. Mellis's project
* http://www.arduino.cc/en/Tutorial/Ping 
*/

/* SENDER */

// Initializing pins
const int VOLUMEPIN = A1;
const int pingPin = 7;

void setup() {
  pinMode(VOLUMEPIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  
  // Get duration between sending and receiving signal for 
  // ultrasonic distance sensor
  long duration, cm;
  pinMode(pingPin, OUTPUT);
  digitalWrite(pingPin, LOW);
  delayMicroseconds(2);
  digitalWrite(pingPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(pingPin, LOW);
  pinMode(pingPin, INPUT);
  duration = pulseIn(pingPin, HIGH);
  // Convert duration to a distance, which is used as the pitch value
  int pitchval = microsecondsToCentimeters(duration);
  
  // Read potentiometer for volume
  int volumeval = analogRead(VOLUMEPIN);
  
  // Map pitch and volume to appropriate ranges.
  // ("pitch" is used rather than "tone" as a name, since tone() is a
  // built-in Arduino function and the variable would shadow it.)
  int pitch = map(pitchval, 0, 340, 1, 127);
  int volume = map(volumeval, 0, 1023, 1, 127); // analogRead() tops out at 1023
  
  // Send MIDI play message
  MIDImessage(144, pitch, volume);
  delay(500);
  // Send MIDI pause message
  MIDImessage(128, pitch, volume);
  delay(500);
}

// Send MIDI signal in appropriate format over serial
void MIDImessage(int command, int MIDInote, int MIDIvelocity) {
  
  Serial.print(command);//send note on or note off command 
  Serial.print(',');
  Serial.print(MIDInote);//send pitch data
  Serial.print(',');
  Serial.println(MIDIvelocity);//send velocity data
}

long microsecondsToCentimeters(long microseconds) {
  // The speed of sound is 340 m/s or 29 microseconds per centimeter.
  // The ping travels out and back, so to find the distance of the
  // object we take half of the distance traveled.
  return microseconds / 29 / 2;
}
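One note on the sender: MIDImessage() prints the three values as comma-separated decimal text, which is what the receiver's parseInt() expects. If the serial port were instead feeding a real synthesizer or a serial-to-MIDI bridge, the same three values would be written as raw bytes. A minimal sketch of that variant (our assumption, not part of the TinkerCAD version):

// Hypothetical raw-byte variant of MIDImessage(), for use with a
// serial-to-MIDI bridge rather than the LCD receiver below.
void MIDImessageRaw(int command, int MIDInote, int MIDIvelocity) {
  Serial.write(command);      // status byte: 144 = note on, 128 = note off
  Serial.write(MIDInote);     // data byte 1: pitch (0-127)
  Serial.write(MIDIvelocity); // data byte 2: velocity (0-127)
}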
/*
 * Pin mapping:
* 
* pin   | mode   | description
* ------|--------|------------
* 2       output    LCD - DB7
* 3       output    LCD - DB6
* 4       output    LCD - DB5   
* 5       output    LCD - DB4
* 11      output    LCD - E
* 12      output    LCD - RS
*      
* Code to receive serial data completed 
* with the help of Professor Zacharias
*/

/* RECEIVER */

#include<LiquidCrystal.h>

LiquidCrystal lcd(12, 11, 5, 4, 3, 2);


int numReceived;
int secondNumReceived;
int thirdNumReceived;


void setup(){
  Serial.begin(9600);
  lcd.begin(16, 2);
}

void loop(){
  // if data has been received on the software serial
  if (Serial.available()){
    // read it and save into numReceived
    numReceived = Serial.parseInt(); 
    // read second number and save it into secondNumReceived
    secondNumReceived = Serial.parseInt(); 
    // read third number and save it into thirdNumReceived
    thirdNumReceived = Serial.parseInt();
 
  }
  // Write to LCD
  lcd.clear();
  lcd.setCursor(0,0);
  if (numReceived == 144){
  	lcd.print("Play "); 
  }else {
    lcd.print("Pause ");
  }
  lcd.setCursor(0,1);           
  lcd.print("Pitch= ");
  lcd.setCursor(7,1); 
  lcd.print((String)secondNumReceived);
  
  lcd.setCursor(7,0);           
  lcd.print("Vol= ");
  lcd.setCursor(12,0); 
  lcd.print((String)thirdNumReceived);
}
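Because loop() clears and redraws the LCD on every pass, even when no new message has arrived, the text can visibly flicker. Below is a variant of loop() that only redraws after a complete message is parsed; this is an optional refinement we sketched, using the same globals as above, not the version we submitted.

// Flicker-free variant: redraw the LCD only when a new message arrives
void loop(){
  if (Serial.available()){
    numReceived = Serial.parseInt();
    secondNumReceived = Serial.parseInt();
    thirdNumReceived = Serial.parseInt();

    // redraw once per parsed message instead of on every pass
    lcd.clear();
    lcd.setCursor(0,0);
    if (numReceived == 144){
      lcd.print("Play ");
    } else {
      lcd.print("Pause ");
    }
    lcd.setCursor(7,0);
    lcd.print("Vol= ");
    lcd.print(thirdNumReceived);
    lcd.setCursor(0,1);
    lcd.print("Pitch= ");
    lcd.print(secondNumReceived);
  }
}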

Schematic and design files

We created two schematic drawings. The first maps the components our actual device uses, and the second matches our TinkerCAD build. We purposely made both versions to verify that the overall interactions make sense and that the two views agree with each other. We drew these schematics while designing the device and found them extremely helpful: the map made it clear when some part of the electronics would not make sense or would not work. The main difference between the two drawings is that the TinkerCAD version includes an LCD.

Adding on, for our device mockups, we used SketchUp.

Here is the file for our renderings: Design SketchUp Files

This is our main device schematic drawing!

Here is our TinkerCAD schematic drawing.

Remote Water Adjustment Variable Enabler (RWAVE) By Brenda Seal Team 1: Final Documentation https://courses.ideate.cmu.edu/60-223/f2020/work/remote-water-adjustment-variable-enabler-rwave-by-brenda-seal-team-1-final-documentation/ Thu, 17 Dec 2020 20:42:12 +0000 https://courses.ideate.cmu.edu/60-223/f2020/work/?p=12098 Introduction:

Our final project for the class IDEATE: Introduction to Physical Computing is the Remote Water Adjustment Variable Enabler (RWAVE). We made this assistive technology for Brenda Dare. We interviewed Brenda over Zoom about her life, what we could make for her, and which daily tasks she has a lot of difficulty doing. After the interview, our team decided to make an assistive device that would help Brenda turn on the faucet in her shower without another person helping her.

See here for information about our initial interview with Brenda: https://courses.ideate.cmu.edu/60-223/f2020/work/brenda-seal-team-one/

See here for information about our prototypes: https://courses.ideate.cmu.edu/60-223/f2020/work/brenda-seal-team-1-prototype-documentation/

 

What We Built:

Our project allows a user to turn the faucets in their shower without having to physically turn them. The user operates the faucet with a remote control that drives the faucet knobs much like a TV remote controls a TV. The project has motors that control the temperature and pressure knobs in Brenda's shower and that receive commands from the remote. The remote sends information to the faucet controller based on the buttons the user presses: for example, if the user presses the button to increase the water temperature, the motor attached to the temperature knob moves to reflect that, and the temperature of the water increases. The purpose of this device is to allow Brenda to change the temperature and pressure of her bath water independently. Without this assistive technology, Brenda would not be able to bathe independently.

Final Appearance:

How the electronics for the remote control would look

Shows the final design of the remote controller in SolidWorks from the front

Shows the remote controller from a side view in SolidWorks

This video shows how the remote would control the motors to change the temperature and pressure of the bathtub.

 

TinkerCad Interaction: 

Play the following video at full volume to hear a narration of what is happening. The output of this simulated remote is shown as different colors on a NeoPixel strip; this stands in for the real output because, in TinkerCad, it was impossible to send an output signal using an IR LED, as the physical remote would.
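For reference, on real hardware the IR output could be produced with the same IRremote library the motor code uses for receiving. Below is a hedged sketch of what the transmit side might look like, assuming IRremote 2.x (where IRsend transmits on pin 3 of an Uno) and reusing the button codes the motor code later in this post listens for; none of this was physically built.

// Hypothetical IR-transmit sketch for the remote (IRremote 2.x assumed)
#include <IRremote.h>

IRsend irsend;  // IRremote 2.x sends on a fixed PWM pin (pin 3 on an Uno)

const unsigned long POWER_CODE         = 255;    // codes matching what the
const unsigned long PRESSURE_UP_CODE   = 24735;  // motor code's setVal()
const unsigned long PRESSURE_DOWN_CODE = 8415;   // checks for

void setup() {
  // IRsend needs no explicit initialization in IRremote 2.x
}

void loop() {
  // demo: emit "pressure up" once per second as a 32-bit NEC frame
  irsend.sendNEC(PRESSURE_UP_CODE, 32);
  delay(1000);
}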

Physical Mockup:

Demonstrates how the remote, which was resized after the prototype, would fit in someone's hand

Demonstrates some of the changes made from the prototype of the remote on the left to the final version on the right

Shows how a user would turn on the remote

Demonstrates what the remote would look like when it is turned on

 

In addition to the remote models, there was also a proof of concept physical model that was made:

This is the physical prototype proof of concept.

This was the only physical element created for the motor assembly, because the rest of the semester was dedicated predominantly to the electronics, software design, and development of the other parts of the project.

 

 

Narrative Sketch:

After a long day at work dealing with many clients, Brenda decides she wants to come home and take a bath. Due to Brenda's condition, she requires assistance from a machine to make her way into the bathtub. Once situated on the opposite end of the bathtub from the faucets, she grabs a remote hanging from a string suction-cupped to the wall and presses the power button. A red LED turns green, and a small screen turns on showing the pressure and temperature both set to zero. She pushes the temperature buttons to set the temperature to a steamy 90°F (for now, the "perfect temperature" is determined by knob angle and prior testing). She then presses the pressure button, turning the water all the way up to 100 and letting the tub fill for a while. When it is full, she simply turns the pressure back down to 0 and presses the power button to turn off the system.

How We Got Here:

 

So, as a refresher, this first picture shows the first ideation for the remote control we would be manufacturing to fit our needs. This ideation functions mostly as the "feels like" aspect: shaping the device so it is easiest to use for someone who may not have the same motor capabilities as a person without cerebral palsy. This is the base we started from. These sketches helped influence the first prototype, which was 3D printed for actual touching and holding.

Sketches of possible layouts for the remote

 

This prototype had a lot of great features: clear buttons and sections made easily identifiable by the letters. Because we want this device to be something that could potentially be marketed, one consideration after designing and building the second device would have been adding braille to the remote's cover. It would not be difficult to 3D print; however, this would have had to wait until after the Thanksgiving hiatus from in-person work for those who decided to return home.

Mimi's roommate holding the 3D-printed remote while giving feedback about its design, demonstrating how the remote fits in someone's hand

After all of the revisions, the remote was a little longer and a little skinnier, to accommodate people who have shorter thumbs and cannot reach all the way across the remote to press a button. In hindsight, a couple of other changes could have been made. The remote could have taken a more ergonomic fit to the hand, enabling the easiest possible button presses, and there could have been buttons on the backside to make it more space-efficient. A lot of these things could have been added if only we had put more thought into how Brenda, or any other person, might actually end up using the device. It should be mentioned that this would have been much easier if we could have been in person and had Brenda "use" the remote, but alas, here we are. 3D printing would definitely be the way to go for making small sets for a small number of individuals; however, if this were going to hit the market, there would probably be better manufacturing methods to produce these more cheaply.

 

The following was derived as a means of creating a circuit that could be turned into a PCB. Since we were designing our own remote, we thought it best to program it ourselves as well. The IR remote provided in the kits was unreliable and difficult to press: it was sometimes hard to tell whether a button had registered. We therefore planned a change of button choice to provide tactile and auditory feedback for the user, in addition to visual feedback, to cover all the bases (see the sketch below).
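As a sketch of what that auditory feedback could look like, the snippet below chirps a piezo whenever a button press registers. The pin choices and the piezo itself are our assumptions for illustration, not part of the final parts list.

// Hypothetical button-feedback sketch: chirp a piezo on each new press
const int BUZZER_PIN = 8;   // assumed piezo pin
const int BUTTON_PIN = 2;   // assumed button, wired active-low with pullup

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
}

void loop() {
  static bool wasPressed = false;
  bool pressed = (digitalRead(BUTTON_PIN) == LOW);
  if (pressed && !wasPressed) {
    tone(BUZZER_PIN, 880, 40);  // short 880 Hz chirp on each new press
  }
  wasPressed = pressed;
  delay(10);                    // crude debounce interval
}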

Testing voltages on the remote control simulation to figure out where the problem is

 

The next few images look at the housing and the electronics that go into the actual device the remote controls. All of the components were determined by the type of circuit shown in the TinkerCAD drawing later in the post.

 

This picture shows the EAGLE CAD sketch that is the start of the PCB that could be placed inside of the box, allowing continuous use similar to a TV and potentially enabling wireless use.

This is the sketch that outlines all of the parts that go onto EAGLE to generate the PCB.

 

The following are the designs we iterated through; each one changes slightly from the last. At this stage in the project, the design was not fully fleshed out for manufacturing: given the nature of the presentation, the hypothetical design took priority over the need to physically build. With more time, or if we were to do this over, it would have made more sense to build the physical model to help inform later iterations of the designs. There could also be more consideration of materials and piece design for large-scale manufacturing, should this ever be commercialized.

 

These are the box designs, which changed over time to optimize the space being used. It was rather difficult because it was all hypothetical and there was no physical model.

 

Overall, the plan was followed pretty closely. If anything, the hardest part was communicating with each other about potential ideas and design decisions: either we were both busy, or the platforms made it hard to communicate effectively. We were also busy preparing for the final push of finals, so some tasks may have been shortened to compensate for the time. There is more research that could be done for this project, but it is very satisfactory given the situation and the resources available.

Conclusions:

After the final product presentation we got a lot of positive and constructive feedback. One piece of feedback said, "I think the idea is really great and am excited to see how it will turn out when physically built!" This showed us that our idea was valuable and that our product could be really useful if it were physically built. Another piece of positive feedback was, "I appreciate you guys kept iterating and iterating to arrive at the best design you thought of. I think this could be such a useful and practical device for so many groups of people." This feedback likewise suggests our idea would be useful for many different people, and notes that our iterations improved the parts of the project each of us was working on.

Feedback from a different person was, "I'd be excited to see how this idea would play out as a physical device! I think the way that you explored the different CAD models to have a better, more universal fit to accommodate for a wider variety of baths shows a lot of great consideration." This comment highlights how we tried to make our CAD models universal. From this comment and our discussions after the final presentation, it seems it would be very useful, once the final product is built for Brenda, to alter the design into standard sets for different types of bathrooms. We based some of our design decisions on Brenda's shower setup, but there are many different types of bathroom faucets, so in the future it would be nice to make the design more universal or to offer subsets of the product for different bathrooms.

A final piece of feedback was, "The device captures the issue that was presented. I would've liked more emphasis placed on materials because waterproofing is so critical." This raises the issue of waterproofing the product. We thought about it throughout the design process but did not spend a lot of time on it; we both spent most of our time making sure the product worked electrically and that the other parts of the design were sound. It was also harder to address this aspect with everything being virtual, because we were not really building anything physical. In retrospect, we could have devoted more time to making sure everything would be waterproof, since this is essential to making the product work safely. Going forward with this project, making sure everything works in a wet environment would be our next concern.

Overall, working remotely this semester on the final project was not as hard as anticipated. It was actually easier to find time to meet and work, due to the lack of other time commitments like extracurricular activities. It was harder to communicate effectively and understand each other over virtual platforms like Zoom, and harder to work on the project while not sharing a physical space. In the end, we essentially had two separate projects that coexisted to form one final project, rather than a single combined effort. On the other hand, this also made the project easier to work on, because we could each work on our own part without needing to schedule time with the other person or wait for them to finish. Lastly, it was disappointing that we couldn't physically make the final project; making all the plans and simulating the project was the best that could be done, but nothing compares to the learning that happens when actually building a product.

We found it very rewarding to work with Brenda and learn about her life. We enjoyed being able to use the skills gained in this class on a project that would allow Brenda to be more independent. Judging from the feedback, this could be a really useful product for many people, allowing them to live more independently. Hopefully, in the future, we will be able to physically make this product for Brenda and others who would benefit from it.

Finally, we really enjoyed working on this product. Going forward, as stated above, we would want to make our product more universal, so it could be used in bathrooms with many different types of faucets, and give more consideration to waterproofing in the design. Thank you Professor Zach for all the help on this project and throughout the semester.

Technical Details:

Mimi:

TinkerCad Link

Final electronic setup in TinkerCad

Remote Code:

/*
 * Remote Water Adjustment Variable Enabler (RWAVE)
 * Remote Code
 * Mimi Marino (mmarino)
 *
 * Description: This is the code to control the remote control.
 * There is a button to turn the remote on and buttons that 
 * correspond to increasing and decreasing the temperature 
 * and the pressure respectively. The neopixel strip is there 
 * to simulate the signals that the remote would output 
 * because you can't use an IR light as I would if I were
 * to make this in real life. Note: it is intentional that if 
 * you hold down the button the light stays on.
 * 
 * pin   | mode   | description
 * ------|--------|------------
 * 1      input     button PRESSURE_DOWN  
 * 2      input     button PRESSURE_UP
 * 3      input     button TEMP_DOWN
 * 4      input     button TEMP_UP
 * 5      output    neopixel ring
 * 
 *
 * Credit : Used this tinkercad link to help program the 
 * attiny. 
 * https://www.tinkercad.com/things/d6fJABMd27t-attiny-pwm
*/

#include <Adafruit_NeoPixel.h>
const int PIXEL_PIN = 5;
const int NUMPIXELS = 12;
Adafruit_NeoPixel pixels = Adafruit_NeoPixel(NUMPIXELS, PIXEL_PIN, NEO_GRB + NEO_KHZ800);

const int TEMP_UP= 4;
const int PRESSURE_UP = 2;
const int TEMP_DOWN= 3;
const int PRESSURE_DOWN = 1;

void setup()
{
  //Sets Neo pixel Strip
  pixels.begin();
  pinMode(PIXEL_PIN, OUTPUT);
  pinMode(TEMP_UP, INPUT_PULLUP);
  pinMode(PRESSURE_UP, INPUT_PULLUP);
  pinMode(TEMP_DOWN, INPUT_PULLUP);
  pinMode(PRESSURE_DOWN, INPUT_PULLUP);

}

void loop()
{
  //checks if TEMP_UP is being pressed
  if (!digitalRead(TEMP_UP)){
    //changes color to RED
    for (int i=0; i < NUMPIXELS; i++) {
        pixels.setPixelColor(i,255,0,0);
        pixels.show();
    }
  }
  //checks if TEMP_DOWN is being pressed
  //(else-if chain so the final branch runs only when no button is pressed,
  //which also keeps the light on while a button is held, as noted above)
  else if (!digitalRead(TEMP_DOWN)){
    //changes color to BLUE
    for (int i=0; i < NUMPIXELS; i++) {
        pixels.setPixelColor(i,0,0,255);
        pixels.show();
    }
  }
  //checks if PRESSURE_UP is being pressed
  else if (!digitalRead(PRESSURE_UP)){
    //changes color to PURPLE
    for (int i=0; i < NUMPIXELS; i++) {
        pixels.setPixelColor(i,128,0,128);
        pixels.show();
    }
  }
  //checks if PRESSURE_DOWN is being pressed
  else if (!digitalRead(PRESSURE_DOWN)){
    //changes color to GREEN
    for (int i=0; i < NUMPIXELS; i++) {
        pixels.setPixelColor(i,0,255,0);
        pixels.show();
    }
  }
  // if no button is being pressed
  else {
    //delay to make tinkercad work better
    delay(50);
    //turns off neopixel strip
    for (int i=0; i < NUMPIXELS; i++) {
        pixels.setPixelColor(i,0,0,0);
        pixels.show();
    }
  }
  
}

Remote Schematic:

Schematic for the remote

Remote CAD Files:

Remote Design Files

(Includes STL file, SolidWorks part file and SolidWorks drawing file)

Carl:

Basic proof-of-concept design for the mechanism meant to show the remote adjusting the motor and changing the LCD.

 

TinkerCAD Link 2

Motor Code:

// Libraries
#include <LiquidCrystal.h>
#include <IRremote.h>
#include <Servo.h>

// Servo Pins
#define PRESSURE 6

// LCD Pins
#define regsel 13
#define enable 12
#define d7p 11
#define d6p 10
#define d5p 9
#define d4p 8

// IR Pins
#define iroutput 7

// LED Pins
#define offLED 5			// Red
#define onLED 4				// Green

IRrecv irrecv(iroutput);
decode_results results;
LiquidCrystal lcd(regsel,enable,d4p,d5p,d6p,d7p);
//Servo tempServo;
Servo presServo;

// Variables
//unsigned int temp = 0;
unsigned int pres = 0;
bool start = 1;


void setup(){
  Serial.begin(9600);
  irrecv.enableIRIn();
  pinMode(offLED,OUTPUT);
  pinMode(onLED,OUTPUT);
  while (start == 1){
    digitalWrite(offLED,HIGH);
    digitalWrite(onLED,LOW);
    if(irrecv.decode(&results)){
      unsigned int active = results.value;
      if(active == 255){
        start = 0;
        //Serial.println("HERE.");
        break;
      }
    }
  }
  lcd.begin(16,2);
  presServo.attach(PRESSURE);
  //tempServo.attach(5);
}

void loop(){
  digitalWrite(offLED,LOW);
  digitalWrite(onLED,HIGH);
  if(irrecv.decode(&results)){
    //Serial.println(results.value,HEX);
    unsigned int button = results.value;
    //Serial.println(button);
    setVal(button);
    int newPres = map(pres,0,100,0,180);
    presServo.write(newPres);
    displayPres();
    irrecv.resume();
  }
  onOffStatus(start);
}

// Changes the values based on pressed button
void setVal(unsigned int buttonPress){
  switch (buttonPress) {
    case 24735:
      if (pres < 100) pres++;   // clamp so map() stays in range
      break;
    case 8415:
      if (pres > 0) pres--;     // avoid unsigned wraparound below zero
      break; 
    /*case 255:
      start = 1;
      break;*/
  }
}

void displayPres(){
  lcd.setCursor(0,0);
  lcd.print((String)"Pres:"+pres);
}

void onOffStatus(bool temp){
  //Serial.println(temp);
  if(temp != 0){
    lcd.noDisplay();
  } else {
    lcd.display();
  }
}
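The listing above leaves tempServo commented out. Below is a hedged sketch of how the temperature knob could be wired back in, assuming a second servo on a free pin (the original's commented-out attach(5) would collide with offLED on pin 5) and a second pair of IR codes still to be chosen; in the real sketch, adjustTemp() would be called from setVal() when those codes arrive.

// Hypothetical extension restoring the temperature servo
#include <Servo.h>

Servo tempServo;
unsigned int temp = 0;               // 0-100, mirroring pres

// Raise or lower the temperature by one step and move the servo
void adjustTemp(bool up) {
  if (up  && temp < 100) temp++;
  if (!up && temp > 0)   temp--;
  tempServo.write(map(temp, 0, 100, 0, 180));  // same 0-100 -> 0-180 mapping
}

void setup() {
  tempServo.attach(3);  // assumed free pin; pin 5 is taken by offLED
}

void loop() {
  adjustTemp(true);     // demo: sweep the knob up one step per half second
  delay(500);
}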

 

Circuit Lab schematic of the Motor assembly using Servo Motors.
