Prototype documentation – Intro to Physical Computing: Student Work
https://courses.ideate.cmu.edu/60-223/f2018/work

Team MARY(a) Prototype Documentation
https://courses.ideate.cmu.edu/60-223/f2018/work/team-marya-prototype-documentation/
Tue, 20 Nov 2018

Introduction:

Our prototype of the Automatic Filing Machine! It is made of laser-cut cardboard and wooden dowels. The tray holding the paper is intended to move up and down to the correct shelf for filing.

A close-up of our paper tray. We did not have time to make the real-life representation of it, but the papers would have a conveyor belt underneath to move the papers off of the tray and into the shelf.

A very simple representation of our user interface: a button panel. Each button would be labeled with the shelf that it goes to, and maybe color coded to aid visual recognition.

We are team MARYa (Mohan, Andrea, Roly, and Yingyang), four students at Carnegie Mellon University who were assigned the task of creating a useful implement for our older friend, Maria. Previously, we documented our meeting with Maria to find out what could be useful for her. In this stage, we took our idea and turned it into a prototype for Maria to see and critique at our next meeting with her, which was on November 13th. We found in our initial meeting that Maria was throwing her papers on the floor instead of filing them, so we tried to come up with a product that would organize her papers for her.

Product: An Automatic Filing Cabinet

Our prototype demonstrates a cabinet that files itself. Although it is purely a "looks-like" prototype (it has no code or motors yet), it does represent what our team envisioned for the product.

We want Maria to be able to put papers that she needs filed onto the tray of our product, press a button, and let the machine do the rest. The machine determines where the papers go based on the button pressed, each of which corresponds to a shelf. The tray moves to that shelf, and a small conveyor-belt-like mechanism pushes the papers onto it. The tray then returns to the top of the machine, waiting for more papers. The buttons are located on top of the machine for easy access, and the fronts of the shelves are open so the user can take out the papers they need.
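
Although the prototype itself has no code yet, a minimal Arduino sketch of the control flow we have in mind might look like the one below. The pin numbers, the number of shelves, the steps-per-shelf values, and the conveyor run time are all placeholder guesses for illustration; the real numbers will depend on the threaded rod, the stepper driver, and the final shelf spacing.

```cpp
#include <Stepper.h>

// All pin numbers and step counts below are placeholders for illustration.
const int STEPS_PER_REV = 200;                    // typical 1.8-degree stepper
Stepper liftStepper(STEPS_PER_REV, 8, 9, 10, 11); // stepper driven through a driver board

const int NUM_SHELVES = 3;
const int buttonPins[NUM_SHELVES] = {2, 3, 4};        // one labeled button per shelf
const int shelfSteps[NUM_SHELVES] = {800, 1600, 2400}; // steps down from the top position (guesses)
const int conveyorPin = 5;                            // transistor driving the conveyor motor

void setup() {
  for (int i = 0; i < NUM_SHELVES; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);  // buttons wired to ground
  }
  pinMode(conveyorPin, OUTPUT);
  liftStepper.setSpeed(60);                // RPM
}

void loop() {
  for (int i = 0; i < NUM_SHELVES; i++) {
    if (digitalRead(buttonPins[i]) == LOW) { // button for shelf i pressed
      liftStepper.step(-shelfSteps[i]);      // lower the tray to that shelf
      digitalWrite(conveyorPin, HIGH);       // run the conveyor to push the papers off
      delay(3000);                           // guessed run time; tune on the real machine
      digitalWrite(conveyorPin, LOW);
      liftStepper.step(shelfSteps[i]);       // return the tray to the top
      delay(500);                            // crude debounce before accepting another press
    }
  }
}
```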

Process:

The origin of our idea: Maria’s very populated filing cabinets. She has a publishing business, and thus has massive amounts of paperwork.

A visual inspiration for the proportions and materials of our shelf. We like wood as a main material, since it would fit in well with Maria’s home. We also like the strength and customizability of it.

One of our most important meetings before the critique. Here Roly is explaining his vision of the mechanism for raising and lowering the paper tray.

One of the mechanism inspirations we found during image research. We will be using a stepper motor to spin a big threaded rod, which requires attaching the stepper to the rod with a belt. This image shows how we will do that.

A quick drawing that we used to determine how big our shelves should be. We needed to strike a balance between accessibility for Maria and accessibility for the Machine itself.

Maria’s first glimpses of the prototype! She was very intrigued with how it would work and gave some valuable input on the materials we should use and how our interface should be positioned.

The machine in action at our prototype meeting! Maria is testing the height of our paper tray and the reachability of the buttons.

Discussion

Our first meeting with Maria was much later than the other groups', so we were a bit behind everyone else. While the rest of the class was working on the third in-class session for their prototypes, our group was meeting Maria for the first time. Because of this, we only had around four days to meet up and work on a prototype. We couldn't add any of the actual motorized or mechanical features, but we did build a full-scale cardboard version of the prototype.

The crit on November 13th was pretty useful to our group. We met with Maria, and showed her our prototype. She gave us some useful feedback that we will want to incorporate into our final product.

She asked us to put wheels under it so that it could be moved easily, which was pretty useful because none of us had thought of that. We asked for her feedback on the placement of the buttons, and she gave us useful input there as well, saying that if we put the buttons on top of the shelf she wouldn't be tempted to put things on top of it. She also asked us to number the shelves instead of writing labels on them, so that she can label them herself and change the labels if she wants to, which is something we should definitely consider.

For our next steps, we have a ton of work to do. We have to order our parts, figure out the mechanical and software aspects of this, finalize our design, cut the shelf itself out, and basically everything else. It will be quite a challenge to do this in 2 weeks especially with Thanksgiving break coming up, but hopefully we will pull it off.

Team Joanne Prototype Documentation
https://courses.ideate.cmu.edu/60-223/f2018/work/team-joanne-prototype-documentation/
Fri, 16 Nov 2018

The full mechanism with all of the modular components shown.

The actuation is driven by a servo which turns a rack and pinion mechanism.

In this prototype, the servo is triggered by a push button. For the final device, the servo will be triggered by a photoresistor.

Video of device in action.

Our device turns “on” and “off” a model of Joanne’s linear toggle switch. The servo turns the gear 160 degrees and the rack meshed with the gear converts that rotation into linear actuation. A grip for the toggle switch handle is connected to the rack so when the rack moves, the switch is pushed or pulled with it.  The user pushes the button to activate the servo.
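
As a reference, the prototype's behavior (each button press flips the servo between the two ends of its 160-degree sweep, pushing or pulling the switch through the rack) can be captured in a short Arduino sketch like the one below. The pin assignments are placeholders; only the 160-degree sweep comes from our actual design, and even that may shift once we tune the rack travel.

```cpp
#include <Servo.h>

const int SERVO_PIN = 9;
const int BUTTON_PIN = 2;          // push button to ground, using the internal pull-up

Servo rackServo;
bool switchOn = false;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  rackServo.attach(SERVO_PIN);
  rackServo.write(0);              // start with the toggle switch "off"
}

void loop() {
  if (digitalRead(BUTTON_PIN) == LOW) {   // button pressed
    switchOn = !switchOn;
    rackServo.write(switchOn ? 160 : 0);  // 160-degree sweep drives the rack's full travel
    delay(1000);                          // give the rack time to move and debounce the button
  }
}
```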

Process

We first needed to take a bunch of measurements of the toggle switch at Joanne’s house.

After we had all the measurements we needed, we drew some sketches of the switch box and started generating ideas for what we were going to make to move the switch.

We decided on a rack and pinion mechanism driven by a servo for the actuation. This is the first thing we modeled in SolidWorks. The toggle switch grip is mated to the backside of the rack and we made a 4 bolt pattern on the base to attach it to the switch cover that we will also model.

We didn't originally plan on designing a model of the toggle switch box, but we decided that having a 3D model of it would make designing our mechanism easier than working off a two-dimensional sketch.

We mated the rack and pinion mechanism with the toggle switch box in an assembly to see where and how our device needs to attach to the box. After doing this, we realized that the 4-bolt pattern on the base of our mechanism isn't the best method of attachment.

We designed the box cover which will be the way our rack and pinion mechanism is attached to the toggle switch box. The base was removed and a segment of the lateral wall was extruded further for the 4 bolt pattern. The picture shows a full assembly of the box cover over the toggle switch box and the rack and pinion mechanism mated to the box cover. Time to print!

The print came out clean but still required a little bit of sanding to allow smooth movement of the rack and pinion. Most of the parts were connected with screws. The picture shows a complete assembly of the device that will be attached to the toggle switch box.

Our device easily slipped onto the toggle switch box and was ready for testing.

We first uploaded some basic servo example code to test the movement of our device. We achieved the full range of linear motion that we calculated for the actual toggle switch box at Joanne's house. After this, all we had left to do was implement a way for the user to trigger the servo.

Discussion

Major Challenges

In the previous week we gained and subsequently lost a group member. The resulting communication gap meant the prototype had to be given a button at the last minute so it could be activated during the interview. This complicated our explanation of the system to Joanne, because the presence of a button implied that a button would activate the device. In fact, the most important part of the design is that a photocell inside the front door lamp is what triggers the Arduino: we are not adding any additional button to her home. She flips the switch inside her house, the front lamp turns on, and that tells the Arduino to start. The presence of the button confused Joanne, so it took longer to explain the system to her. The only challenge in designing our mechanism was that we did not take precise measurements of the toggle switch box at Joanne's house, so we had to estimate many of our dimensions. These estimates sometimes stacked up and caused certain features to be misaligned.
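
To make the intended trigger concrete, here is a rough sketch of the photocell logic we described to Joanne, with the extra button removed. The analog pin, the light threshold, and the servo angles are stand-in values that would need to be calibrated against the real lamp and rack travel.

```cpp
#include <Servo.h>

const int PHOTOCELL_PIN = A0;      // photoresistor voltage divider inside the front door lamp
const int SERVO_PIN = 9;
const int LIGHT_THRESHOLD = 600;   // placeholder; calibrate against the actual lamp brightness

Servo rackServo;
bool switchOn = false;

void setup() {
  rackServo.attach(SERVO_PIN);
  rackServo.write(0);              // outdoor toggle switch starts "off"
}

void loop() {
  int light = analogRead(PHOTOCELL_PIN);
  bool lampOn = light > LIGHT_THRESHOLD;     // lamp turned on from inside the house

  if (lampOn != switchOn) {                  // lamp state changed: follow it with the servo
    switchOn = lampOn;
    rackServo.write(switchOn ? 160 : 0);     // push or pull the outdoor toggle switch
    delay(1000);                             // let the rack finish its travel
  }
}
```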

The Crit

Overall, the crit was a good opportunity to reassure Joanne about the power source and the visibility of the wires. Her commentary mostly revolved around the aesthetics of the system and concern about the dangers of the electricity. After we thoroughly explained the system and she understood exactly what was happening, she was reassured. We didn't spend much of the crit critiquing; it was mostly explaining in detail how the system would work for Joanne.

Joanne's main concern was visibility, which led to a conversation about the wires and voltage. After Zack reassured her, we set a goal of 12 volts as we move forward. We bonded more with Joanne and built solid trust, but beyond that our system is relatively straightforward and there wasn't too much that was new. The crit reminded us of the importance of weatherproofing, but we had already been thinking about that.

We will not be ignoring any of her concerns, as they were all quite valid, all revolving around visibility and electricity.

Planned Next steps

We will need to guarantee that the servo switch is waterproof, so we are planning on finding a waterproof servo. From there, we will double-check with Joanne whether she wants the LEDs wired into the system; she made a few comments that could have one of two meanings, either concern for visibility or a desire not to have LEDs, so we will reach out to her for a decision. Aside from these small details, we will be powering through, as we have a lot of voltage and wiring problems left to tackle. Also, in order to ensure that our device fits the toggle switch perfectly, we will make another trip out to Joanne's house with a caliper and take more exact measurements.

Team Weavers Prototype Documentation
https://courses.ideate.cmu.edu/60-223/f2018/work/team-weavers-prototype-documentation/
Thu, 15 Nov 2018

INTRODUCTION

For the final project of Introduction to Physical Computing, The Weavers, consisting of Jenny, Megan, and Ghalya, was assigned to create an assistive device for Rebecca Herbert. Rebecca loves weaving. For her, the most tedious task in weaving is winding the yarn into a ball, because it hurts her wrist and she cannot tell how much she has wound. This post documents the initial prototype of a motorized ball winder that would save Rebecca from doing it herself.

PRODUCT

Ball winder in action.

Full shot of prototype.

User setting the amount of yarn that the ball winder will wind.

We created an automatic ball winder. When weaving, the weaver must initially take the yarn they have and turn it into a form that holds its shape well and is easy to use. Most people do this using a ball winder, which requires the user to turn a crank until the yarn is wound into a ball. This step can take Rebecca hours, causing greater pain in her arms and discouraging her from starting new projects. We have set out to create a ball winder that automatically winds the amount of yarn the user wants without anyone having to manually turn a crank.

PROCESS PHOTOS

This is a ball winder that Rebecca uses regularly to ball her yarn by hand; we used it as a guide on how to make an automatic version.

These are two of the original schematics we made for our ball winder; it also shows some of our preliminary circuitry.

Testing Individual Components of Prototype

Top Surface of the Device with Prototype String Feeders

Diagonal string winding pattern of the prototype.

Testing different heights at which to feed the string into the prototype.

Prototype spinning while collecting yarn.

This shows what our ball winder looked like when in use with Rebecca’s umbrella; this is how it would be used realistically in her house.

This is a video where we tried comparing the movement of our ball winder with the movement of the sample one Rebecca has to see how it moves differently.

Finished Display Screen

Our crit and talk with Rebecca.

DISCUSSION

Initially, the major problems were figuring out how to work the LED screen and the motor; we spent a while working on them and wound up going to office hours for help. After we assembled all the parts, we discovered that the yarn pattern on our ball winder is wrong; instead of forming a criss-cross shape that holds its form well, our ball winder creates a diagonal pattern where the yarn lies over itself and doesn't hold its shape. We spent a while testing possible causes by holding the yarn at different heights, changing the speed, and extensively comparing our design with the manual ball winder we were given. Rebecca told us that the yarn pattern is not the most important feature, so going forward we will work on perfecting our distance measuring before focusing more on the yarn pattern.

The critique on November 13th was extremely helpful to the process because it allowed us to touch base again on what Rebecca envisioned for her device. As we worked on the prototype, the string winding pattern was particularly challenging for us, and we spent most of the time trying to mimic the exact winding pattern of a manual ball winder. However, after the critique and Rebecca's opinion, we realized that the most important thing to her was finding the length of the yarn wound onto the ball, not the yarn pattern of the ball. While the yarn pattern helps the ball keep its shape after it comes off the winder, it is not essential to the project. Rebecca also reminisced that her grandmother would wind yarn around her hands or between another person's arms, which results in the same winding pattern our prototype currently produces. Thus, moving forward we will focus mainly on calculating an accurate length of yarn wound, and try to solve the yarn pattern if time allows.

Another thing we realized during the critique is that the design of the automatic ball winder must be more user-friendly. During the critique it initially did not function as intended due to some wiring coming undone, and we had to make sure it was plugged in. The push button on the rotary encoder was also not easy to press once placed in the box, so that needs to be fixed. In general, we want it to be easier for the user to input the amount of yarn they want to wind and to set up the ball winder.

The next steps of this project are to implement accurate distance-measuring functionality alongside an intuitive user interface. This entails measuring the string on a wheel before the yarn is wound into ball form. The user loops the yarn through a feeder hole, which feeds the yarn to a rotating disk of a set size. The number of rotations is used to calculate the total distance traveled, and therefore the length of the yarn. Then the thread is fed through another feeder hole and wound into the ball. Currently, there is a screen with a dial that the user can turn to input the amount of yarn they would like, and pressing the dial starts the winding process. The screen also shows the user how many yards have currently been wound. To make the process easier and to allow emergency stops or different settings, a keypad could offer more options to the user. The main focus for the future of this project is to get an accurate reading of the yarn measured and to make the process of using the product as streamlined as possible.
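
A rough sketch of that length calculation, under some assumptions: a hypothetical sensor on the measuring disk gives one pulse per rotation, length is simply rotations times the disk circumference, and a Serial printout stands in for the screen. The disk circumference, pin numbers, and target length below are placeholders, not measured values.

```cpp
const int PULSE_PIN = 2;                  // hypothetical one-pulse-per-rotation sensor on the measuring disk
const int MOTOR_PIN = 5;                  // transistor driving the winder motor
const float DISK_CIRCUMFERENCE_IN = 6.0;  // placeholder disk circumference in inches

volatile unsigned long rotations = 0;
float targetYards = 50.0;                 // would come from the rotary encoder dial

void countRotation() { rotations++; }     // interrupt: one pulse per disk rotation

void setup() {
  Serial.begin(9600);
  pinMode(PULSE_PIN, INPUT_PULLUP);
  pinMode(MOTOR_PIN, OUTPUT);
  attachInterrupt(digitalPinToInterrupt(PULSE_PIN), countRotation, FALLING);
  digitalWrite(MOTOR_PIN, HIGH);          // start winding
}

void loop() {
  float yardsWound = (rotations * DISK_CIRCUMFERENCE_IN) / 36.0;  // 36 inches per yard
  Serial.print("Yards wound: ");
  Serial.println(yardsWound);             // stand-in for the screen readout

  if (yardsWound >= targetYards) {
    digitalWrite(MOTOR_PIN, LOW);         // stop once the requested length is reached
    while (true) { }                      // done; wait for reset
  }
  delay(500);
}
```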

Team Jeffrey – Prototype Documentation
https://courses.ideate.cmu.edu/60-223/f2018/work/team-jeffrey-prototype-documentation/
Thu, 15 Nov 2018

 

Introduction

This post is a detailed explanation of our team's process toward a functioning behaves-like prototype. After our first conversation with our older collaborator Jeffrey, we decided to make a bingo-style game for him to play with his 6-year-old granddaughter, Stella. This game, partially disguised as trivia, would be used as a tool to spark conversations about interests and personal topics as a way to get to know each other.

 

 

Part one of Prototype Demo

Part two of Prototype Demo

Product

This game is essentially a mix of the core ideas behind Trivial Pursuit, bingo, and truth or dare. At the beginning of each game, each player comes up with a set of questions which are personally relevant to them, and then the players switch stacks. This keeps the game analog and generative, with a constant change of questions and induced conversation from game to game. Player A picks a card from player B's stack, reads the question, and attempts to answer it. Example questions could be: Who is my favorite artist? What are the lyrics to the Star-Spangled Banner? How did I meet my wife? What do I want to be when I grow up? What is Taylor Swift's new hit song?

 

If answered correctly, player A gets to choose a box, which illuminates to show that player's color. Player A may then press the "Correct" button to forward the turn to player B.

 

If answered incorrectly, however, player A must pick a penalty card. Penalty cards require you to complete a certain task, such as inventing a funny song on the spot about a given sentence, or telling a story in response to a prompt. When the story is completed, player A presses "Incorrect" and forwards the turn to player B. No buttons light up.

The buttons serve to alternate turns and keep score.

The first player to reach five boxes in a row wins the game.
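
To make the win condition concrete, here is a small sketch of how the game logic could check for five in a row, assuming the board is a 5-by-5 grid of lit boxes (the final pad layout may differ). Each cell stores 0 for unclaimed, 1 for player A, and 2 for player B; the setup() block just demonstrates the check with an example board.

```cpp
const int SIZE = 5;              // assumed 5x5 board; the real pad layout may differ
int board[SIZE][SIZE] = {{0}};   // 0 = unclaimed, 1 = player A, 2 = player B

// Returns true if `player` owns five boxes in a row, column, or diagonal.
bool hasFiveInARow(int player) {
  for (int i = 0; i < SIZE; i++) {
    bool rowWin = true, colWin = true;
    for (int j = 0; j < SIZE; j++) {
      if (board[i][j] != player) rowWin = false;
      if (board[j][i] != player) colWin = false;
    }
    if (rowWin || colWin) return true;
  }
  bool diag = true, antiDiag = true;
  for (int i = 0; i < SIZE; i++) {
    if (board[i][i] != player) diag = false;
    if (board[i][SIZE - 1 - i] != player) antiDiag = false;
  }
  return diag || antiDiag;
}

void setup() {
  Serial.begin(9600);
  for (int j = 0; j < SIZE; j++) board[2][j] = 1;   // example: player A fills the middle row
  Serial.println(hasFiveInARow(1) ? "Player A wins!" : "No winner yet");
}

void loop() { }
```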

 

Process

 

We started by brainstorming ideas to make Jeffrey's life easier or more pleasurable. He did not seem to need anything fixed, so we started looking into ways of enhancing the good things in his life. It became clear very quickly that he kept gravitating toward his granddaughter and the time he and his wife get to spend with her. We additionally noticed that Jeffrey might not perfectly know how to interact with a six-year-old, often letting his wife entertain her. We therefore decided to create a game with trivia elements, which match Jeffrey's interests, and a simple and colorful button interface, with changeable game play so the game can adapt as Stella grows up.

 

Preliminary sketches of our game board

 

We started by creating a game plan that would be fitting for both a 6-year-old and a 73-year-old. This ended up being one of our biggest challenges, as none of our group members had any game design experience.

 

Working on conceptualization and physical design/implementation of the game board.

 

After settling on a concept and game play, we started working on layout and board design, opting for big buttons with fun interactions, which would make a 'trivia' game more interesting to a 6-year-old.

Setup of  our first button pad trial. We are currently waiting for the RGB version of the pads, which should arrive in the mail in a week or two.

 

We then started programming the colored buttons, LCD screen and button pads separately and finally got together to merge our individual codes.

Button pads are able to light up according to the color of the LED beneath them.

The LCD screen welcoming Jeffrey and Stella to their custom game.

 

J and Ca working on merging code sections.

 

Last Tuesday, we were lucky to be able to meet with Jeffrey again to discuss our idea and game implementation. We received a lot of design feedback. He seemed very excited overall at the idea of being able to play this new custom game with Stella.

 

Jeffrey, Ca and C discussing game intricacies and board aesthetics.

 

 

Discussion

As mentioned before, because our team is so diverse and complementary in skill set, our main challenge was creating an interesting and fun game for such a wide age gap. Making a game intuitive and worth playing was a hard thinking exercise, considering none of us had any game design experience. We ended up settling on a malleable, analog game, which will allow Jeffrey and Stella to change the questions by hand as Stella grows older. We additionally tried to keep the interface loose so Stella can invent her own rules if she one day wishes.

 

November 13th critique:

Jeffrey seemed very excited by our prototype and believed he and Stella would have fun playing the game. He gave us a lot of input on the aesthetics of the board, suggesting we make it out of clear acrylic so that Stella could see the colorful wiring on the inside. Other fabrication advice he gave us included building a box to hold the question and penalty cards.

The crit was useful for our team and allowed us to move forward confidently with our design. Jeffrey in particular seemed happy about our choices in tactility and color, and even suggested we add sound (something we had been shying away from because we thought it might irritate Jeffrey and his wife).

Over the next few weeks, we will start fabrication while waiting for the RGB button pads. We will be working on making a clean and durable product, which will hopefully bring joy to Jeffrey and Stella for a long time.

Team Joseph Prototype Documentation
https://courses.ideate.cmu.edu/60-223/f2018/work/team-joseph-prototype-documentation/
Thu, 15 Nov 2018

Introduction

After our initial meeting with Joseph, our design client, our team set to work on developing a prototype that focused on alleviating some of Joseph’s freezing symptoms as a Parkinson’s patient. Our prototype is a detachable cane accessory that features an LED light as a flashlight, a horizontal-line laser as a visual cue, and a mini vibrator as a haptic cue.

For more initial background on our project, please see our previous post: https://courses.ideate.cmu.edu/60-223/f2018/work/meeting-documentation-2/.

Product

Our product is a detachable unit which clips onto any of our design client's canes. In order to combat freezing episodes, or moments when Joseph loses his mobility, we have integrated a laser which projects a line onto the floor ahead of him when turned on with a button. This laser gives Joseph a focal point, helping him break out of a freezing episode. In addition, the cane is equipped with a vibrating mini motor disc. When turned on via the same button as the laser, haptic cues are emitted at a consistent interval, subconsciously providing Joseph with a walking pace. Lastly, our prototype has an LED substituting for a flashlight, which Joseph can turn on to increase visibility; the flashlight has a separate button for its operation. Fortunately, our prototype has all of the functionality of our final product, but each function still requires refinement.
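
For reference, a trimmed-down Arduino sketch of the behavior described above might look like the following. The pin numbers and the haptic pulse timing are illustrative guesses rather than final values; in particular, the cue interval would be tuned with Joseph.

```cpp
const int CUE_BUTTON_PIN = 2;     // toggles the laser + haptic cues
const int LIGHT_BUTTON_PIN = 3;   // toggles the LED "flashlight"
const int LASER_PIN = 5;
const int VIBRATOR_PIN = 6;       // mini vibration motor disc (through a transistor)
const int LED_PIN = 7;

bool cuesOn = false;
bool lightOn = false;
unsigned long lastPulse = 0;
const unsigned long PULSE_PERIOD_MS = 600;  // guessed walking-pace interval
const unsigned long PULSE_ON_MS = 150;      // how long each vibration pulse lasts

void setup() {
  pinMode(CUE_BUTTON_PIN, INPUT_PULLUP);
  pinMode(LIGHT_BUTTON_PIN, INPUT_PULLUP);
  pinMode(LASER_PIN, OUTPUT);
  pinMode(VIBRATOR_PIN, OUTPUT);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  if (digitalRead(CUE_BUTTON_PIN) == LOW) {        // toggle laser + haptic cues
    cuesOn = !cuesOn;
    digitalWrite(LASER_PIN, cuesOn ? HIGH : LOW);
    delay(300);                                    // crude debounce
  }
  if (digitalRead(LIGHT_BUTTON_PIN) == LOW) {      // toggle the flashlight LED
    lightOn = !lightOn;
    digitalWrite(LED_PIN, lightOn ? HIGH : LOW);
    delay(300);
  }

  // Pulse the vibration motor at a steady interval while the cues are on.
  unsigned long now = millis();
  if (cuesOn) {
    digitalWrite(VIBRATOR_PIN, (now - lastPulse) < PULSE_ON_MS ? HIGH : LOW);
    if (now - lastPulse >= PULSE_PERIOD_MS) lastPulse = now;
  } else {
    digitalWrite(VIBRATOR_PIN, LOW);
  }
}
```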

 

cane in use

Overall details of prototype

the LED will be replaced by a flashlight at the end

Details of buttons: upper one is for the cues, and the lower one is for the light

Details of laser: we angled it for the ideal  distance between laser line and the cane

Process

During an in-class work session, our team sat down and processed our initial meeting documentation with Joseph, researched different Parkinson’s symptoms, and brainstormed what kinds of technological solutions we could come up with involving an Arduino.

The earliest design, which involved a pressure sensor at the bottom of the cane. After talking to Joseph about his needs, we decided to take out this feature.

White board sketches of inputs and outputs

Whiteboarding our initial design helped us figure out how to design the hardware layout of the detachable cane accessory. We are smiling because prototyping is fun!

A very, very early prototype where we initially focused on getting different hardware components to work: the mini vibrator, the 3-5V red horizontal laser, controlled by two buttons.

We purchased a black vinyl-coated steel spring clip that would act as the detachable accessory for Joseph’s canes. We measured the diameter of an example cane to be 22.55mm, or ~0.88″, and deduced that 0.75-1.125″ clips would be sufficient for most canes.

Hardware before attaching to the cane, powered by coin batteries

For our prototype, we used electrical tape to strap the breadboards, Arduino, battery, and laser to the cane.

Discussion

One of the major challenges during our prototyping process was the hardware design. Since Joseph consistently collects new canes and uses a wide variety of canes for different purposes, our team wanted to create a detachable hardware solution that Joseph could snap onto any one of his canes (rather than "hardcoding" it to one particular cane). We spent a fair amount of time changing breadboard sizes and rearranging our circuitry so that it fit on a small piece of wood, which was then attached to a metal clamp we ordered from McMaster. Additionally, because this is a detachable solution, we needed to fit all of the hardware components ideally in a single small box, high up near the handle of the cane, so that Joseph could push buttons to turn the haptic and visual cues on and off. We considered creating two detachable components: one for the buttons high up near the handle, and another lower down for the laser; however, after some user testing with Joseph, he verified that the laser (although high up near the handle) still cast a visible enough line for him to follow.

On Tuesday, November 13, our team demonstrated our working prototype to Joseph and the class. We received several pieces of interesting feedback from the class; Sana notably questioned whether this detachable device would be TSA-friendly. This is a valid concern, because Joseph travels frequently to visit his son who lives in San Francisco, and while the detachable component is more travel-friendly, we would not want our older friend to get into trouble with the TSA unnecessarily. We will be looking into TSA regulations, particularly with respect to batteries, since certain types of batteries are prohibited on carry-ons.

What was particularly helpful was the one-on-one debrief we had with Joseph after our class demonstration.

Joseph tested out our assistive cane prototype and provided valuable feedback on additional customizations he would like.

Some useful feedback that Joseph provided: instead of a horizontal laser line perpendicular to his walking direction, he would prefer a vertical line that acts more like a forward "guide" pointing in the direction he needs to walk. He said that he would likely respond better if the line were vertical. We plan on reorienting the laser so that it emits a vertical line instead.

As you can see, our prototype has a horizontal laser that is perpendicular to his walking direction. In this photo, Joseph discusses his vertical preference.

We also asked Joseph for feedback on whether he would be able to reach and physically press the buttons that turn on the haptic and visual cues; in our prototype, one button triggers the LED "flashlight" and the other triggers the haptic and visual cues (vibration and laser). One major design point that came up involved the placement of the buttons. Originally, we had taped the button to the handle of the cane. However, given that Joseph has a multitude of different canes, all with differently sized handles, attaching the buttons this way would not be extensible to all of Joseph's canes.

Joseph brought out one of his existing canes, which has a flashlight attachment that he bought off Amazon and says doesn't work.

After discussing this matter with Joseph, together we agreed that the buttons could be situated on the back of the detachable device's box, because while his fingers are wrapped around the handle, he is still able to stretch them to comfortably press buttons a few inches away. Joseph also mentioned that he would like separate buttons to control the haptic and visual cues independently, as a matter of personal preference. From this critique, our team will create separate buttons for each cue and place them on the back of the detachable device's box.

Talking to Joseph, he thinks we should get an A!

One thing that we learned from Joseph during our debrief is his planned usage of this detachable device. In his words, he said, “As the [natural] dopamine [in my brain] wears off, I need to take a synthetic dopamine pill.” During this time when his dopamine levels are low, Joseph is more prone to freezing episodes, in which he plans on using our device’s haptic and visual cues to aid him. He does not plan on frequently using this device every day, because he is concerned that the novelty of the cues will wear off and be ineffective in helping him get through freezing episodes.

Some other miscellaneous pieces of feedback that we plan on incorporating include:

  • An on-off switch to power the detachable device on and off
  • A rechargeable battery, or a battery that is compatible with TSA regulations
  • An adjustable flashlight head
  • A waterproof seal

After our demo to the class, one of our other older friends, Maria (from another group), handed us a handwritten note. She was interested in our detachable device for her sister, who is also a Parkinson’s patient, and linked us to a company who may be interested in producing devices like these. (Email redacted for privacy purposes.)

For next steps, our team will focus on using a 9V battery (as opposed to the weaker 6V battery for our demo) and using a transistor to help power the laser, buttons, vibrator, and LED. The weaker 6V and 5V batteries we used could not supply enough current to power our device for a very long time. We also plan on using a smaller Arduino, like an Arduino Nano, to help reduce the size of the box that will contain all of the hardware components. Lastly, we will also begin looking into 3D-printing our box that will contain all the hardware components, so that it will be waterproof and customized. However, since none of our team members have experience with 3D modeling software, this may be a little out of scope. We also will work to incorporate Joseph’s feedback, like moving the buttons to the back of the detachable device box, using a brighter and larger LED, as well as ensuring our device is TSA-compliant.

Team James Prototype Documentation
https://courses.ideate.cmu.edu/60-223/f2018/work/team-james-prototype-documentation/
Thu, 15 Nov 2018

Introduction

For the prototype, we started working on the memory aid that Jim identified would be a great asset in his life. Essentially, it serves as a reminder as he's rushing out the door, making sure that he has everything he needs for that day's various meetings. If you would like to read about the initial meeting where we identified this need, feel free to check out our documentation below.

Meeting Documentation

Product

Overall view of prototype

TFT Display and Motion Sensor

TFT Display connected to ESP32 (goal)

Our prototype demonstrates behavior similar to the final result we are trying to achieve, as far as the hardware is concerned. The motion sensor detects movement so that the system can save energy. The touchscreen display features buttons under a list of Jim's everyday items, which the user can toggle. Changing a red button to green means that the user (Jim) has answered yes to having the items he needs and is ready to leave his house. The current prototype works with items dictated by the user, but ultimately we would like the display connected to the ESP32 (image above) and the ESP32 communicating with Jim's Google Calendar so that it knows what he'll need each day. Unfortunately, we didn't receive the ESPs until Monday, so we had to temporarily set everything up with an Arduino for this demo. The final product is intended to be stored inside a portable box next to Jim's front door.
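
To show the demo's logic more concretely, here is a reduced Arduino sketch of the behavior: the motion sensor wakes the checklist, each item has a red/green state that a button press toggles, and Serial prints stand in for the TFT display since that wiring is still in progress. The pin numbers, the example item list, and the wake timeout are all made-up placeholders.

```cpp
const int PIR_PIN = 2;                                  // motion sensor
const int NUM_ITEMS = 3;
const int itemButtonPins[NUM_ITEMS] = {3, 4, 5};        // stand-ins for the touchscreen toggles
const char* itemNames[NUM_ITEMS] = {"Keys", "Wallet", "Glasses"};  // placeholder items
bool itemChecked[NUM_ITEMS] = {false, false, false};
const unsigned long WAKE_TIME_MS = 30000;               // how long the checklist stays awake after motion

unsigned long lastMotion = 0;

void setup() {
  Serial.begin(9600);
  pinMode(PIR_PIN, INPUT);
  for (int i = 0; i < NUM_ITEMS; i++) pinMode(itemButtonPins[i], INPUT_PULLUP);
}

void loop() {
  if (digitalRead(PIR_PIN) == HIGH) lastMotion = millis();   // someone walked past the door
  bool awake = (millis() - lastMotion) < WAKE_TIME_MS;
  if (!awake) return;                                        // screen would be blanked to save power

  bool allChecked = true;
  for (int i = 0; i < NUM_ITEMS; i++) {
    if (digitalRead(itemButtonPins[i]) == LOW) {             // toggle this item red <-> green
      itemChecked[i] = !itemChecked[i];
      delay(300);                                            // crude debounce
    }
    Serial.print(itemNames[i]);
    Serial.println(itemChecked[i] ? ": GREEN" : ": RED");    // stand-in for the TFT list
    if (!itemChecked[i]) allChecked = false;
  }
  Serial.println(allChecked ? "All set, good to go!" : "Still missing items");
  delay(1000);
}
```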

Process

Whiteboard diagram back when the idea was to host a server using a raspberry pi instead of Google Calendar

Drawing out the display sizes we could find online with a marker and ruler to debate each option, taking into account price and later shipping times

Testing whether the motion detector was working, as well as its range, by giving it a time-out in the cubbies.

Sketches of icons for the common items that Jim often needs

First attempt to connect the TFT display to the ESP32

Jumpers soldered for SPI use

Record of pins between display and esp32

Reminder screen that displays once motion detector is triggered with daily items listed out (day of the demo)

 

Discussion

One of the major challenges for us was connecting all the ESP32 pins to the TFT display pins. One of our team members is currently working on tackling this issue, but we didn't have time to finish it before the demo. The bulk of the problem is that the ESP32 has far more pins than the Arduino, and we had to figure out how those pins behave based on the included schematic. The majority of the other pieces (calibrating the motion sensor, connecting the motion sensor to the Arduino, researching APIs to incorporate Google Calendar) were fairly easy to troubleshoot, except for the API data, which requires more research to implement.

We received some great advice, ideas, and suggestions from this crit. During the in-class portion, we received a suggestion that we should assign items to recurring events so Jim doesn't have to manually log the items every time. As soon as we got time to talk to Jim one on three, we asked him to send us a list of common items that he needs, whenever it's convenient for him, so that we can look in Google Calendar for the event name and assign the items accordingly. He would still have to enter the items manually for new events, however. Zach came around and suggested that Jim may want to use his event descriptions, and that it would be better if the code could parse for a keyword such as "list:" to start the list. He also told us that he often abbreviates common events, such as GOM for Gathering of Men, which will be useful to keep an eye out for once the Google Calendar API is up and running.
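
As a rough sketch of the parsing Zach suggested, the snippet below pulls comma-separated items that follow a "list:" keyword out of an event description. The example description, the comma-separated format, and the function name are just our working assumptions, not how Jim actually writes his events.

```cpp
// Extract comma-separated items that follow a "list:" keyword in an event
// description, e.g. "GOM prep. list: notebook, name tag, reading glasses".
// The keyword and comma-separated format are assumptions for now.
void parseItemList(String description) {
  int start = description.indexOf("list:");
  if (start < 0) return;                       // no keyword, nothing to parse
  String items = description.substring(start + 5);

  while (items.length() > 0) {
    int comma = items.indexOf(',');
    String item = (comma < 0) ? items : items.substring(0, comma);
    item.trim();                               // strip surrounding spaces
    if (item.length() > 0) Serial.println(item);
    if (comma < 0) break;
    items = items.substring(comma + 1);
  }
}

void setup() {
  Serial.begin(9600);
  parseItemList("GOM prep. list: notebook, name tag, reading glasses");
}

void loop() { }
```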

We received some general aesthetic feedback as well: bigger text, and a bigger screen if possible, would be nice. Jim also told us that he would prefer to confirm he has all his objects at the end, rather than individually checking items as he goes. We also suggested there could be some sound from the box or a phone notification in case he bypasses the box without clicking cancel (something we have to implement for the final product). Jim gave us the helpful feedback that though his phone is often on him, he doesn't always have it turned on, so a phone notification might not be the best method. We also got into a handy conversation with Zach about using a 9V battery versus plugging the device into the wall. As a result, we now know to keep an eye out for the distance between the front door and the nearest outlet.

Our next steps for this project are to get the Google Calendar API up and running so that we can make sure everything works right and design the user experience for the web portion of the project. We also have to resolve how to get the TFT display to properly communicate with the ESP32. Lastly, we must figure out how to display an icon image on the screen, or draw each item out by coordinates, for better legibility and to make it more interesting.

 
