Assignment 4: Did I leave that state machine on?

State Machine(s): Kitchen Appliances

Problem: How many times have you made dinner one night and then woken up to a warmer apartment and/or kitchen the next day? Maybe a slight odor of gas? I can distinctly remember doing this twice… because I forgot to turn the stove off. 

My electric stove has a knob to turn it on and set the level of heat. Underneath the knob is what is supposed to be a helpful indicator light that reminds the cook that the stove is on. Finally, there is an oven timer the cook can set to remind them to take their food off the stove. While the indicator light is a great idea, it does not help much when the cook is away from the stove. Along with the slight buzz of the stove being on and the heat radiating from it, all of the stove’s feedback is useful to a cook standing at the stove; however, it is less than useful for a cook who has left the kitchen.

Describe the general solution: In the smart house of five years from now, doorways could house monitoring displays for various appliances and systems, ensuring that someone walking around the house can see if anything in another room has been left on.

Proof of Concept: I wired a potentiometer (representing a stove or oven knob) and an LCD screen (with its own potentiometer for contrast) to the microcontroller board to represent a monitoring system. Essentially, the LCD screen reads out the “on” or “off” state of whatever appliance is connected to it. This system could hold multiple appliances accountable to the monitoring system; it could also let different users/homes set preferences for alerts/notifications in various states (a flashing screen, no text when things are off, etc.).
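The core of the sketch boils down to one small piece of state logic: map the knob reading to the “on”/“off” string the LCD displays. Here is a minimal sketch of that logic in plain C++ (not my actual Arduino code — the threshold and names are hypothetical, assuming a standard 0–1023 analog reading):

```cpp
#include <string>

// Hypothetical dead zone: analog knob readings run 0-1023 on an Arduino,
// so anything above a small threshold counts as the appliance being "on".
const int KNOB_ON_THRESHOLD = 50;

// Map a raw knob reading to the state string the monitoring LCD shows.
std::string applianceState(int knobReading) {
    return (knobReading > KNOB_ON_THRESHOLD) ? "ON" : "OFF";
}
```

In the real sketch the same comparison would sit in `loop()`, feeding the result to the LCD each cycle.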

Assignment 4 Files: Fritzing Diagram, Arduino Sketches, Prototype Video


Assignment 3: Bedroom Way-finding

Problem: Bedroom floors (especially mine) are usually in constant states of disarray… but they are variable states of disarray. Sometimes I leave my backpack in the middle of the floor, or pull my desk chair to the foot of my bed, or any number of things. No matter the situation, everyone can relate to tripping over any number of items on their bedroom floor because the lights are off… Can someone develop a way-finding system for rooms when the overhead lights are off, to 1) avoid waking others in the room and 2) avoid stepping on/tripping over things?

Describe the general solution: In the smart house of five years from now, each floor would be equipped with pressure sensors and pinhole-sized LEDs. As someone wakes up and looks to leave their bed, they can press one button on their bedside monitor to see a softly lit, real-time path charted for them on their floor.

Proof of Concept: Essentially, someone presses and holds the button on the console to turn the device on. While on, the device reads data from the pressure sensors on the floor – wherever those sensors read additional weight, that area is marked as a location where an object was detected, and the corresponding LED stays off. In effect, pressing the console’s switch illuminates (with very soft light, so as not to wake up others in the room and to be easily picked out by eyes still adjusting from sleep) the spots in the room where someone can safely step to reach their destination. Eventually, using machine learning/AI techniques, the console could plot your best path to a likely destination given the time you wake up and your own tendencies (to the shower if it is 7am, or to the fridge for a late-night snack at 12:30am).

In this demo, I press and hold the momentary switch to simulate turning the whole system “on”. With the button pressed, I can apply pressure to one of the three round FSRs, which causes its corresponding LED to turn off, signaling that you should not step in that position.
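The per-LED decision in the demo reduces to one boolean rule. A minimal sketch of that rule (the threshold value and names are hypothetical, not taken from my actual sketch):

```cpp
// Hypothetical raw FSR reading above which we say "object detected here".
const int PRESSURE_THRESHOLD = 100;

// true  -> LED on: system is active and this spot is clear, safe to step
// false -> LED off: system is off, or an object was detected at this spot
bool ledShouldBeOn(bool buttonHeld, int fsrReading) {
    return buttonHeld && (fsrReading < PRESSURE_THRESHOLD);
}
```

In the Arduino sketch this check would run once per FSR/LED pair on every pass through `loop()`.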

Assignment 3 (Sketch, Fritzing Files)

Assignment 2: Raising a Digital Hand

Find a problem to solve: All 80 first-year ETC students take a class called Building Virtual Worlds that splits them into one of three roles: sound designer, programmer, or artist. Students essentially make new video games every two weeks, usually on different software or hardware platforms they learn as they build. As you can imagine, this means there are a lot of questions for TAs to answer; however, not every question can be answered by every TA. We have at least six different types of TA, each specializing in their own field/software/etc. One of the biggest problems in the class is that when a student asks for help, the TA on duty never seems to specialize in the field the question is about, so that TA has to go find another TA to answer it. This game of telephone usually results in a) longer wait times for each student, b) fewer total questions being answered, or c) questions being answered by TAs who may not be qualified (3D modelers answering hardware coding questions, for example). On top of all this, all 80+ students sit in the same office space, so it is really a test of a student’s luck whether a TA will spot them while walking by or whether they have to wait for the TA to circle back around.

Describe the general solution: Students should be able to request help from specific TAs to answer their specific questions quickly and easily. TAs should also have a system so they can track what types of questions are asked where, the order in which they are asked, and who is going to answer them. Both parties should also be able to see the status of the question (asked, waiting, answering, answered, etc.).

Proof of Concept: An Arduino with a button, a slider/potentiometer input, an RGB LED, and an LCD monitor, plus a computer program interface with a state machine of the question-asking/answering process. When students have a question, they adjust the slider to select which type of TA they would like help from. The LED turns on (in the corresponding color of that TA group) to signal that the TAs have been alerted. On the TA-facing computer program, TAs can see a map of where the students are and, by color, what their questions are about. A TA starts the answering process by selecting a question and attaching their name to it, which is communicated to the student’s device by 1) the LED flashing and 2) the TA’s name appearing on the screen, indicating they are on their way down to help. Once at the student’s desk, the TA presses and holds the button on the student’s device to alert the other TAs that the student is being helped, changing the color of the LED on the device and on the UI. Finally, once the question is answered, the TA presses and holds the device’s button again to turn the LED off.
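The question lifecycle above can be sketched as a small state machine. This is one hypothetical reading of the flow (state and event names are mine, not from the actual program):

```cpp
// States of a single student question, and the events that advance it.
enum class QState { Idle, Asked, Claimed, Answering };
enum class Event  { StudentAsks, TaClaims, TaButtonHold };

// Advance the question state; irrelevant events leave the state unchanged.
QState next(QState s, Event e) {
    switch (s) {
        case QState::Idle:      // slider selection -> LED on in TA-group color
            return (e == Event::StudentAsks)  ? QState::Asked     : s;
        case QState::Asked:     // TA attaches name -> LED flashes, name on LCD
            return (e == Event::TaClaims)     ? QState::Claimed   : s;
        case QState::Claimed:   // TA arrives, holds button -> "being helped" color
            return (e == Event::TaButtonHold) ? QState::Answering : s;
        case QState::Answering: // TA holds button again -> LED off, done
            return (e == Event::TaButtonHold) ? QState::Idle      : s;
    }
    return s;
}
```

Keeping the transitions in one pure function like this would let the same logic drive both the student device’s LED and the TA-facing UI.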

Fritzing Sketch: Disclaimer – likely not accurate, playing with software and different components.

First Iteration of Device Model:           

Student & TA User Journey:                     

Demo TA-facing UI & Student Device:


Responses, First Thoughts

1. Tom Igoe – Igoe seems to have most of the trends covered; the only thing I could potentially see being added would be adding any artifact to the “Internet of Things”. People have fallen in love with making things “smart”, and it seems like they are willing to try it with anything and everything, which, to me, seems like a bit of a fruitless venture that does not offer very high ROI in terms of helping people solve real problems. I’m sure there are still artifacts that have not been made “smart” yet and people have to push the envelope to get to those, but the concept has come up a lot in recent memory. I also think there is some overlap between some of the concepts Igoe mentions and the interactive art installation trend that has emerged. Groups like teamLab and Artechouse have been designing and showcasing interactive experiences using physical and digital sensors that play with and affect different visual displays (*cough* LED fetishism), soundscapes, and other visual components. I think there will continue to be a trend of using physical computing for art’s own sake rather than for a function.

2. I feel like Banks’ thoughts on the man and his “smart” space-suit illustrate both the achievements and the dangers of great design. On the more obvious end of the good stuff, the suit is able to care for and assist the man, who is slowly being burdened by sickness, malnutrition, and injuries, carrying him farther than he could ever have carried himself. I have always learned that good design is good design because the user doesn’t even realize it is there. Deeper than when people talk about their phones being extensions of themselves, the space-suit becomes intrinsically tied to the man’s being. By the end of the story, the lines have blurred to the point where neither “being” knows where one starts and the other begins. In some ways, that is great design, in that it allows the “artifact” to know how best to help the user. On the other hand, that is also one of the pitfalls of great design: sometimes the “artifact” may not give the user the space to be a user, instead forcing him or her to rely on it so much that they lose their own agency to the artifact. The story itself seemed to be a cautionary tale for designers: balance the roles of the user and the artifact and the relationship between the two.

3. Not sure if this was supposed to be part of the blog post, but just to cover my bases… In addition to the pre-reqs, I did my undergrad in mechanical engineering with concentrations in design, manufacturing, and psychology; I have done a semester of improv; I have worked with designers to make accessible museum exhibits and experiences; I took an Intro to EE class in undergrad about microcontrollers; my Senior Design Project was a thermal energy control system run by Arduinos; I know SolidWorks, CREO, and Maya; and I’ve laser cut, 3D printed, milled, and used most woodshop tools you can think of.