rpaetz – F15 60-223: Intro to Physical Computing
https://courses.ideate.cmu.edu/60-223/f2015
Carnegie Mellon University, IDEATE

Final Project: Dynamic, Dioramic Exploration of Christmas Over Time
https://courses.ideate.cmu.edu/60-223/f2015/final-project-dynamic-dioramic-exploration-of-christmas-over-time/
Thu, 17 Dec 2015
by Rachel Nakamura (rnakamur) and Joseph Paetz (rpaetz)

We have created ChristmasViewfinder, a diorama with actuated, moving pieces that compares Christmas in today’s modern, capitalism-fueled society with its religious origins.

We juxtaposed modern Christmas commercials with the religious figures of the well-known nativity scene. Additionally, different parts of the display change every time the viewer pulls the viewfinder lever. This allows us to tell a multitude of narratives about Christmas’ varying religious and capitalist meanings.

To implement this diorama, we used an Arduino Leonardo as the main brain controlling all of the actuation. Two relays switch the high-voltage lines going to a string of Christmas lights and to an incandescent bulb that lights up old projector slides. The Arduino’s PWM pins drive MOSFETs that control the LED strips in the ceiling of the diorama, and Adafruit’s 12-bit PWM driver controls additional servos and individual LEDs.
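A rough sketch of how that actuation might come together in code, assuming the Adafruit driver is the PCA9685-based 16-channel board driven by the Adafruit_PWMServoDriver library; the pin numbers and channel assignments here are placeholders, not the wiring we actually used.

// Illustrative actuation layout: two relay pins for the high-voltage lights,
// a PWM pin into a MOSFET for one ceiling LED strip, and an Adafruit
// PCA9685 16-channel driver for extra servos and individual LEDs.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

const int RELAY_XMAS_LIGHTS = 2;   // relay: string of Christmas lights
const int RELAY_PROJECTOR   = 3;   // relay: incandescent slide-projector bulb
const int STRIP_PWM_PIN     = 9;   // MOSFET gate for a ceiling LED strip

Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();  // default I2C address 0x40

void setup() {
  pinMode(RELAY_XMAS_LIGHTS, OUTPUT);
  pinMode(RELAY_PROJECTOR, OUTPUT);
  pinMode(STRIP_PWM_PIN, OUTPUT);

  pwm.begin();
  pwm.setPWMFreq(60);              // ~60 Hz is fine for hobby servos
}

void loop() {
  digitalWrite(RELAY_XMAS_LIGHTS, HIGH);   // switch the Christmas lights on
  digitalWrite(RELAY_PROJECTOR, LOW);      // keep the projector bulb off
  analogWrite(STRIP_PWM_PIN, 128);         // dim the ceiling strip to ~50%

  pwm.setPWM(0, 0, 350);    // driver channel 0: position a servo (pulse width in ticks)
  pwm.setPWM(8, 0, 2048);   // driver channel 8: an individual LED at ~50% duty cycle
  delay(1000);
}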

The Arduino Leonardo was also used as a USB keyboard to control a Processing sketch on the computer.
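The Leonardo’s native USB support makes this straightforward. The actual key mapping is in the GitHub code linked below; the sketch here only shows the general pattern, and the lever pin and the key sent are placeholders.

// When the viewfinder lever is pulled, send a keystroke that the
// Processing sketch on the computer listens for.
#include <Keyboard.h>

const int LEVER_PIN = 7;          // switch that closes when the lever is pulled
bool leverWasPulled = false;

void setup() {
  pinMode(LEVER_PIN, INPUT_PULLUP);
  Keyboard.begin();
}

void loop() {
  bool pulled = (digitalRead(LEVER_PIN) == LOW);
  if (pulled && !leverWasPulled) {
    Keyboard.write('n');          // e.g. Processing advances to the next scene on 'n'
  }
  leverWasPulled = pulled;
  delay(10);                      // crude debounce
}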

Link to YouTube video detailing how to create a video player in Processing: https://youtu.be/ayZIxo3TeXM

Link to Processing documentation of video player (Movie): https://processing.org/reference/libraries/video/Movie.html

Link to Github code: https://github.com/arathorn593/Christmas_Viewfinder

 

(Photo documentation: DSC_6004–DSC_6642.)

 

Fabric Sensor: Soft Sensor
https://courses.ideate.cmu.edu/60-223/f2015/fabric-sensor-soft-sensor/
Thu, 03 Dec 2015

(link to our past documentation of this project)

(code for this project can be found here)

We designed a fabric sensor intended to let the user take one-word notes and send those notes to themselves as reminders.

Our primary purpose in creating this sensor was to let people write reminders on a device without the rigidity found in most devices offering the same service.

What our project actually ended up being was an exploration in soft wearables and sensing. We discovered that the reason soft wearables are not many people’s first choice is that the margin for error is so much smaller when dealing with dynamic material.

We also discovered several key aspects of creating soft touch pads. Most important is the overall construction. To sense position, we used three layers of material:

  • conductive fabric (grounded)
  • a spacing mesh
  • a piece of velostat (with 4 connections to measure resistance)

Pressing on the fabric makes a connection between the conductive fabric and a point on the velostat. The resistance from each measurement point to the point of contact is then measured with an ADC.
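The four velostat connections were read through an ADS1015 breakout (shown in the circuit diagram further down). A minimal sketch of polling those channels with Adafruit’s ADS1X15 library is below; the channel assignments and the crude position estimate are illustrative only, not our calibrated code.

// Read the four velostat connections through the ADS1015 over I2C and
// form a rough (pre-linearization) estimate of the touch position.
#include <Wire.h>
#include <Adafruit_ADS1X15.h>

Adafruit_ADS1015 ads;   // 12-bit, 4-channel I2C ADC (default address 0x48)

void setup() {
  Serial.begin(9600);
  ads.begin();
}

void loop() {
  int16_t r0 = ads.readADC_SingleEnded(0);  // e.g. top-left corner
  int16_t r1 = ads.readADC_SingleEnded(1);  // top-right
  int16_t r2 = ads.readADC_SingleEnded(2);  // bottom-left
  int16_t r3 = ads.readADC_SingleEnded(3);  // bottom-right

  // Crude raw estimate: compare left vs. right and top vs. bottom readings.
  // These values still need to be linearized (see the note below).
  float total = r0 + r1 + r2 + r3 + 1;      // +1 avoids divide-by-zero
  float rawX = ((r1 + r3) - (r0 + r2)) / total;
  float rawY = ((r2 + r3) - (r0 + r1)) / total;

  Serial.print(rawX);
  Serial.print(",");
  Serial.println(rawY);
  delay(50);
}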

We experimented with several conductive fabrics, spacing meshes, and connection orientations. Our process can be found in our previous documentation (linked above).

Also, the x-y data from the sensor needs to be linearized. The linearization method depends on the orientation of the connections, but for our final prototype we linearized using the catenary curve (http://forum.arduino.cc/index.php?topic=184285.0).
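Purely as an illustration of what inverting a catenary-shaped response can look like: if the raw reading behaves like m = a·cosh(x/a), the position can be recovered with x = a·acosh(m/a). The scale constant below is a made-up placeholder that would come from calibrating the actual pad.

// avr-libc has no acosh(), so use the identity acosh(y) = ln(y + sqrt(y^2 - 1)).
#include <math.h>

const float A = 0.35;                      // hypothetical catenary scale constant

float linearize(float raw) {
  float y = raw / A;
  if (y < 1.0) y = 1.0;                    // acosh is only defined for y >= 1
  return A * log(y + sqrt(y * y - 1.0));
}

// usage: float x = linearize(rawX);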

DSC_5562

plan for information flow with the sensor

DSC_5565

hypothetical layout of the sensor

DSC_5569

rudimentary visual prototype

DSC_5577

visual prototype in use

DSC_5581

visual prototype in use

DSC_5592

visual prototype in use

DSC_5599

sewing conductive fabric to make the buttons

DSC_5603

DSC_5608

sewing wires attached to conductive fabric buttons

DSC_5612

sewing the arm straps on and defining the sensing area

DSC_5614

DSC_5616

inner velostat hand-sewn (because we were afraid that the sewing machine would rip through the velostat, given its thickness), with wires at the four corners and copper tape on their ends

DSC_5617

LightBlue Bean with wires soldered on to connect it to the ADS1015 breakout, giving us four analog input pins instead of the two available on the Bean itself

DSC_5625

testing the sensor with visual output in Processing

 

circuit diagram proj 3

circuit diagram of our ADS1015 breakout connected to the LightBlue Bean

(embedded YouTube video)

getting the LightBlue Bean’s accelerometer data
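For reference, reading the Bean’s onboard accelerometer with the Bean Arduino library only takes a couple of lines; a minimal sketch roughly along the lines of what the video shows:

// Read the LightBlue Bean's built-in accelerometer and print the three
// axes over the Bean's virtual serial connection.
void setup() {
  Serial.begin(57600);
}

void loop() {
  AccelerationReading accel = Bean.getAcceleration();
  Serial.print(accel.xAxis);
  Serial.print(",");
  Serial.print(accel.yAxis);
  Serial.print(",");
  Serial.println(accel.zAxis);
  Bean.sleep(250);   // low-power sleep between readings
}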

(embedded YouTube video)

testing in progress

 

Final project:

DSC_0001

in off mode; no LED on

DSC_0002

on, as indicated by the red LED; ready to write

DSC_0003

simple controls to decide when we start, end, and give ourselves reminders

DSC_0004

LightBlue Bean circuitry and wires hidden under the fabric flap!

(embedded YouTube video)

Fabric Sensor Prototype
https://courses.ideate.cmu.edu/60-223/f2015/fabric-sensor-prototype/
Thu, 19 Nov 2015
By Rachel Nakamura (rnakamur) and Joseph Paetz (rpaetz)

We wanted to create a soft fabric sensor that one could interact with to create reminders for later, for when paper, pens, notebooks, or other ways to record a reminder aren’t immediately available.

(embedded imgur photos)

The idea of creating something more wearable and natural than a smartwatch is what first appealed to us. We experimented a lot with various materials, mesh spacing fabrics, connector arrangements, and, in particular, our calibration algorithms.

(embedded imgur photos)

(embedded YouTube video)

 

We also played around with the LightBlue Bean and got it to talk to a MacBook, but getting an Android phone to control it is another problem entirely.

Further, more detailed documentation of our working process over the past few weeks can be found here:

(embedded Google Doc)

Sensing Prototype: Collabright
https://courses.ideate.cmu.edu/60-223/f2015/collabright/
Tue, 22 Sep 2015

Ninety percent of communication is said to be nonverbal. Though everyday life is full of nonverbal communication, such as traffic signals, morning alarms, and even facial expressions, people often find themselves unable to work as a team once they can no longer speak to one another.

So our group posed the question: what if we forced two people to work together by communicating through outputs that appeal to two different senses?

Collabright bridges two users by letting one control sound output, in the form of the classic “Happy Birthday,” and the other control visual output, in the form of a strip of LED lights. The sound user controls the song by running a piece of wire down a length of rubber tubing at varying speeds and positions, while the LED user controls the lights by running a piece of wire around a color wheel, also made of rubber tubing.
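The electronics aren’t detailed here, but as a rough sketch of the idea, assuming an Arduino-style setup where each wire-and-tubing control reads as a simple analog input and the speaker and LEDs hang off ordinary pins (all pins, note values, and mappings below are hypothetical):

// One analog control scrubs through the notes of "Happy Birthday" on a
// speaker; the other sets the LED color through three PWM channels.
const int SONG_CONTROL_PIN  = A0;  // wire position along the straight tubing
const int COLOR_CONTROL_PIN = A1;  // wire position around the color wheel
const int SPEAKER_PIN = 8;
const int RED_PIN = 5, GREEN_PIN = 6, BLUE_PIN = 9;

// First phrase of "Happy Birthday" (note frequencies in Hz).
const int MELODY[] = {262, 262, 294, 262, 349, 330};
const int NUM_NOTES = sizeof(MELODY) / sizeof(MELODY[0]);

void setup() {
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
  pinMode(BLUE_PIN, OUTPUT);
}

void loop() {
  // Sound user: map the wire's position along the tubing to a note.
  int songReading = analogRead(SONG_CONTROL_PIN);
  int noteIndex = map(songReading, 0, 1023, 0, NUM_NOTES - 1);
  tone(SPEAKER_PIN, MELODY[noteIndex], 100);

  // Light user: crudely spread the color-wheel position across the RGB channels.
  int colorReading = analogRead(COLOR_CONTROL_PIN);
  int c = map(colorReading, 0, 1023, 0, 255);
  analogWrite(RED_PIN, c);
  analogWrite(GREEN_PIN, 255 - c);
  analogWrite(BLUE_PIN, (c * 2) % 256);

  delay(100);
}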

The idea is that the users can create an experience together that is more powerful than when either one works alone.
