Project Documentation – F15 60-223: Intro to Physical Computing
https://courses.ideate.cmu.edu/60-223/f2015
Carnegie Mellon University, IDEATE
Thu, 17 Dec 2015 20:19:25 +0000

D3b: Portal Cam
https://courses.ideate.cmu.edu/60-223/f2015/d3b-portal-cam/
Thu, 17 Dec 2015 17:50:32 +0000

Portal Cam is an exploration in bringing more community awareness to the CodeLab. In this iteration, I focused on making it easier to use. Inspired by the Polaroid instant camera, I built a camera case with a button that resembles the shutter button on a camera, so users can intuitively understand the design.

During the test period, several people in the lab tried it, and they could operate the camera easily. Most of them were surprised when another person’s picture printed out, and generally they found it an interesting experience.

The camera consists of a Raspberry Pi, a thermal receipt printer by Adafruit, and a USB webcam. You can find the complete code on my GitHub.
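The complete code is in the GitHub repo; as a rough sketch of the shutter flow (not the author's actual code), the interesting part is that pressing the button captures *your* photo but prints the *previous* person's. The sketch below assumes the common `fswebcam` CLI for capture and leaves printing as a hypothetical helper:

```python
import subprocess

def pick_previous(history, new_photo):
    """Return the photo to print: the most recent one captured before
    new_photo, or None if this is the very first capture."""
    previous = history[-1] if history else None
    history.append(new_photo)
    return previous

def capture(path):
    # fswebcam is a common CLI for grabbing a frame from a USB webcam
    # on the Raspberry Pi; resolution here is illustrative.
    subprocess.run(["fswebcam", "-r", "640x480", path], check=True)

# On a button press the loop would be roughly:
#   capture(new_path)
#   to_print = pick_previous(history, new_path)
#   if to_print: send to_print to the thermal printer (hypothetical step)
```

The first visitor gets no printout; everyone after them receives the photo of whoever came before.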

D3a: Portal Cam
https://courses.ideate.cmu.edu/60-223/f2015/d3a-portal-cam/
Thu, 17 Dec 2015 17:14:25 +0000

The aim of this project is to bring more awareness of community to the CodeLab. The CodeLab has a diverse group of graduate students from Computational Design, Emerging Media, and Tangible Interaction Design. I designed a camera that takes a picture of you but prints out the previous person’s picture, so you can know who was at the lab. Hopefully, it can make people more engaged in the community.

Final Project: Dynamic, Dioramic Exploration of Christmas Over Time
https://courses.ideate.cmu.edu/60-223/f2015/final-project-dynamic-dioramic-exploration-of-christmas-over-time/
Thu, 17 Dec 2015 16:55:00 +0000

by Rachel Nakamura (rnakamur) and Joseph Paetz (rpaetz)

We have created ChristmasViewfinder, a diorama with actuated, moving pieces that compares Christmas as it relates to today’s modern, capitalism-fueled society against its religious origins.

We juxtaposed modern Christmas commercials with the religious figures of the well-known nativity scene. Additionally, different parts of the display change every time the viewer pulls the viewfinder lever. This allows us to tell a multitude of narratives about Christmas’ varying religious and capitalist meanings.

In order to implement this diorama, we used an Arduino Leonardo as the main brain to control all of the actuation. We used two relays to switch the high-voltage lines going to the Christmas lights and to an incandescent bulb (which lights up old projector slides). The Arduino’s PWM pins drive MOSFETs to control the LED strips in the ceiling of the diorama, and Adafruit’s 12-bit PWM driver controls additional servos and individual LEDs.

The Arduino Leonardo was also used as a USB keyboard to control a Processing sketch on the computer.
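The "different parts change on every lever pull" behavior boils down to cycling through a scene table. A minimal Python sketch of that idea (the scene names and channel values below are illustrative assumptions, not from the project; the real logic lives in the Arduino and Processing code linked below):

```python
# Hypothetical scene table: each lever pull advances to the next scene,
# which would set the relays (lights), LED-strip brightness, and servos.
SCENES = [
    {"name": "nativity",   "relay_lights": True,  "led_strip": 40},
    {"name": "commercial", "relay_lights": False, "led_strip": 255},
    {"name": "projector",  "relay_lights": False, "led_strip": 10},
]

class Viewfinder:
    def __init__(self, scenes):
        self.scenes = scenes
        self.index = -1  # no scene shown yet

    def pull_lever(self):
        """Advance to the next scene, wrapping around at the end."""
        self.index = (self.index + 1) % len(self.scenes)
        return self.scenes[self.index]
```

Cycling a fixed table like this is what lets the diorama tell a multitude of narratives from a single mechanical input.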

Link to YouTube video detailing how to create a video player in processing: https://youtu.be/ayZIxo3TeXM

Link to Processing documentation of video player (Movie): https://processing.org/reference/libraries/video/Movie.html

Link to Github code: https://github.com/arathorn593/Christmas_Viewfinder

 

Final Project: Journey Experience
https://courses.ideate.cmu.edu/60-223/f2015/final-project-journey-experience/
Thu, 17 Dec 2015 16:42:22 +0000

By: Roberto Andaya


The Journey Experience is an extension of the Journey Car from a previous project. It’s an attempt to give the game more meaning and to increase player cooperation compared to the previous iteration.

To reach the goal of creating cooperation, I needed two 3pi-based systems controlled by Wixel chips, which allow remote control of each 3pi’s movements. There would also need to be a camera system so that each player could see the other.

[Photo: DSC_0005]

Above is a picture of the main course, constructed in three sections: Meeting, Puzzle, and Conclusion. In the Meeting section, the 3pi robots meet one another and enter the second section together. Once the door opens, they can complete a puzzle by building a ramp; when the ramp is assembled, they climb it to reach the next area. The final area is the conclusion of the “journey” they have undertaken, and a message is displayed.

 

[Photo: DSC_0006]

The door system in the first area is powered by an Arduino; it is triggered by a photoresistor and driven by a motor.
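The trigger logic amounts to a light threshold: when a robot blocks the photoresistor, the reading drops and the motor opens the door. Here is that idea sketched in Python with hysteresis so sensor noise near the threshold doesn't rattle the motor (the actual code is the linked Arduino sketch; the threshold values here are illustrative):

```python
class DoorTrigger:
    """Open the door when the light reading drops below one threshold;
    close it again only when the reading rises above a higher one."""

    def __init__(self, open_below=300, close_above=400):
        self.open_below = open_below
        self.close_above = close_above
        self.open = False

    def update(self, light_reading):
        """Feed one analog reading (0-1023); returns True while the door
        should be open."""
        if not self.open and light_reading < self.open_below:
            self.open = True       # robot detected: drive motor open
        elif self.open and light_reading > self.close_above:
            self.open = False      # light restored: drive motor closed
        return self.open
```

The gap between the two thresholds is what keeps a reading hovering around a single cutoff from toggling the motor on every sample.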

Arduino Code

[Photo: IMAG0895]

Here is the schematic of the Arduino setup.

[Photo: DSC_0009]

This is what the puzzle ramp structure is generally composed of. The robots have to work together to move these pieces into the correct positions. All of the pieces were laser-cut and made specifically so that the robots can move up the ramp without getting stuck on any inconsistencies.

[Photo: DSC_0008]

This is the end-ramp structure that the 3pi robots descend upon completing their experience, with a message at the end.

 

 

 

Final Project: Crash Helmet
https://courses.ideate.cmu.edu/60-223/f2015/final-project-crash-helmet/
Thu, 17 Dec 2015 16:31:20 +0000

By Varun Gupta (varung1) and Craig Morey (cmorey)

The Crash Helmet is a bike helmet that marks out, for the rest of the Pittsburgh community, locations where people have been killed while cycling. The helmet currently uses the location of the recent death of cyclist Susan Hicks near Forbes and Bellefield.

The Crash Helmet uses a LightBlue Bean to activate a high-wattage LED that is powered by a lithium polymer battery. The LightBlue Bean connects to an iPhone over Bluetooth, and when the iPhone comes within 20 meters of a crash site, the LED on the helmet lights up.
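The "within 20 meters of a crash site" behavior is a geofence test. A minimal sketch of that check using the haversine great-circle distance (the coordinates below are approximate placeholders for Forbes and Bellefield, not the exact values from our app):

```python
import math

CRASH_SITE = (40.4446, -79.9496)   # approximate, illustrative coordinates
RADIUS_M = 20.0

def haversine_m(a, b):
    """Great-circle distance in meters between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + \
        math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371000.0 * math.asin(math.sqrt(h))

def led_should_light(fix):
    """True when the phone's GPS fix is inside the crash-site geofence."""
    return haversine_m(fix, CRASH_SITE) <= RADIUS_M
```

When this test passes on the phone, the app signals the LightBlue Bean over Bluetooth to switch the LED on.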

A previous version of the project can be found here.

Below is a Fritzing diagram of our electronics:

[Fritzing diagram: crash_helmet_v3]

A video of the project in action is below:

Vimeo / Varun Gupta

 

A photo of Craig wearing the helmet is also below:

[Photo: IMG_6928]

Final Project: Telemouths Take 2
https://courses.ideate.cmu.edu/60-223/f2015/final-project-telemouths-take-2/
Thu, 17 Dec 2015 09:28:17 +0000

Telemouths is an experiment in creating tools that allow for video-game-style storytelling in a physical setting. Version 1 allowed a remote user to speak through a mask on another participant’s face; version 2 attempts to incorporate the beginnings of game logic by automating speech, triggered by a user’s proximity to points of interest.

YouTube / Dan Sakamoto

Version 2 is a mixed success. Still to be resolved are getting a tone in the desired frequency range to be omnidirectional, as well as some difficulty getting the iPhone microphone to hear those frequencies. Finally, the signal-producing circuit would drift in pitch when used to generate high enough tones, so it had to be set aside for now.

Speech snippets in this version are taken from “The Sims 2”.

Documentation of iteration 1 can be found here.
The signal-generating circuit is based on this tutorial, just with one of the resistors changed in order to change the frequency.
Unity code is on GitHub. Note that it is the same repository as the last iteration, with the new version of the Unity file added.
Dog whistle app (used in place of signal-generating circuit) was downloaded from the Apple App Store here.
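If the tutorial circuit is the common 555-timer astable (an assumption; the source doesn't say which circuit the tutorial uses), the resistor swap changes the output frequency according to the standard relation f = 1.44 / ((R1 + 2·R2) · C):

```python
def astable_freq_hz(r1_ohms, r2_ohms, c_farads):
    """Output frequency of a 555-timer astable oscillator."""
    return 1.44 / ((r1_ohms + 2 * r2_ohms) * c_farads)
```

Pushing the tone toward the near-ultrasonic range means shrinking the resistance or capacitance, which is exactly where component tolerances start to matter and the pitch drift described above becomes a problem.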

Final Project: Portal Cam
https://courses.ideate.cmu.edu/60-223/f2015/final-project-portal-cam/
Wed, 16 Dec 2015 22:38:00 +0000

Portal Cam connects different places and different times. It conveys the moment through something physical rather than digital.

In this iteration, I established a Wi-Fi connection between two cameras. When the user presses the button on one of the cameras, it first takes a picture of them and uploads it to the other camera. It then prints the latest picture from the other camera’s uploads.

A Portal Cam consists of a Raspberry Pi, a USB webcam, and a thermal receipt printer. The case is laser-cut black acrylic. The sync mechanism is achieved using Dropbox-Uploader, and all of the programs are written in Python 2. You can find the code on my GitHub.

One particularly difficult thing was connecting the Raspberry Pi to the CMU-SECURE Wi-Fi. I uploaded my network configuration file to GitHub for your reference.
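The "print the other camera's latest upload" step reduces to filtering the shared folder by camera and taking the newest file. A sketch of that selection (not the actual code; it assumes a hypothetical naming scheme where each camera prefixes uploads with its own ID and a fixed-width timestamp, e.g. `camA_1450300000.jpg`, so names sort chronologically):

```python
def latest_from_other(filenames, my_id):
    """Among files in the shared Dropbox folder, return the newest one
    NOT uploaded by this camera, or None if there isn't one yet.
    Assumes fixed-width timestamps so lexicographic order is time order."""
    others = [f for f in filenames if not f.startswith(my_id + "_")]
    return max(others, default=None)
```

In the real system, Dropbox-Uploader handles moving the files; Python only has to choose which one to feed to the thermal printer.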

Final Project: Repman
https://courses.ideate.cmu.edu/60-223/f2015/final-project-repman/
Wed, 16 Dec 2015 20:52:40 +0000

By: Jonathan Dyer, Anatol Liu, Kiran Matharu

YouTube / Jonathan Dyer

This project is a second iteration of the project documented here: Repman Part 1

To improve on this project, we did three main things: improved the appearance of the band, created an app, and improved the signal processing.

First, we improved the appearance by making a more robust button. From our first iteration we learned that users really liked our logo, so we decided to make the logo itself the button. To do so, we had to make the button larger and move the dumbbells from the top of the logo to the side, allowing us to cut the logo entirely out of conductive fabric. We then used fabric adhesive to attach it to the wristband and conductive thread to connect it to the rest of the circuit.


Second, we created an app that allows the user to interface with the wristband. The user can input the number of reps, the number of sets, and the rest period between sets.


Lastly, we improved the signal processing by tuning various parameters of the peak detection. The code for both the app and the Arduino sketch for the LightBlue Bean can be found here: App/Arduino Code
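The rep counter itself is a peak detector over the accelerometer signal. A Python sketch of the idea (the firmware is the Arduino code linked above; the threshold and spacing values here are illustrative examples of the kind of parameters we tuned):

```python
def count_reps(samples, threshold=1.5, min_gap=5):
    """Count reps in a signal: a sample counts as a rep-peak if it is a
    local maximum, exceeds the threshold, and is at least min_gap samples
    after the last counted peak (so one rep isn't counted twice)."""
    count, last_peak = 0, -min_gap
    for i in range(1, len(samples) - 1):
        is_peak = samples[i - 1] < samples[i] >= samples[i + 1]
        if is_peak and samples[i] > threshold and i - last_peak >= min_gap:
            count += 1
            last_peak = i
    return count
```

Raising the threshold rejects small jitters; widening the gap rejects double-counting a single shaky rep. Tuning those two against real workout traces was the bulk of the signal-processing work.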


The circuit diagram is below:

Repman Circuit

Final Project – Ideation
https://courses.ideate.cmu.edu/60-223/f2015/final-project-ideation/
Wed, 16 Dec 2015 03:39:37 +0000


Created by Jaime Chu and Robert Rudolph

Ideation was inspired by the students in the Physical Computing class in IDeATe. We wanted to create a project that represents all of the innovation occurring in IDeATe’s basement studios. While working on this project, we also realized that many people were not aware of IDeATe as a program, so we wanted to expose them to it through an unexpected and creative interaction by bringing Ideation to them.

We chose to place our installation in the stairwell because it is acoustically and visually interesting as a space. It was also the only place in the library where it is acceptable for students to make noise, and for this installation to capture that noise. Lastly, the stairwell ties the IDeATe program in the basement to the rest of the world in a way that wouldn’t be possible with an installation on a single floor.

Ideation responds to sounds naturally occurring in the stairwell, such as footsteps, closing doors, voices, and the elevator. It also affords unique interaction when users clap, play music, or even whistle. Each light bulb corresponds to a narrow band of frequencies: the lower the frequency, the lower the bulb (closest to the basement) that lights up, and vice versa.

Implementation

Hardware – Light bulbs became the main feature of the installation because they universally represent ideas. Our main focus in the design and implementation of any frame or shade was to highlight the light bulb and the light shining from it. The minimal shades echo the lines of the light bulb from the inside while projecting an edge-lit geometric structure from the outside. The shades are made from clear acrylic with the edges sanded down and press-fit together. The light bulbs are powered from the base of the structure and hung with flexible coated-steel wire ropes. The cables span 35 feet, beginning at the base of the third floor and anchored at the basement floor with cement blocks. In total there are eight light fixtures, which form a 3 ft x 3 ft box within the 5 ft wide stairwell; each fixture is spaced approximately 4 feet apart.

Software – Using two four-channel DMX boxes, an Arduino, and Processing, we were able to control and dim the lights depending on the noise levels within the stairwell, expressing all of the activity that happens there. When the noise level increases, the bulbs light up according to their frequency bands: bulbs closer to the basement correspond to the lower frequencies, while bulbs closer to the third floor correspond to the higher frequencies.
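The frequency-to-bulb mapping is the core of the software: split the audible band into eight bands, one per fixture, with band 0 mapped to the bulb nearest the basement. A Python sketch of that mapping (the real version lives in the Processing sketch linked below; the band edges here are illustrative, not the tuned values):

```python
# Eight bands for eight fixtures; edges in Hz are illustrative placeholders.
BAND_EDGES_HZ = [0, 100, 200, 400, 800, 1600, 3200, 6400, 20000]

def bulb_for_frequency(freq_hz, edges=BAND_EDGES_HZ):
    """Return the bulb index (0 = closest to the basement) whose band
    contains freq_hz, or None if it falls outside the mapped range."""
    for i in range(len(edges) - 1):
        if edges[i] <= freq_hz < edges[i + 1]:
            return i
    return None
```

In the installation, the amplitude within each band then sets that bulb's DMX dim level, so footsteps light the low fixtures and whistles light the high ones.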

Rhino files can be found here.
Code (Arduino + Processing).

Special thanks to Zach Ali, Ignatios Alexander, Ali Momeni, and Facilities Management Services for all of their help and support throughout this entire project.

YouTube / Bob Rudolph

Wearable: Telemouths
https://courses.ideate.cmu.edu/60-223/f2015/wearable-telemouths/
Fri, 04 Dec 2015 03:24:16 +0000

YouTube / Dan Sakamoto

Telemouths is a system that allows an actor to take control of another person’s mouth, effectively turning them into a live puppet. A participant wears the device on their face, which obscures their mouth and prevents them from speaking while imposing a new mouth via the screen. An actor can then remotely speak for the participant: by speaking into a microphone, the actor’s voice is transmitted through the mask. The actor can also control the expression of the mouth via a Photon wireless microcontroller.

The system was designed with commonly available components in mind: sound is transmitted from the actor’s laptop to the phone via voice over IP, and control signals are sent over Wi-Fi to the phone from a Photon attached to the microphone. The Photon is powered via a USB cable running alongside the microphone cable. The iPhone is attached to the face using a headband and some velcro.

Version 1 aims for accessibility in the hopes of seeing experiments in a few kinds of theatrical contexts, from possible empathy-building workshops to large-scale immersive theater. Feedback from the initial prototype showed that there was interest in the idea, but that it was difficult to imagine what the possibilities for a device like this could be without seeing it in action. As a result, a workshop was organized with a small group of theater practitioners to play and improvise and see how it felt, as can be seen in the video above. While everyone had fun, those in the room were in agreement about wanting to see next how the energy would change with a prepared text. Two people expressed interest in trying to use Telemouths in upcoming projects, so next steps will be to figure out a text to try out and prepare a workshop for version 2.


Photon code: via Github

Circuit Diagram:
Telemouths Circuit_bb
