During the test period, several people in the lab tried the camera. All of them were able to operate it easily, and most were surprised when another person's picture printed out. Overall, they found it an interesting experience.
The camera consists of a Raspberry Pi, an Adafruit thermal receipt printer, and a USB webcam. You can find the complete code on my GitHub.
We have created ChristmasViewfinder, a diorama with actuated, moving pieces comparing Christmas as it relates to today's modern, capitalism-fueled society against its religious origins.
We juxtaposed modern Christmas commercials with the religious figures of the well-known nativity scene. Additionally, different parts of the display change every time the viewer pulls the viewfinder lever. This allows us to tell a multitude of narratives about Christmas’ varying religious and capitalist meanings.
To implement this diorama, we used an Arduino Leonardo as the main brain controlling all of the actuation. Two relays switch the high-voltage lines going to the Christmas lights and to an incandescent bulb (which lights up old projector slides). The Arduino's PWM pins drive MOSFETs that control the LED strips in the ceiling of the diorama, and Adafruit's 12-bit PWM driver controls additional servos and individual LEDs.
The Arduino Leonardo was also used as a USB keyboard to control a Processing sketch on the computer.
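For reference, here is a minimal sketch of how a Leonardo can emit keystrokes to a Processing sketch; the lever pin and the key sent are illustrative assumptions, not the project's actual mapping.

```cpp
#include <Keyboard.h>

const int LEVER_PIN = 7;  // hypothetical input pin wired to the viewfinder lever

void setup() {
  pinMode(LEVER_PIN, INPUT_PULLUP);
  Keyboard.begin();  // the Leonardo enumerates as a USB keyboard
}

void loop() {
  // When the lever closes the switch, send a keystroke that the
  // Processing sketch listens for (e.g. to advance the scene).
  if (digitalRead(LEVER_PIN) == LOW) {
    Keyboard.write('n');  // assumed "next scene" key
    delay(500);           // crude debounce so one pull sends one key
  }
}
```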
Link to YouTube video detailing how to create a video player in Processing: https://youtu.be/ayZIxo3TeXM
Link to Processing documentation of video player (Movie): https://processing.org/reference/libraries/video/Movie.html
Link to Github code: https://github.com/arathorn593/Christmas_Viewfinder
The Journey Experience is an extension of the Journey Car from a previous project. It is an attempt to give the game more meaning and to increase player cooperation compared to the previous iteration.
To reach that goal of cooperation, I needed to build two 3pi robots controlled by Wixel radios, allowing each player to drive a 3pi's movements remotely. There would also need to be a camera system so that the players could see one another; a rough sketch of the remote-control idea follows.
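This sketch of the robot side assumes the Wixel pair acts as a transparent wireless serial link and uses the Pololu AVR library's motor and serial routines; the command letters and motor speeds are made up for illustration.

```cpp
#include <pololu/3pi.h>

int main() {
  serial_set_baud_rate(9600);  // must match the Wixel's configured baud rate
  char cmd;
  while (1) {
    // Wait up to 100 ms for a command byte from the remote player.
    if (serial_receive_blocking(&cmd, 1, 100)) {
      set_motors(0, 0);  // timed out with no command: stop
      continue;
    }
    switch (cmd) {
      case 'f': set_motors(80, 80);   break;  // forward
      case 'b': set_motors(-80, -80); break;  // backward
      case 'l': set_motors(-40, 40);  break;  // spin left
      case 'r': set_motors(40, -40);  break;  // spin right
      default:  set_motors(0, 0);     break;  // unknown byte: stop
    }
  }
}
```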
Above is a picture of the main course, which is constructed in three sections: Meeting, Puzzle, and Conclusion. In the Meeting section, the 3pi robots meet one another and enter the second section together. Once the door opens, they can complete a puzzle by building a ramp; when the ramp is assembled, they climb it to reach the next area. The final area is the conclusion of the "journey" they have undertaken, where a closing message is delivered.
The door system of the first area is powered by an Arduino, which is triggered by a photoresistor and drives the door with a motor.
Here is the schematic of the Arduino setup:
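As a rough illustration of the door logic (the pin numbers and light threshold are assumptions, not values from the actual schematic):

```cpp
const int SENSOR_PIN = A0;   // photoresistor in a voltage divider (assumed wiring)
const int MOTOR_PIN  = 9;    // motor driver enable pin (assumed)
const int THRESHOLD  = 400;  // tune to the ambient light in the room

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  // When a robot blocks the photoresistor, the reading drops below the
  // threshold and the motor runs long enough to open the door.
  if (analogRead(SENSOR_PIN) < THRESHOLD) {
    digitalWrite(MOTOR_PIN, HIGH);
    delay(3000);  // hold the door open for the robots to pass
    digitalWrite(MOTOR_PIN, LOW);
  }
}
```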
This is what the puzzle ramp structure is generally composed of. The robots have to work together to move these pieces into the correct position. All of the pieces were laser-cut and made specifically so that the robots can move up the ramp without getting stuck on any inconsistencies.
This is the ending structure's ramp, which the 3pi robots descend upon completing their experience, with a message waiting for them at the end.
The Crash Helmet is a bike helmet that marks out, for the rest of the Pittsburgh community, locations where people have been killed while cycling. The helmet currently uses the location of the recent death of cyclist Susan Hicks near Forbes and Bellefield.
The Crash Helmet uses a LightBlue Bean to activate a high-wattage LED that is powered by a lithium polymer battery. The LightBlue Bean connects to an iPhone over Bluetooth, and when the iPhone comes within 20 meters of a crash site, the LED on the helmet lights up.
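For illustration, the Bean-side firmware can be as simple as the sketch below; the LED pin and the one-character protocol are assumptions rather than the project's actual code.

```cpp
const int LED_PIN = 2;  // drives the high-wattage LED's transistor (assumed pin)

void setup() {
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(57600);  // on the Bean, Serial is a virtual port over Bluetooth LE
}

void loop() {
  if (Serial.available() > 0) {
    char c = Serial.read();
    // Assumed protocol: the iPhone app sends '1' when it enters the 20-meter
    // radius around a crash site and '0' when it leaves.
    if (c == '1') digitalWrite(LED_PIN, HIGH);
    if (c == '0') digitalWrite(LED_PIN, LOW);
  }
  Bean.sleep(100);  // low-power wait between checks
}
```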
A previous version of the project can be found here.
Below is a Fritzing diagram of our electronics:
A video of the project in action is below:
Video: Vimeo / Varun Gupta
A photo of a person (Craig) wearing the helmet is also below.
Video: YouTube / Dan Sakamoto
Version 2 is a mixed success. Still to be resolved are the problem of making a tone in the desired frequency range omnidirectional, as well as some difficulty getting the iPhone microphone to hear those frequencies. Finally, the signal-producing circuit's pitch would drift when used to generate high enough tones, so it had to be set aside for now.
Speech snippets in this version are taken from "The Sims 2".
Documentation of iteration 1 can be found here.
The signal-generating circuit is based on this tutorial, just with one of the resistors changed in order to change the frequency.
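Assuming the tutorial uses the common 555-timer astable configuration (an assumption; the exact circuit is in the linked tutorial), the output frequency is set by the two timing resistors and the capacitor:

$$f \approx \frac{1.44}{(R_1 + 2R_2)\,C}$$

so swapping in a smaller resistor raises the pitch, which would explain why changing a single resistor was enough to move the tone into the desired high-frequency range.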
Unity code is on GitHub. Note that it's the same repository as the last iteration, with the new version of the Unity file added.
The dog whistle app (used in place of the signal-generating circuit) was downloaded from the Apple App Store here.
In this iteration, I established a Wi-Fi connection between two cameras. When the user presses the button on one of the cameras, it first takes a picture of them and uploads it to the other camera. It then prints the latest picture from the other camera's uploads.
A Portal Cam consists of a Raspberry Pi, a USB webcam, and a thermal receipt printer. The case is laser-cut black acrylic. The sync mechanism is achieved using Dropbox-Uploader. All of the programs are written in Python 2. You can find the code on my GitHub.
One particularly difficult thing was connecting the Raspberry Pi to CMU-SECURE Wi-Fi. I uploaded my network configuration file to the GitHub repository for your reference.
Video: YouTube / Jonathan Dyer
This project is a second iteration of the project documented here: Repman Part 1
To improve on this project, we did three main things: improved the appearance of the band, created an app, and improved the signal processing.
First, we improved the appearance by making a more robust button. From our first iteration we learned that users really liked our logo, so we decided to make the logo itself the button. To do so, we had to make the button larger and move the dumbbells from the top of the logo to the side, allowing us to cut the logo completely out of conductive fabric. We then used fabric adhesive to attach it to the wristband and conductive thread to connect it to the rest of the circuit.
Second, we created an app that allows the user to interface with the wristband. The user can input values for number of reps, number of sets, and rest period between sets.
Lastly, we improved the signal processing by tuning the various parameters of the peak detection. The code for both the app and the Arduino sketch for the LightBlue Bean can be found here: App/Arduino Code
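To illustrate the kind of parameters involved (the pin, threshold, and minimum gap below are placeholder values, not our tuned ones), a simple peak detector counts a rep when the signal crosses a threshold and then ignores further crossings for a minimum interval:

```cpp
const int SENSOR_PIN = A0;          // analog signal from the band's sensor (assumed)
const int THRESHOLD = 600;          // peak threshold, one of the tuned parameters
const unsigned long MIN_GAP = 800;  // ms between peaks; rejects double counts

int reps = 0;
unsigned long lastPeak = 0;

void setup() {
  Serial.begin(57600);
}

void loop() {
  int value = analogRead(SENSOR_PIN);
  unsigned long now = millis();
  // Count a rep on an above-threshold reading, but only if enough time
  // has passed since the last counted peak.
  if (value > THRESHOLD && (now - lastPeak) > MIN_GAP) {
    reps++;
    lastPeak = now;
    Serial.println(reps);  // report the new count, e.g. back to the app
  }
}
```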
The circuit diagram is below:
Created by Jaime Chu and Robert Rudolph
Ideation was inspired by the students in the Physical Computing class in IDeATe. We wanted to create a project that represents all of the innovation occurring in IDeATe's basement studios. While working on this project, we also realized that many people were not aware of IDeATe as a program, so we wanted to expose them to it in an unexpected and creative interaction by bringing Ideation to them.
We chose to place our installation in the stairwell because it is acoustically and visually interesting as a space. It was also the only place in the library where it is acceptable for students to make noise, and thus for this installation to capture that noise. Lastly, the stairwell ties the IDeATe program in the basement to the rest of the world in a way that wouldn't be possible with an installation on a single floor.
Ideation responds to sounds that occur naturally in the stairwell, such as footsteps, doors closing, voices, and the elevator. However, it also provides a unique interaction when users clap, play music, or even whistle. Each light bulb corresponds to a narrow band of frequencies: the lower the frequency, the lower the bulb that lights up (the lowest being closest to the basement), and vice versa.
Implementation
Hardware – Light bulbs became the main feature of the installation because they universally represent ideas. Our main focus in designing and implementing the frame and light shades was to highlight the light bulb and the light shining from it. The minimal shades echo the lines of the light bulb from the inside, while projecting an edge-lit geometric structure from the outside. The shades are made from clear acrylic with the edges sanded down and press-fit together. The light bulbs are powered from the base of the structure and hung with flexible coated steel wire rope. The cables span 35 feet, beginning at the base of the third floor and anchored at the basement floor with cement blocks. In total there are eight light fixtures, which create a 3 ft x 3 ft box within the 5 ft wide stairwell; each fixture is spaced approximately 4 feet apart.
Software – Using two four-channel DMX boxes, an Arduino, and Processing, we control and dim the lights depending on the noise levels within the stairwell, expressing all of the activity that happens there. When the noise level increases, the light bulbs light up according to the different frequency levels: the bulbs closer to the basement correspond to the lower frequencies, while the bulbs closer to the third floor correspond to the higher frequencies.
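A minimal sketch of the Arduino side, assuming Processing does the frequency analysis and streams one brightness byte per fixture over serial, and assuming the DmxSimple library drives the DMX boxes (the actual channel mapping may differ):

```cpp
#include <DmxSimple.h>

const int NUM_FIXTURES = 8;  // one DMX channel per bulb, lowest band first

void setup() {
  Serial.begin(115200);
  DmxSimple.usePin(3);                 // DMX output pin (assumed wiring)
  DmxSimple.maxChannel(NUM_FIXTURES);
}

void loop() {
  // Processing sends frames of eight brightness bytes, one per frequency
  // band, ordered from the basement bulb up to the third-floor bulb.
  if (Serial.available() >= NUM_FIXTURES) {
    for (int ch = 1; ch <= NUM_FIXTURES; ch++) {
      DmxSimple.write(ch, Serial.read());  // 0-255 dim level for this bulb
    }
  }
}
```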
Rhino files can be found here.
Code (Arduino + Processing).
Special thanks to Zach Ali, Ignatios Alexander, Ali Momeni, and Facilities Management Services for all of their help and support throughout this entire project.
Video: YouTube / Dan Sakamoto
Telemouths is a system that allows an actor to take control of another person's mouth, effectively turning them into a live puppet. A participant wears the device on their face, obscuring their mouth and preventing them from speaking, while a new mouth is imposed via the screen. An actor is then able to speak for the participant remotely: by speaking into a microphone, their voice is transmitted through the mask. The actor can also control the expression of the mouth via a Photon wireless microcontroller.
The system was designed with commonly available components in mind: sound is transmitted from the actor's laptop to the phone via voice over IP, and control signals are sent over Wi-Fi from a Photon attached to the microphone to the phone. The Photon is powered via a USB cable running alongside the microphone cable. The iPhone can be attached to the face using a headband and some Velcro.
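As a sketch of the Photon side (the button pins and event name are assumptions, not necessarily what the project code uses), buttons near the microphone can publish expression events through the Particle cloud for the phone app to subscribe to:

```cpp
const int SMILE_PIN = D2;  // hypothetical expression buttons wired near the mic
const int FROWN_PIN = D3;

void setup() {
  pinMode(SMILE_PIN, INPUT_PULLUP);
  pinMode(FROWN_PIN, INPUT_PULLUP);
}

void loop() {
  // Publish an event the phone app can subscribe to; "telemouths/expression"
  // is an assumed event name.
  if (digitalRead(SMILE_PIN) == LOW) {
    Particle.publish("telemouths/expression", "smile");
    delay(1000);  // debounce and respect the cloud's ~1 event/sec publish limit
  }
  if (digitalRead(FROWN_PIN) == LOW) {
    Particle.publish("telemouths/expression", "frown");
    delay(1000);
  }
}
```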
Version 1 aims for accessibility in the hopes of seeing experiments in a few kinds of theatrical contexts, from possible empathy-building workshops to large-scale immersive theater. Feedback from the initial prototype showed that there was interest in the idea, but that it was difficult to imagine what the possibilities for a device like this could be without seeing it in action. As a result, a workshop was organized with a small group of theater practitioners to play and improvise and see how it felt, as can be seen in the video above. While everyone had fun, those in the room were in agreement about wanting to see next how the energy would change with a prepared text. Two people expressed interest in trying to use Telemouths in upcoming projects, so next steps will be to figure out a text to try out and prepare a workshop for version 2.
Photon code: via Github
Circuit Diagram: