During the test period, several people in the lab tried it and were able to operate the camera easily. Most of them were surprised when another person’s picture printed out, and overall they found it an interesting experience.
The camera consists of a Raspberry Pi, an Adafruit thermal receipt printer, and a USB webcam. You can find the complete code on my GitHub.
For my final project in physical computing, I decided to build on my progress with the Spark Wand. If you want to see how the project has evolved from its last major iteration, the link is here:
The Spark Wand is part of a multidisciplinary theatrical project integrating quadcopters, a controller (the Spark Wand), motion-capture technology, and an actor controlling the system. The pieces are integrated using the Robot Operating System (ROS), which does a great job of passing signals and coordinating operations between multiple pieces of software and hardware in a cohesive manner. My part of the project required communicating with ROS, which I did through rospy, the ROS Python client library; a sketch of what that communication can look like is below.
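As a rough illustration only — the node name, topic, and message contents here are placeholders, not the project's actual ones — a minimal rospy publisher for wand button events looks like this:

```python
# Minimal rospy sketch: publish wand button events as ROS messages.
# Node, topic, and message contents are illustrative placeholders.
import rospy
from std_msgs.msg import String

def run_wand_bridge():
    rospy.init_node('spark_wand_bridge')
    pub = rospy.Publisher('/spark_wand/buttons', String, queue_size=10)
    rate = rospy.Rate(10)  # poll at 10 Hz
    while not rospy.is_shutdown():
        # In the real wand, this event would come from the Photon over WiFi.
        pub.publish(String('button_1'))
        rate.sleep()

if __name__ == '__main__':
    try:
        run_wand_bridge()
    except rospy.ROSInterruptException:
        pass
```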
Working with the aerial robotics lab and some of the members involved in the project, I was able to create the first prototype of the Spark Wand. I call it that because it uses a component called the Spark/Particle Photon: an Arduino-like microcontroller with integrated WiFi that uses the same programming language as the Arduino and adds features such as web-based development and browser-based testing. The primary reason I chose this board was its convenient form factor. It measures 1.44 in. x 0.8 in. x 0.27 in. with headers, which makes it well suited to an easily portable, lightweight controller/wand. The Photon's size, however, was only one of several choices that kept the controller itself compact.
The controller itself has 10 buttons and a laser button that relay information into a “dashboard” included with Particle's development environment; it also sends information through ROS. The dashboard lets me monitor the signals, count how many times they come in, see the time each signal is received, and view a graphical representation of them. An example of what it looks like is shown below:
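The same events the dashboard visualizes can also be read programmatically from Particle's cloud server-sent-event stream. The sketch below is a hypothetical illustration using Python's requests library; the access token is a placeholder, and this is not the project's actual monitoring code:

```python
# Hypothetical sketch: tail the Particle cloud's server-sent-event stream,
# which carries the same published events the dashboard displays.
import requests

ACCESS_TOKEN = 'YOUR_PARTICLE_ACCESS_TOKEN'  # placeholder
STREAM_URL = 'https://api.particle.io/v1/devices/events'

with requests.get(STREAM_URL,
                  params={'access_token': ACCESS_TOKEN},
                  stream=True) as resp:
    for line in resp.iter_lines(decode_unicode=True):
        if line:  # 'event: ...' and 'data: {...}' lines, one pair per signal
            print(line)
```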
The wand is made to be comfortable, easy to use, and lightweight, with no external wires or cables. On even a decent WiFi connection the latency is not noticeable, and the system provides reliable results. In this revision of the project, I focused primarily on making the form factor sleeker and more comfortable. The most noticeable change is that there are no exposed wires: I enclosed the electronics in a 3D-printed case that houses all of the components and buttons while leaving the Photon exposed for feedback and access.
The main features are an on/off switch for power control; a 1C LiPo battery that provides power in a sleek package; custom grippers that hold the customized laser pointer, saving space while keeping the laser in a constant position; and a protoboard that brings all of the wires and buttons into one compact form factor.
We have created the Christmas Viewfinder, a diorama with actuated, moving pieces comparing Christmas in today’s modern, capitalism-fueled society against its religious origins.
We juxtaposed modern Christmas commercials with the religious figures of the well-known nativity scene. Additionally, different parts of the display change every time the viewer pulls the viewfinder lever. This allows us to tell a multitude of narratives about Christmas’ varying religious and capitalist meanings.
In order to implement this diorama, we used an Arduino Leonardo as the main brain controlling all of the actuation. We used two relays to switch the high-voltage lines going to the Christmas lights and to an incandescent bulb (which lights up old projector slides). The Arduino's PWM pins drive MOSFETs that control the LED strips in the ceiling of the diorama, and Adafruit's 12-bit PWM driver controls additional servos and individual LEDs.
The Arduino Leonardo was also used as a USB keyboard to control a Processing sketch on the computer.
Link to YouTube video detailing how to create a video player in Processing: https://youtu.be/ayZIxo3TeXM
Link to Processing documentation of video player (Movie): https://processing.org/reference/libraries/video/Movie.html
Link to GitHub code: https://github.com/arathorn593/Christmas_Viewfinder
The Journey Experience is an extension of the Journey Car from a previous project. It is an attempt to give the game more meaning and to increase player cooperation compared to the previous iteration.
To reach the goal of creating cooperation, I needed to build two 3pi-based robots controlled by Wixel chips, which allow the 3pis' movements to be driven remotely. There also needed to be a camera system so that each player could see the other.
Above is a picture of the main course, constructed in three sections: Meeting, Puzzle, and Conclusion. In the Meeting section, the 3pi robots meet one another and enter the second section together. Once the door opens, they can complete a puzzle by assembling a ramp; when the ramp is assembled, they climb it to reach the next area. The final area is the conclusion of the “journey” they have undertaken, where a message is displayed.
The door system of the first area is powered by an Arduino, triggered by a photoresistor, and moved by a motor.
Here is the schematic of the Arduino setup:
This is what the puzzle ramp structure is generally composed of. The robots have to work together to move these pieces into the correct positions. All of the pieces were laser-cut and specifically designed so that the robots can move up the ramp without getting stuck on any inconsistencies.
This is the final ramp structure that the 3pi robots go down upon completing their experience, with a message waiting at the end.
The Crash Helmet is a bike helmet that marks out, for the rest of the Pittsburgh community, locations where people have been killed while cycling. The helmet currently uses the location of the recent death of cyclist Susan Hicks near Forbes and Bellefield.
The Crash Helmet uses a LightBlue Bean to activate a high-wattage LED that is powered by a lithium polymer battery. The LightBlue Bean connects to an iPhone over Bluetooth, and when the iPhone comes within 20 meters of a crash site, the LED on the helmet lights up.
A previous version of the project can be found here.
Below is a Fritzing diagram of our electronics:
A video of the project in action is below:
[Video: Vimeo / Varun Gupta]
A photo of a person (Craig) wearing the helmet is also below.
[Video: YouTube / Dan Sakamoto]
Version 2 is a mixed success; still to be resolved are the issue of getting a tone in the desired frequency range to be omnidirectional, as well as some difficulty getting the iPhone microphone to hear those frequencies. Finally, the signal-producing circuit would drift in pitch when used to generate sufficiently high sounds, and had to be set aside for now.
Speech snippets in this version are taken from “The Sims 2”.
Documentation of iteration 1 can be found here.
The signal-generating circuit is based on this tutorial, with one of the resistors changed in order to change the frequency.
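As a back-of-the-envelope illustration only — assuming the tutorial's oscillator is the common 555 astable, which is an assumption rather than something stated above, and using placeholder component values — the resistor swap shifts the frequency like so:

```python
# Frequency of a standard 555 astable oscillator: f = 1.44 / ((R1 + 2*R2) * C).
# Swapping a resistor is what shifts the output pitch; values are illustrative.
def astable_555_freq_hz(r1_ohms, r2_ohms, c_farads):
    return 1.44 / ((r1_ohms + 2.0 * r2_ohms) * c_farads)

print(astable_555_freq_hz(1e3, 47e3, 1e-9))  # ~15 kHz, toward the target range
```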
Unity code is on GitHub. Note that it's the same repository as the last iteration, with the new version of the Unity file added.
The dog whistle app (used in place of the signal-generating circuit) was downloaded from the Apple App Store here.
In this iteration, I established a Wi-Fi connection between the two cameras. When the user presses the button on one camera, it first takes a picture of the user and uploads it to the other camera, and then prints the latest picture from the other camera's uploads.
A Portal Cam consists of a Raspberry Pi, a USB webcam, and a thermal receipt printer. The case is laser-cut black acrylic. The sync mechanism is built on Dropbox-Uploader. All of the programs are written in Python 2. You can find the code on my GitHub.
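As a rough sketch of the button-press flow — assuming Dropbox-Uploader's dropbox_uploader.sh is installed, and using fswebcam and lp as stand-ins for the actual capture and print code, with placeholder folder names — one camera's handler might look like this:

```python
# Sketch of one camera's button handler: capture, upload, fetch, print.
# Paths, folder names, and helper commands are illustrative placeholders.
import subprocess
import time

UPLOADER = './dropbox_uploader.sh'

def on_button_press():
    photo = 'capture_%d.jpg' % int(time.time())
    # 1. Take a picture with the USB webcam.
    subprocess.call(['fswebcam', '-r', '640x480', photo])
    # 2. Upload it for the other camera to print later.
    subprocess.call([UPLOADER, 'upload', photo, '/camera_a/' + photo])
    # 3. Download the other camera's latest upload and print it.
    subprocess.call([UPLOADER, 'download', '/camera_b/latest.jpg', 'latest.jpg'])
    subprocess.call(['lp', 'latest.jpg'])
```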
One particularly difficult part was connecting the Raspberry Pi to the CMU-SECURE Wi-Fi network. I uploaded my network configuration file to GitHub for your reference.
[Video: Vimeo / Thomas Eliot]
How it works:
Agar plates can be inserted into the viewing chamber, where photographs are taken with a Canon DSLR. A Python backend uses gphoto2 to control the camera and OpenCV to process the image. A 4-connectivity algorithm detects the areas and locations of bacterial colonies, and the colony data is transmitted to Processing via OpenOSC. The colonies are drawn in Processing and can be interacted with: clicking on a colony causes it to propagate outwards, creating tones whenever the wave intersects another colony. Tones are created by stretching or shortening a 2-second 440 Hz sinusoid clip, with the pitch of each tone based on the proportions of the areas of the expanding and colliding bacteria.
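A minimal sketch of the detect-and-transmit step, assuming OpenCV 3+ and the python-osc package (the backend's actual libraries, OSC address, and port may differ):

```python
# Detect 4-connected colony blobs in a plate photo and send each colony's
# centroid and area to Processing over OSC. Address and port are placeholders.
import cv2
from pythonosc import udp_client

def detect_and_send(image_path, host='127.0.0.1', port=12000):
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # Otsu threshold so colonies come out as white blobs on black.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    n, _, stats, centroids = cv2.connectedComponentsWithStats(binary,
                                                              connectivity=4)
    client = udp_client.SimpleUDPClient(host, port)
    for i in range(1, n):  # label 0 is the background
        cx, cy = centroids[i]
        area = int(stats[i, cv2.CC_STAT_AREA])
        client.send_message('/colony', [float(cx), float(cy), area])
```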
The setup also produced some nice photographs of the colonies themselves.
[Video: YouTube / Jonathan Dyer]
This project is a second iteration of the project documented here: Repman Part 1
To improve on this project, we did three main things: improved the appearance of the band, created an app, and improved the signal processing.
First, we improved the appearance by making a more robust button. From our first iteration we learned that users really liked our logo, so we decided to make the logo itself the button. To do so, we made the button larger and moved the dumbbells from the top of the logo to the side, allowing us to cut the logo entirely out of conductive fabric. We then used fabric adhesive to attach it to the wristband and conductive thread to connect it to the rest of the circuit.
Second, we created an app that allows the user to interface with the wristband. The user can input values for number of reps, number of sets, and rest period between sets.
Lastly, we improved the signal processing by tuning various parameters of the peak detection. The code for both the app and the Arduino sketch for the LightBlue Bean can be found here: App/Arduino Code
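To give a flavor of the parameters involved — the threshold and minimum-spacing values here are illustrative, not the firmware's actual numbers — a simple rep-counting peak detector looks like this:

```python
# Count reps as local maxima above a threshold, spaced at least `min_gap`
# samples apart so one rep's wobble isn't counted twice.
def count_reps(samples, threshold=1.5, min_gap=20):
    reps, last_peak = 0, -min_gap
    for i in range(1, len(samples) - 1):
        is_peak = (samples[i] > threshold and
                   samples[i] >= samples[i - 1] and
                   samples[i] > samples[i + 1])
        if is_peak and i - last_peak >= min_gap:
            reps += 1
            last_peak = i
    return reps
```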
The circuit diagram is below: