[Vimeo video by Thomas Eliot]
How it works:
Agar plates can be inserted into the viewing chamber, where photographs are taken with a Canon DSLR. A Python backend uses gphoto2 to control the camera and OpenCV to process the image. A 4-connectivity connected-component algorithm detects the area and location of each bacterial colony. Colony data is transmitted to Processing via OpenOSC. The colonies are drawn in Processing and can be interacted with: clicking on a colony causes a wave to propagate outwards from it, creating tones whenever the wave intersects another colony. Tones are created by stretching or shortening a 2-second 440 Hz sinusoid clip, with the pitch determined by the ratio of the areas of the expanding and colliding colonies.
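For reference, here is a minimal Python sketch of the capture-detect-send pipeline described above. It assumes the gphoto2 command-line tool and the python-osc package; the OSC address, port, and noise threshold are hypothetical placeholders, not the values the actual backend uses.

```python
# Minimal sketch: capture a plate photo, find colonies with 4-connectivity,
# and send each colony's position and area to Processing over OSC.
import subprocess
import cv2
from pythonosc.udp_client import SimpleUDPClient

def capture_plate(filename="plate.jpg"):
    # Trigger the Canon DSLR over USB and download the frame via gphoto2.
    subprocess.run(["gphoto2", "--capture-image-and-download",
                    "--filename", filename, "--force-overwrite"], check=True)
    return cv2.imread(filename)

def detect_colonies(image):
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # Colonies are brighter than the agar; Otsu picks the threshold automatically.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # 4-connectivity: pixels connect only through shared edges, not corners.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask, connectivity=4)
    colonies = []
    for i in range(1, n):  # label 0 is the background
        area = int(stats[i, cv2.CC_STAT_AREA])
        if area > 20:  # drop speckle noise (threshold is a guess)
            x, y = centroids[i]
            colonies.append((float(x), float(y), area))
    return colonies

client = SimpleUDPClient("127.0.0.1", 12000)  # Processing's OSC listener (assumed)
for x, y, area in detect_colonies(capture_plate()):
    client.send_message("/colony", [x, y, area])
```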
The setup gave some nice photographs of the colonies themselves, too.
An iPhone runs OpenCV on the camera image to identify faces. It generates a name and date of death for each person and fixes them to an interpolated chest position. This is projected back onto the person with an Optoma ML750 picoprojector connected to the iPhone via a Lightning-to-HDMI adapter. The iPhone and projector are aligned and built into the form of a handgun using an OpenBeam chassis. ‘Firing’ the gun by recoiling it upwards displays a new name, detected using the iPhone’s gyroscope.
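The app itself is Obj-C (linked below), but the detection-and-overlay logic can be sketched in Python with OpenCV’s bundled Haar cascade. The name pool, date range, and chest-offset heuristic here are hypothetical stand-ins, not what the app actually uses.

```python
# Desktop sketch of the face -> chest -> label pipeline described above.
import random
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# Hypothetical name pool; one identity is held fixed at a time, and the app
# swaps it for a new one when the gun is 'fired' (gyroscope recoil).
names = ["Alex Rivera", "Sam Carter", "Jordan Lee"]
label = "%s  d. %d" % (random.choice(names), random.randint(1995, 2016))

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
        # Interpolate a chest position roughly 1.5 face-heights below the face.
        chest = (x, y + h + int(1.5 * h))
        cv2.putText(frame, label, chest, cv2.FONT_HERSHEY_SIMPLEX,
                    0.8, (255, 255, 255), 2)
    cv2.imshow("projection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```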
Previously I had received feedback on the first iteration of the project. The cardboard phone attachment was replaced with an aluminum construction, making the device more durable and professional-looking. Another criticism was that projecting just a heart onto a person is something a simple flashlight can do, so the content of the projection was expanded.
Taking the project to the next step would mean strengthening the emotional connection to the victims portrayed, since a name and date is a relatively clinical representation of a life. The device could be made into a weapon attachment for use in empathy training for police forces. Users reported that the light from the projector was too bright when pointed directly at them; reducing the brightness both improved the effect and helped the face recognition function better.
Code:
Video:
iPhone Obj-C:
[YouTube video by Bob Rudolph]
Our telepresence robot is based on the Pololu 3pi platform. We’ve added an ESP8266 wireless module that acts as a TCP client to provide remote control; it connects to the 3pi via serial. The ESP8266 uses the iPhone’s cellular data connection to reach the internet, and the iPhone runs FaceTime to provide remote audio and video.
We control the robot from a computer using a Python server and FaceTime. The server takes keyboard commands and sends them to the robot as TCP packets.
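As a rough illustration, here is a minimal Python sketch of that pattern: the ESP8266 connects as a TCP client, and keyboard commands are forwarded to it one byte at a time. The port number and one-byte command protocol are assumptions; see robotControl.py below for the real implementation.

```python
# Minimal control-server sketch: accept the robot's TCP connection, then
# forward keyboard commands to it.
import socket

HOST, PORT = "0.0.0.0", 9750  # this port must be reachable from the internet

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind((HOST, PORT))
server.listen(1)
print("waiting for the ESP8266 to connect on port %d..." % PORT)
conn, addr = server.accept()
print("robot connected from", addr)

# w/a/s/d to drive, x to stop, q to quit. This reads whole lines for
# simplicity; the real server may read raw keystrokes instead.
while True:
    cmd = input("command> ").strip().lower()
    if cmd == "q":
        break
    if cmd in ("w", "a", "s", "d", "x"):
        conn.sendall(cmd.encode("ascii"))

conn.close()
server.close()
```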
We noticed that inhabiting the robot can cause some interesting effects. Controlling the robot can give a feeling of reduced responsibility, resulting in actions that deviate from social norms. The robot lives on the floor, giving the user the perspective of a mouse; humans and their feet become dangerous. After inhabiting the robot for more than 30 minutes continuously, I became frustrated with the helplessness I felt.
Future improvements could include:
Changing the form of the robot to be more playful and inviting.
Controlling the robot using facial recognition.
Scripting interactions.
Code
Robot Control Server: robotControl.py
Note that the machine you are running the server on must have its TCP port exposed to the internet. If your machine is on the ‘CMU’ (unsecured) network, this is already the case; almost anywhere else, you will need to set up port forwarding on your router. It also helps to have a static IP, so you never have to program a new server IP into the ESP8266.
ESP8266 code: TCPClient.ino
This code is for the Arduino IDE. To install ESP8266 support, enter http://arduino.esp8266.com/stable/package_esp8266com_index.json into the Additional Board Manager URLs field in the Arduino v1.6.4+ preferences. The schematic for programming the ESP8266 can be found at http://www.xess.com/blog/esp8266-reflash/ (they use a different flashing utility, but the schematic works).
Robot code: 3pi code
Upload using Atmel AVR Studio.
Schematics
Manufacturing
Chassis: Laser cut files
Made in SolidWorks, for 1/8″ plywood.
[Vimeo video by Thomas Eliot]
Code & laser-cutter drawings
Stepper motor schematic
Light sensor schematic
[YouTube video by amazon]