Daniel Morris – damorris@andrew.cmu.edu
Entanglement is a networked pair of compasses that point to each other, creating an intimate relationship between the two objects that mimics that of their owners. Each compass uses a Light Blue Bean to connect to the user's phone over Bluetooth. The phone connects to a server, receives the other phone's location, computes the bearing from its own GPS position to that location, and sends this angle to its Light Blue Bean. The Bean takes the difference between the heading the user is facing, read from the attached three-axis compass, and the angle received from the phone. The resulting angle is then represented on the LED ring, allowing the ring to point toward the other compass.
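The actual Bean code is linked below; the following is only a minimal sketch of that calculation, assuming a 16-pixel NeoPixel-style ring and using a hypothetical readHeadingDegrees() helper in place of the real magnetometer driver.

#include <Adafruit_NeoPixel.h>

const int RING_PIXELS = 16;                       // assumed ring size
Adafruit_NeoPixel ring(RING_PIXELS, 2, NEO_GRB + NEO_KHZ800);

float targetBearing = 0;                          // bearing to the other compass, sent by the phone

// Hypothetical helper: wrap whatever three-axis magnetometer is attached
// and return the current heading in degrees (0-360).
float readHeadingDegrees() {
  return 0.0;                                     // stub
}

void setup() {
  Serial.begin(57600);                            // the Bean's serial link is carried over Bluetooth
  ring.begin();
}

void loop() {
  if (Serial.available()) {
    targetBearing = Serial.parseFloat();          // phone sends the bearing as plain text, e.g. "123.4"
  }
  // Difference between where the user is facing and where the partner is.
  float delta = fmod(targetBearing - readHeadingDegrees() + 360.0, 360.0);
  int pixel = (int)(delta / 360.0 * RING_PIXELS) % RING_PIXELS;

  ring.clear();
  ring.setPixelColor(pixel, ring.Color(255, 255, 255));   // white, per the final design
  ring.show();
  delay(50);
}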
Components:
The enclosures of the compasses were created from CNC-routed and laser-cut wood. A sanded ring of acrylic was placed inside each enclosure to act as a diffuser for the light. The bottom of each compass is a press-fit piece of laser-cut wood with a small hole for removing it.
The wood was sanded and then stained. We added a feature where the compass blinks yellow when it is not receiving the GPS coordinates of the other compass. For the purposes of the video, we changed it to blink when in close proximity to the other compass. We later changed the LED colors to white.
Physical compass enclosure (CNC and laser cut files) & Light Blue Bean code can be found here
This video is dubbed over for improved audio clarity.
Abstract
These days, the line that separates the real world from the internet is becoming more and more unclear. People may start a conversation online and pick it back up right where they left off when they meet in person. However, there still exist conversations that remain wholly on the internet. These conversations are prone to misunderstandings and rapidly devolve into spitting wars of crass insults. Oddly enough, most of the participants in these conversations-turned-arguments believe that they hold themselves to a higher standard, as though they were more sophisticated and educated than the people talking in the real world.
We shed light on this strange phenomenon by recreating these online conversations in the real world. Pulling threads from reddit.com (a popular ground for online debates), we animate the conversationalists via moving cloth bodies and lights. The robotic sound of the bodies, paired with classical orchestral music, aims to highlight the dissonance between what the beings are saying and what they believe themselves to be.
Technology
The project uses an Arduino and two Raspberry Pis in a call-and-response system, with the Arduino as the master and the Pis as the slaves. The Arduino first triggers one figure's lights and fans through a relay and then sends a signal to its Pi to begin the message. The lights and fans are held on until the Arduino receives a callback from the Pi, at which point it switches over and activates the other set of electronics. This process repeats until the conversation ends.
We chose this architecture mainly because we wanted distinct sound sources, so that each voice originates from its own being. Offloading the master duties to the Arduino was necessary because the Pi cannot sink the 5 V needed to turn off the relays.
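Below is a rough sketch of the Arduino side of that handshake; the pin assignments are assumptions for illustration, not the installation's actual wiring.

const int relayPin[2] = {4, 5};   // lights and fans for each figure (via relay board)
const int cuePin[2]   = {6, 7};   // Arduino -> Pi: "begin your message"
const int donePin[2]  = {8, 9};   // Pi -> Arduino: "message finished"

void setup() {
  for (int i = 0; i < 2; i++) {
    pinMode(relayPin[i], OUTPUT);
    pinMode(cuePin[i], OUTPUT);
    pinMode(donePin[i], INPUT);
  }
}

void loop() {
  for (int i = 0; i < 2; i++) {           // alternate between the two beings
    digitalWrite(relayPin[i], HIGH);       // hold this being's lights and fans on
    digitalWrite(cuePin[i], HIGH);         // cue its Pi to start speaking
    delay(10);
    digitalWrite(cuePin[i], LOW);
    while (digitalRead(donePin[i]) == LOW) {
      // wait for the Pi's callback before switching sides
    }
    digitalWrite(relayPin[i], LOW);        // hand off to the other figure
  }
}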
Project by: Lissa Biltz, Olivia Lynn, and Kevin Wainczak
Abstract
We require all citizens to take a classification test. This test is being used to put all Americans into two groups: the straights and the others. The New Administration is requiring this to ensure the moral and spiritual sanctity of our Great Nation. This will allow us to take action to remedy any corrupted sub-groups of our population in the future.
Classification Test #4 is a system that uses three wireless sensors to measure the biometric data of users viewing images meant to evoke sexual desire in people who are not straight (or asexual). This project started as a way to innocently help people who are not straight by giving them a physical "gaydar" aid. The changing political climate has shifted the context of our project into a warning about the dangers of blindly categorizing individuals. The idea began as a concept for a fun toy, but has evolved into an intended harbinger of potential times to come.
The sensors are attached in a way similar to medical and torture devices. Our inspiration came from the Canadian Fruit Test and other historical examples of faulty tests used to determine an individual's homosexuality under the guise of medical legitimacy. We record and present the test taker's biometric data to lend our classification a sense of authenticity and to allow the participant to be more fully immersed in the experience.
Video
https://www.youtube.com/watch?v=vMFy3-idQNo&feature=youtu.be
Diagrams
Credit to:
Shuli Jiang, David Perry
October~December 2016
In his Commentary on Aristotle's Categories, Archytas writes, "Place is the first of all beings," since everything that exists is in a place and cannot exist without a place. The aim of this project is to allow people to share spaces. We start by making a unique recording device that prompts people, via written text, to "Take Me Somewhere on the CMU campus" and auto-uploads the collected videos to Dropbox. The prompt is intended to inspire users to interact with the object. Their interaction with and manipulation of the object is recorded by the device and permanently affects what it records. By placing the object around campus, we successfully collected video data from several locations.
We then constructed a space in Hunt Library's media lab to curate all the data we receive from the recording device via Dropbox. The space acts as a portal, or a window into what the object has experienced, giving users access to the entire database that the recording device has created. The presentation of the videos is inspired by the idea of the camera obscura, the natural optical phenomenon in which the scene on the far side of a screen is projected through a small hole in that screen as a reversed and inverted image on the surface opposite the opening; a camera obscura device is also the prototype of the modern camera. In a dark space, we projected the videos, masked with a dotted effect, down onto a round screen. By combining the common elements of the camera obscura (dots, projection, and a dark room), we aimed to give people the sense of a true camera obscura, with its magic of connecting them to different places from within a small space.
1. Mobile video collection device (Raspberry Pi + Pi Camera)
2. Projector.
Toot2Gether
Virality, Sensuality, and the Internet of Things
Abstract
In recent years, the role social media plays in delivering advertising content to consumers has grown enormously. While user-data analysis has become a mainstay for companies trying to target their advertising, self-propagating viral content has the power to sway much larger groups at a fraction of the cost. Many small media groups have attempted to boil down the authenticity and appeal of the independent inventor in order to package products for social media consumption, and this has had a homogenizing effect on the way viral videos are expected to look. For this project, my intent was to take this aesthetic and apply it to a supposed invention (Toot2Gether) with questionable marketability, in order to explore what these narratives of independent invention and authenticity mean without a functional product to back them up. The project builds on a product whose purpose hinges on the taboo of discussing bowel movements with other people, and serves as an exploration of what is considered acceptable to discuss through social media that might not be in offline conversations.
Technology
Communication between the toilets was handled through an OSC interface connecting both NodeMCUs to a computer running a server on the same Wi-Fi network.
When a toilet is in use, the switch (4) tells the NodeMCU (1) to send an update to the server, which in turn alerts the other NodeMCU. When activated, that node sends a signal to the Teensy (4), which plays an alert chime, and energizes the power relay (3), allowing current from the external power supply to flow through the nichrome wire under the toilet seat.
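As a rough sketch of what one NodeMCU might run, here is an ESP8266 example using the CNMAT OSC library over UDP; the pins, OSC addresses, thresholds, and network details are placeholders rather than the project's actual firmware.

#include <ESP8266WiFi.h>
#include <WiFiUdp.h>
#include <OSCMessage.h>

const char* ssid = "network";          // placeholder Wi-Fi credentials
const char* pass = "password";
IPAddress serverIp(192, 168, 1, 10);   // computer running the OSC server
const unsigned int serverPort = 8000;
const unsigned int localPort  = 9000;

const int SEAT_SWITCH = D5;            // seat switch, closes to ground when in use (assumed wiring)
const int TEENSY_CUE  = D6;            // tells the Teensy to play the alert chime
const int HEAT_RELAY  = D7;            // relay feeding the nichrome wire

WiFiUDP Udp;

void onPartnerInUse(OSCMessage &msg) {
  int inUse = msg.getInt(0);
  digitalWrite(TEENSY_CUE, inUse ? HIGH : LOW);   // alert chime
  digitalWrite(HEAT_RELAY, inUse ? HIGH : LOW);   // warm the seat
}

void setup() {
  pinMode(SEAT_SWITCH, INPUT_PULLUP);
  pinMode(TEENSY_CUE, OUTPUT);
  pinMode(HEAT_RELAY, OUTPUT);
  WiFi.begin(ssid, pass);
  while (WiFi.status() != WL_CONNECTED) delay(250);
  Udp.begin(localPort);
}

void loop() {
  // Report our own seat state to the server.
  OSCMessage out("/toilet/1/inuse");
  out.add((int)(digitalRead(SEAT_SWITCH) == LOW));
  Udp.beginPacket(serverIp, serverPort);
  out.send(Udp);
  Udp.endPacket();
  out.empty();

  // Check whether the server has relayed the partner toilet's state.
  int size = Udp.parsePacket();
  if (size > 0) {
    OSCMessage in;
    while (size--) in.fill(Udp.read());
    if (!in.hasError()) in.dispatch("/toilet/2/inuse", onPartnerInUse);
  }
  delay(200);
}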
Floffy bird is a tiny fuzzy creature that isn't afraid of any terrain. Plop it down on any path and it will follow the line all the way to the end. Watch out, though, because it still obeys some of the laws of the world: snow is cold! Floffy bird shivers a lot. Mud is tough, too, so its steps may start dragging, but slowly, slowly, floffy bird will make it through.
So what is it scared of? Humans! If you pick it up, it’ll squeak and freeze because it is just a tiny little bird. If you do decide to grab it, try rubbing its belly, and it just might calm down enough to move.
General Description and Objectives
Created by David Perry, Hajin Kim, and Lissa Biltz, Fish out of Water is an interactive fish robot built on the Pololu 3pi robot platform.
The user draws various things on a whiteboard table, and the fish reacts to them appropriately. On its own, the fish moves in a random, sinusoidal path. However, various obstacles cause the fish to change its behavior. Darker and thicker drawings are considered major obstacles and threats to the fish, so it turns around and moves in a different direction. Thinner lines are interpreted as beneficial to the fish, so it moves through them and makes happy noises.
Originally, our purpose was to make a robotic fish that could interpret user drawings exactly and emulate the very real behaviors of a fish in a fish-tank-like environment. However, we realized that the 3pi platform would be unable to easily sense the colors of whiteboard markers, and that the behavior of the fish would become repetitive too quickly. While the user can trigger various reactions from the fish, there is a lot of ambiguity in what the fish will decide is a thick or thin line, giving the impression that the fish has free will. The appearance of the fish is also not meant to replicate any specific species. So as each user draws and discovers the reactions of the fish, they create their own environment and their own specific fish in the water.
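The full source is linked below; as an illustration of the thick-versus-thin thresholding idea, here is a minimal sketch using the Pololu 3pi Arduino libraries. The darkness thresholds and motor speeds are placeholder values, and the real robot's wandering path and sound behavior are more elaborate.

#include <Pololu3pi.h>
#include <PololuQTRSensors.h>
#include <OrangutanMotors.h>
#include <OrangutanBuzzer.h>

Pololu3pi robot;
unsigned int sensors[5];

void setup() {
  robot.init(2000);
  // Calibrate the reflectance sensors (in practice the robot sweeps over
  // light and dark regions of the table while this runs).
  for (int i = 0; i < 80; i++) {
    robot.calibrateLineSensors(IR_EMITTERS_ON);
    delay(20);
  }
}

void loop() {
  robot.readLine(sensors, IR_EMITTERS_ON);        // calibrated readings, 0-1000 each
  unsigned long darkness = 0;
  for (int i = 0; i < 5; i++) darkness += sensors[i];

  if (darkness > 3000) {                          // thick, dark mark: a threat, turn around
    OrangutanMotors::setSpeeds(-40, 40);
    delay(600);
  } else if (darkness > 800) {                    // thin line: swim through and chirp happily
    if (!OrangutanBuzzer::isPlaying())
      OrangutanBuzzer::play(">g32>>c32");
  }
  OrangutanMotors::setSpeeds(40, 40);             // otherwise keep wandering
}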
Narrative
Some people consider looking at fish a relaxing activity. Amidst the stress culture of CMU, Fish out of Water was designed to calm students down in ordinary places, such as a library or a residence hall. However, since the user's drawings are what influence the fish's behavior, the experience and results change from person to person.
Work Distribution
David Perry- Created the default path for the fish. Programmed the sensors on the 3pi to recognize the difference between thin and thick lines. Programmed the behavior of the fish.
Hajin Kim- Designed the overall appearance of the fish through various prototypes. Laser cut and fabricated the fish.
Lissa Biltz- Wired the LED, explored fish sounds, and programmed the behavior of the fish.
Initial Prototype
Circuit Diagram
Link to download source code and dxf file
Given this election year's colorful presidential candidates, we decided to create something to show the parallels between the two top candidates. Presenting Angry Politics, featuring HillaryBot and TrumpBot. These two figures demonstrate their models' temperaments when given questions on subjects they do not agree with, providing angry (and humorous) responses to topics they do not like.
Implementation
Our little politicians are based on the Pololu 3pi platform, making use of its built-in line-following sensors to keep the robot on the test stage. We attached an infrared rangefinder to detect the "questions" and colored LEDs to add extra decoration to the robot.
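The full robot code is linked below; the following is only a simplified sketch of the question-detection loop, with assumed thresholds and the 3pi's onboard user LED standing in for the external colored LEDs (the stage-keeping line-following is omitted).

#include <OrangutanAnalog.h>
#include <OrangutanMotors.h>
#include <OrangutanLEDs.h>

const int RANGE_CHANNEL = 5;                           // assumed ADC channel wired to the IR rangefinder

void setup() {
}

void loop() {
  int range = OrangutanAnalog::read(RANGE_CHANNEL);    // higher reading = closer object
  if (range > 400) {                                   // a "question" piece is held up (assumed threshold)
    OrangutanLEDs::red(HIGH);                          // flash angrily
    OrangutanMotors::setSpeeds(-60, 60);               // spin away in disgust
    delay(400);
    OrangutanLEDs::red(LOW);
  } else {
    OrangutanMotors::setSpeeds(30, 30);                // otherwise pace the stage
  }
}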
We laser-cut cardboard to create a shell for the 3pi, supporting the effigies of Clinton and Trump, and to create the question pieces. This was a quick and simple way to make the pieces we wanted. After cutting the shells out and gluing on the printed pictures of the characters, they were simply attached to the top of the 3pi. For a future iteration, we would use acrylic instead. We also created a "stage" out of a cardboard sheet and a printed backdrop from the first debate to showcase the robots.
Credits:
Kai: design of laser cut shells, stage, and question pieces, video narration and editing
Kevin: robot code, video script and filming
Teddy: robot anger states, assembly of robots, video script and filming, documentation compilation
CAD files for robot shells here
Robot code here
Credit to:
Shuli Jiang, Adriel Luo
September~October 2016
We wanted to build an interesting, human-interaction-enabled robot, but in a non-traditional way: one that only rewards people who watch it patiently and invest time and energy in doing so.
Our idea developed from a motionless rock that does nothing when people can see it but moves when no one notices it (i.e., in the dark). However, a rock, as a lifeless object, is somewhat tedious. Therefore, we came up with our second model, an anglerfish. Like the rock, an anglerfish lives and moves in a dark environment, and with a model based on a living creature we could explore more of the robot's behavior and appearance. But anglerfish often give people the impression of ferocity, which we didn't want our robot to have. We wanted a robot that lives and sings blithely in the darkness and is a bit of a coward sometimes. So we developed our third and final model, the Reaper Fish, an imaginary fish. The robot interacts with people through a light sensor and a distance sensor: it reacts to obstacles through the distance sensor and to light through the light sensor.
A Reaper Fish is a famous creature living in the deepest, darkest parts of the sea. It sweeps the sea floor to harvest parts from dead fish for food and growth. When it finds a part, it assimilates that part into itself. Over time, the new part becomes a part of the Reaper Fish. Not only does the Reaper Fish consume the dead parts from other fish for nutrition, but it also preserves the fresh organs and makes them functional again. It learns how to make use of the functional organs and therefore acquires new abilities from the dead fish.
A Reaper Fish adapts to a totally dark environment with two red luminescent eyes that help it communicate with other Reaper Fish, and a special mouth that receives and sends ultrasonic waves. Sound reflects a Reaper Fish's mood: when it is happy, it makes sweet tunes; when it is in danger, it screams and its tail lights up. Because it lives in the dark, light is lethal to the Reaper Fish. When a Reaper Fish senses light, it reacts violently, then freezes and becomes motionless, as if turned to stone.
Reaper Fish Vertical View
Reaper Fish Front View
Platform:
Pololu 3pi robot development kit in Arduino IDE
Hardware:
Pololu 3pi Robot x1, small breadboard x1, 2.2 V red LED x2, 2.2 V white LED x1, 3 kΩ resistor x1
Sensors: light sensor x1, distance sensor x2 (one for decoration only and one for actual use), motion sensor x1 (for decoration), ultrasonic sensor x1 (for decoration)
Laser Cut Board x2
https://github.com/11hifish/ReaperFish.git
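The complete firmware is in the repository above; the following is only a rough illustration of the light-triggered freeze described earlier, with assumed ADC channels and thresholds (it also assumes the photoresistor divider reads higher in brighter light).

#include <OrangutanAnalog.h>
#include <OrangutanMotors.h>
#include <OrangutanBuzzer.h>
#include <OrangutanLEDs.h>

const int LIGHT_CHANNEL   = 6;      // assumed ADC channel for the light sensor
const int LIGHT_THRESHOLD = 600;    // assumed reading above which "light is lethal"

void setup() {
}

void loop() {
  if (OrangutanAnalog::read(LIGHT_CHANNEL) > LIGHT_THRESHOLD) {
    OrangutanLEDs::red(HIGH);                       // stand-in for the tail LED lighting up
    OrangutanBuzzer::play("!L16 c<c<c");            // scream
    OrangutanMotors::setSpeeds(0, 0);               // freeze, as if turned to stone
    while (OrangutanAnalog::read(LIGHT_CHANNEL) > LIGHT_THRESHOLD) {
      // stay motionless until the light goes away
    }
    OrangutanLEDs::red(LOW);
  }
  OrangutanMotors::setSpeeds(30, 30);               // otherwise sweep the sea floor in the dark
}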
Reaper Fish double shell: the bottom transparent shell holds the breadboard, and the outer dark shell serves as the actual appearance of the Reaper Fish.
Close Look
Shuli Jiang: Programming, Robot’s behavior development, Video Narration, Documentation.
Adriel Luo: Laser cutting, distance sensor incorporation, robot's music development, video editing.
In this project, we — Lucas Ochoa, a junior studying design, and Scott Donaldson, a master’s student in computational design — create a piece that simulates the suspense, tension, and psychological stresses of Russian roulette.
While the focal point of the piece is the 3D-printed physical form resembling an oversized revolver chamber, we also significantly altered the environment and experience around the object in order to heighten the drama. Rather than introducing and describing the piece in a speech, we distributed a bureaucratic form to all participants containing a disclaimer as well as possible "dangers" of the piece (both real and fantastical). Printed in Courier New and featuring an unidentifiable, corporate-seeming logo, the pseudo-legalese of the form served to distance ourselves from the project and make it appear the work of some faceless corporation with questionable motives and little to no regard for the actual safety of the participants. The date and time were hand-written in the blanks on the page; even corporations are prone to human error. The form appears below:
After reading the form, participants were asked to wait in a queue outside the room housing the project — the Martha Orringer Conference Room. We were fortunate to have pliable, willing participants for this. Lucas acted as the guard to the room, holding a box containing the marbles which he dispensed to participants before they entered the room.
No cameras were permitted inside the Martha Orringer Conference Room. However, we commissioned the following illustrations to depict what the experience might have been like:
A circuitry diagram may be found at the right.
Although it has been known to induce madness in anyone who views it, the project code is available on GitHub.