I wanted to create another iteration of the previous drawing robot, this time with three smaller robots that react not only to their own lines but also to the lines created by the other robots. This interaction augments the drawings based on the movement of all three robots in space and on how long they have been running. The more lines there are, the more frantic the drawings become.
Technical Notes
I used one gearhead motor with the DRV8833 motor driver on the LightBlue Bean's PWM pins to control direction, which let me control the robots' movement wirelessly. I also used the QTR-1RC reflectance sensor to check whether the bot passes over a place it has already drawn. I used Rhino to model the box and the arm, and colorful acrylic for the parts. A shaft collar on the hinge of the arm allowed for rotation, and a screw held the arm onto the motor.
Schematic and Code
int MB1 = 6; // motor B pin 1
int MB2 = 5; // motor B pin 2
int sensor1 = 2; // change "2" to whichever pin the sensor is hooked to; just this one change lets you test on other pins
int reflectance;
int arc_size = 10; // arc duration in ms; meant to come eventually from Pure Data via [scale], where 10 is the small arc and 100 the max

void setup() {
  Serial.begin(9600);
  pinMode(MB1, OUTPUT);
  pinMode(MB2, OUTPUT);
}

void loop() {
  reflectance = 1; // reset the counter at the beginning of each loop
  pinMode(sensor1, OUTPUT);    // set pin as output
  digitalWrite(sensor1, HIGH); // set pin HIGH (5V)
  delayMicroseconds(15);       // charge the sensor capacitor for 15 microseconds
  pinMode(sensor1, INPUT);     // set pin as input
  while ((reflectance < 900) && (digitalRead(sensor1) != LOW)) { // timeout at 900
    // read the pin state, increment counter until state = LOW
    ++reflectance; // value to be displayed via serial port
    // delayMicroseconds(4); // change value or comment out to adjust value range
  }
  if (reflectance < 900) {
    Serial.println(reflectance); // send reflectance value to serial display
  } else {
    Serial.println("T.O."); // counter hit the ceiling, so it's a timeout
  }
  doForward(MB1, MB2); // motor B forward
  delay(arc_size);
  doStop(MB1, MB2);
  if (reflectance > 200) { // dark reading: we crossed an existing line
    doBackward(MB1, MB2); // motor B backward
    delay(arc_size);
    doStop(MB1, MB2);
  }
}
void doForward(int pin1, int pin2) {
  digitalWrite(pin2, LOW);
  digitalWrite(pin1, HIGH);
}

void doStop(int pin1, int pin2) {
  digitalWrite(pin2, LOW);
  digitalWrite(pin1, LOW);
}

void doBackward(int pin1, int pin2) {
  digitalWrite(pin2, HIGH);
  digitalWrite(pin1, LOW);
}
Plan: For this project I plan to make a set of clip-on drawing apparatuses that will activate any object they are attached to, turning anything into a drawing machine. Each set of clips will have a sensor and will react both to its own lines and to the lines created by other active drawing robots. The user will then be able to experiment and create drawings by attaching the clamps to various objects and observing the effect different shapes and weights have on the marks created.
Designer: Aditi Sarkar, Integrator: Claire Hentschker, Tutor and Scribe: Alice Borie
We wanted to create an autonomous drawing robot that altered its drawing in response to lines it had already created. To do so, we created a box on wheels with an arm that made marks on the table with a dry erase marker. There was a sensor at the tip of the marker, and when it sensed a line had been made in that area, it reversed its own direction temporarily in order to change the pattern being drawn.
Technical details: We used one gearhead motor with the DRV8833 motor driver on the Arduino's PWM pins to control direction. We also used the QTR-1RC reflectance sensor to check whether the bot passes over a place it has already drawn. The Arduino code can be found here: https://github.com/aditisar/drawingbot/blob/master/drawingbot/drawingbot.ino
Fabrication details: We used Rhino to model the box and the arm. We used colorful acrylic, which gave it a toy-like appearance, and zip ties to hold the motor to the box. We used shaft collars on the hinge of the arm and screwed the arm onto the motor.
First iteration: We didn't have the sensor working or connected to anything – for our first demo, we just ran the motor back and forth for testing purposes to see how the robot would handle the moving arm. We ended up liking the spirograph-like patterns it created and decided to keep them for the second iteration. One of the big problems with the first iteration was our inability to test the robot without taking it apart, so we made some changes to make testing easier in the second iteration.
Second iteration: We attached the sensor, added an on/off switch, and cut a hole for the mini-USB cable that plugs into the Arduino. During the actual demo, we had battery issues that made the robot pause and restart at unpredictable intervals. We liked this child-like behavior and will work it into the final iteration.
Roles: Claire as Designer and Scribe, Miles as Integrator and Tutor
Introduction
feral (fîr'əl, fěr'-): Existing in a wild or untamed state, either naturally or having returned to such a state from domestication.
An exploration of the auditory ramifications of a feral chair.
Behold the feral chair. The once-domesticated office chair has sprung free from the grips of the mundane and returned to the wild. It cannot be redomesticated. If man attempts to mount the chair, its agony is audible to all.
Sound
Technical
We are using the accelerometer data from a LightBlue Bean to wirelessly map a series of sounds to its movement. These sounds are then played through a wireless speaker.
Technical Notes
We covered swimming goggles in frosted paint, then fitted half a ping-pong ball over each eye hole, with two RGB LEDs behind each side. The RGB input for these lights is controlled by an Arduino Uno, which determines where in the gradient between blue and pink the lights should sit, depending on the input to the wind sensor. Breathing on the wind sensor in the nose gradually turns the lights more pink; with no breath on the sensor, the lights return to blue.
Group Members: Lauren Valley, Claire Hentschker, Tian Jin
Roles: Claire Hentschker as the Scribe, Tian Jin as the Integrator, Lauren Valley as the Designer
Our Video
Introduction
A student, overworked and under-slept, attempts to study for her exams. Drowning in a sea of paper notes, the student tries to close her notebook to get some much-needed sleep. Unfortunately, the Blenderhelper will not allow that. The second the notebook closes, a blender turns on and begins shredding the student's notes. Startled, the student reopens the book and tries to keep studying. Every time the notebook is closed, the blender turns on and shreds more of the student's notes, forcing the student to study forever.
The Blenderhelper is a machine that is activated every time a notebook is closed, teaching a student that closing a notebook to stop studying will destroy everything they have worked for. For the students studying at CMU and beyond, there is an overarching mentality grounded in stress, encouraging students to spend every waking hour working, because the alternative is failure. This contraption both attempts to provide a practical solution to this problem (an incentive to work forever) and draws attention to the ridiculousness of this culture of stress.
Technical Notes
A 3.7 V LiPo battery powers a relay box that moves its armature to connect AC power when two pieces of copper on the sides of a notebook are pressed together. This sends AC power to a kitchen blender, turning it on.
Video and Image Documentation
This is a video of our circuit. Two pieces of copper, attached to conductive thread, are fastened to each side of a notebook. When these two pieces of copper are pushed together, the relay box sends AC power to the blender, turning it on.
In this image, we can see each component, and how it is connected.