Also, the Processing code didn't work for me at first. I've added the code I used at the end of this post, but it's also worth looking at the code in the Arduino sketch, since it shows a good way to draw a line graph.
// Example by Tom Igoe
import processing.serial.*;

Serial myPort;    // The serial port
PFont myFont;     // The display font (optional)
String inString;  // Input string from serial port
int lf = 10;      // ASCII linefeed

void setup() {
  size(600, 600);
  // You'll need to make this font with the Create Font Tool:
  //myFont = loadFont("ArialMS-18.vlw");
  //textFont(myFont, 18);

  // List all the available serial ports:
  printArray(Serial.list());
  // The first port in the serial list happens to be the right one
  // on my machine; open whatever port is the one you're using.
  myPort = new Serial(this, Serial.list()[0], 9600);
  // Buffer incoming bytes until a linefeed arrives,
  // so serialEvent() fires once per line:
  myPort.bufferUntil(lf);
}

void draw() {
  background(0);
  // get the ASCII string:
  if (inString != null) {
    // trim off any whitespace:
    inString = trim(inString);
    /* replace this code with code to draw your data
    String[] vals = inString.split(",");
    // convert each value to a float, flip y, and map to the screen:
    float y = -float(vals[0]);
    float x = float(vals[1]);
    x = map(x, -100, 100, 0, width);
    y = map(y, -100, 100, 0, height);
    // draw a small circle at the data point:
    stroke(127, 34, 255);
    ellipse(x, y, 10, 10);
    */
  }
}

void serialEvent(Serial p) {
  inString = p.readStringUntil(lf);
}
When the robots first interact with the environment, they have only a basic self-preservation instinct: do not leave the black rectangle. As they begin to explore their environment, they also make mistakes, such as crashing into objects, thereby changing or destroying their environment. Additionally, outside participants can induce stimuli by shining a flashlight on the robots. The robots can then choose whether to respond to this stimulus, much as humans choose how to respond to uncontrollable environmental stimuli. Over time, each robot develops a different set of behaviors based on its individual experiences in a particular simulation. However, both robots eventually converge on a set of behaviors in which they respect their environments.
[Embedded video: YouTube / arathorn593]
We built this project on a Pololu 3pi robot, using an ultrasound distance sensor and photoresistors as our sources of input, and programmed the robot to learn to maneuver around obstacles and potentially avoid bright lights. In the robot's design, we aimed for a playful, friendly look to create empathy between the spectators and the robots rather than making them seem like machines.
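The 3pi itself runs AVR C firmware, but the core of the reactive loop described above can be sketched language-neutrally. The version below is a hypothetical Java sketch of that decision logic; the threshold constants and sensor units are assumptions for illustration, not values from our actual code.

```java
public class AvoidSketch {
    // Hypothetical thresholds -- assumptions, not values from our firmware.
    static final int DIST_CM_MIN = 15;  // ultrasound: closer than this counts as an obstacle
    static final int LIGHT_MAX = 800;   // photoresistor: above this counts as "bright"

    enum Action { FORWARD, TURN_AWAY, BACK_OFF }

    // One step of the reactive loop: choose an action from the two sensor inputs.
    static Action decide(int distanceCm, int lightLevel) {
        if (distanceCm < DIST_CM_MIN) return Action.BACK_OFF;  // about to crash
        if (lightLevel > LIGHT_MAX)   return Action.TURN_AWAY; // flashlight stimulus
        return Action.FORWARD;
    }

    public static void main(String[] args) {
        System.out.println(decide(50, 200)); // clear path, dim light -> FORWARD
        System.out.println(decide(10, 200)); // obstacle ahead        -> BACK_OFF
        System.out.println(decide(50, 900)); // bright light shone    -> TURN_AWAY
    }
}
```

Obstacle avoidance takes priority over the light response here, which matches the idea that self-preservation comes before reacting to optional stimuli; the "learning" in the installation would adjust these responses over time rather than keep them fixed.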
Documentation:
[Embedded video: Vimeo / ivee]