RM – Physical Computing – Carnegie Mellon University, IDeATe
https://courses.ideate.cmu.edu/16-223/f2014

Final Project – Columbina’s Companion
https://courses.ideate.cmu.edu/16-223/f2014/final-project-columbinas-companion/
Thu, 11 Dec 2014

Group Members

Akiva Krauthamer – Tutor

Ruben Markowitz – Designer and Scribe

Bryan Gardiner – Integrator

Introduction

Most people are familiar with the concept of an actor: a person who stands on a stage and delivers a story in some form. The same people are typically familiar with the concept of a robot: a mechanical object that performs a task autonomously. In both cases there is a set of conventions about appearance, movement, task, shape, and scale that defines the object. In recent years the theatre community has begun to accept robots into theatrical settings, but these adaptations are rarely seamless: the actors act as we expect actors to act, and the robot behaves as a robot should. What happens when we bring a robot into a theater as an actor? Can it still look like a robot? Can it act like a robot? Can the actors interact with it as a machine?

Columbina’s Companion was an experiment in how to seamlessly integrate a robot into a show. We attempted to merge certain aspects of a classical robot with certain aspects of a classical actor to create a true “robot actor”.

Several ideas for non-robotic form.

Video

Below is a video of Columbina’s Companion’s debut performance.

Technical Notes

The base of the robot is a Brookstone Rover 2.0.

The rover base gave the robot easy mobility. Its tank drive lets the robot carry weight and move in a more organic way, and its WiFi link lets it roam freely around the theater. The robot’s movement is human-controlled rather than autonomous, much like an actor: the puppeteer (the director) can tell the robot (the actor) where to go in the physical space.


The arms of the robot are two 24″ flexible (“bendy”) rulers, each attached to a servo so that the arms can bend at will and independently. They are one of the two main expressive components of the robot. The arms are also controlled wirelessly by the puppeteer, Arduino to Arduino over nRF24L01+ radio modules. Future versions may make this response autonomous, reacting to some sort of stimulus, much as an actor develops his or her own emotional responses to the action on stage.


The lights on the sides of the robot are tied in with the arms, but may gain similar autonomy in the future.


The shell we created was also a significant design factor: a geodesic dome made of paper. The many facets and faces of this shape, and its multidirectionality, give the robot a certain mystique. Geodesic domes are not a common shape for traditional robots; the form is about as far from the usual humanoid and sci-fi designs as you can get.

Wiring Diagram


Code

All of the Arduino code is stock nRF24L01+ example code for Arduino (sending and receiving). The two boards communicate with a single numeric value: 0-180 specifies the angle of one arm servo, and 1000-1180 specifies the angle of the other. Stock nRF24L01+ code can be found here.
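The offset scheme above can be sketched in a few lines of Python. The `encode`/`decode` helpers below are hypothetical (they do not appear in the project code); they just illustrate how one numeric channel carries commands for two servos.

```python
def encode(servo, angle):
    """Pack a servo index (0 or 1) and an angle (0-180) into one number.

    Servo 0 uses the raw value 0-180; servo 1 is offset by 1000,
    giving 1000-1180, so both share a single numeric channel.
    """
    if not 0 <= angle <= 180:
        raise ValueError("angle out of range: %d" % angle)
    return angle if servo == 0 else 1000 + angle


def decode(value):
    """Unpack a received number back into (servo index, angle)."""
    if 0 <= value <= 180:
        return 0, value
    if 1000 <= value <= 1180:
        return 1, value - 1000
    raise ValueError("value outside protocol range: %d" % value)
```

Round-tripping a command, e.g. `decode(encode(1, 90))`, returns `(1, 90)`.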

The control was based on a piece of Python software called RoverPylot; our changes are in the main script, “ps3Rover.py”. We changed the button configuration to listen to the Microsoft Xbox controller we had available, changed the code that sends drive values to the treads, and added code that maps controller axis values to the servo commands sent through the Arduinos.

The final Python code (it requires OpenCV and PyGame) is below:

#!/usr/bin/env python

'''
whimsybotv2.py was edited by Ruben Markowitz, Bryan Gardiner, and Akiva Krauthamer.

ps3rover.py Drive the Brookstone Rover 2.0 via the PS3 Controller, displaying
the streaming video using OpenCV.

Copyright (C) 2014 Simon D. Levy

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Lesser General Public License as
published by the Free Software Foundation, either version 3 of the
License, or (at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
'''

# You may want to adjust these buttons for your own controller
BUTTON_GDRIVE = 8       # Select button toggles G-Drive
BUTTON_QUIT = 9         # Start button quits
BUTTON_LIGHTS = 0       # Square button toggles lights
BUTTON_INFRARED = 2     # Circle button toggles infrared
BUTTON_CAMERA_UP = 3    # Triangle button raises camera
BUTTON_CAMERA_DOWN = 1  # X button lowers camera
SERIAL_PORT = 8         # Arduino COM port

# Avoid button bounce by enforcing lag between button events
MIN_BUTTON_LAG_SEC = 0.5

# Avoid close-to-zero values on axis
MIN_AXIS_ABSVAL = 0.1

import rover
import cvutils
import time
import pygame
import sys
import signal
import serial

def _signal_handler(signal, frame):
    frame.f_locals['rover'].close()
    sys.exit(0)

serialSender = serial.Serial('COM5', 57600)

# Try to start OpenCV for video
try:
    import cv
except:
    cv = None

# Handler passed to Rover constructor
class PS3Rover(rover.Rover):

    """def processVideo(self, jpegbytes):

        try:
            if cv:
                image = cvutils.jpegbytes_to_cvimage(jpegbytes)
                wname = 'Rover 2.0'
                cv.NamedWindow(wname, cv.CV_WINDOW_AUTOSIZE)
                cv.ShowImage(wname, image)
                cv.WaitKey(5)
            else:
                pass
        except:
            pass
    """

# Converts Y coordinate of specified axis to +/-1 or 0
def _axis(index):

    value = -controller.get_axis(index)

    if value > MIN_AXIS_ABSVAL:
        return value
    elif value < -MIN_AXIS_ABSVAL:
        return value
    else:
        return 0

# Handles button bounce by waiting a specified time between button presses
"""def _checkButton(controller, lastButtonTime, flag, buttonID, \
        onRoutine=None, offRoutine=None):
    if controller.get_button(buttonID):
        if (time.time() - lastButtonTime) > MIN_BUTTON_LAG_SEC:
            lastButtonTime = time.time()
            if flag:
                if offRoutine:
                    offRoutine()
                flag = False
            else:
                if onRoutine:
                    onRoutine()
                flag = True
    return lastButtonTime, flag"""

# Set up controller using PyGame
pygame.display.init()
pygame.joystick.init()
controller = pygame.joystick.Joystick(0)
controller.init()

# Create a PS3 Rover object
rover = PS3Rover()

# Defaults on startup: lights off, ordinary camera
lightsAreOn = False
infraredIsOn = False

# Tracks button-press times for debouncing
lastButtonTime = 0

# Set up signal handler for CTRL-C
signal.signal(signal.SIGINT, _signal_handler)

# Loop till Quit hit
while True:

    # Force joystick polling
    pygame.event.pump()

    """ # Quit on Start button
    if controller.get_button(BUTTON_QUIT):
        break

    # Toggle lights
    lastButtonTime, lightsAreOn = \
        _checkButton(controller, lastButtonTime, \
            lightsAreOn, BUTTON_LIGHTS, rover.turnLightsOn, rover.turnLightsOff)

    # Toggle night vision (infrared camera)
    lastButtonTime, infraredIsOn = \
        _checkButton(controller, lastButtonTime, \
            infraredIsOn, BUTTON_INFRARED, rover.turnInfraredOn, rover.turnInfraredOff)

    # Move camera up/down
    if controller.get_button(BUTTON_CAMERA_UP):
        rover.moveCamera(1)
    elif controller.get_button(BUTTON_CAMERA_DOWN):
        rover.moveCamera(-1)
    else:
        rover.moveCamera(0)
    """

    # Set treads based on axes
    rover.setTreads(_axis(1), _axis(3))

    # Send servo commands to the arm Arduino: 0-180 for one servo,
    # 1000-1180 for the other
    serialSender.write(str(int(abs(_axis(4)) * 180)) + "\n")
    time.sleep(.005)
    serialSender.write(str((180 - int(abs(_axis(0)) * 180)) + 1000) + "\n")
    time.sleep(.005)

# Shut down Rover
rover.close()

The rest of the RoverPylot library can be downloaded from the above link.

The sending Arduino code is as follows:

#include <SPI.h>
#include <RF24.h>
#include "printf.h"

//
// Hardware configuration: first MSP430, then ATMega
//

#if defined(ENERGIA)

#if defined(__MSP430FR5739__)
  #define CE P1_2
  #define CS P1_3
  #define ROLE P2_5
#elif defined(__MSP430G2553__)
  #define CE P2_1
  #define CS P2_0
  #define ROLE P2_2
//#elif defined(__LM4F120H5QR__)
//  #define CE PA_6
//  #define CS PB_5
//  #define ROLE PA_5
#endif

#define BAUD 9600

#else

#define CE 9
#define CS 10
#define ROLE 7
#define BAUD 57600

#endif

RF24 radio(CE, CS);

unsigned long integerValue;
unsigned long incomingByte;

//
// Topology
//

// Radio pipe addresses for the 2 nodes to communicate.
const uint64_t pipes[2] = { 0xF0F0F0F0E1LL, 0xF0F0F0F0D2LL };

void setup(void)
{
  Serial.begin(BAUD);
  printf_begin();

  //
  // Setup and configure rf radio
  //

  radio.begin();

  // This simple sketch opens two pipes for these two nodes to communicate
  // back and forth.
  // Open 'our' pipe for writing.
  // Open the 'other' pipe for reading, in position #1 (we can have up to
  // 5 pipes open for reading).
  radio.openWritingPipe(pipes[0]);
  radio.openReadingPipe(1, pipes[1]);

  radio.enableDynamicPayloads();
  radio.setAutoAck(true);
  radio.powerUp();
  radio.startListening();

  //
  // Dump the configuration of the rf unit for debugging
  //
  radio.printDetails();
}

void loop(void)
{
  // Repeatedly send the current value.
  // First, stop listening so we can talk.
  radio.stopListening();

  if (Serial.available() > 0) {  // something came across serial

    integerValue = 0;            // throw away previous integerValue

    while (1) {                  // force into a loop until '\n' is received

      incomingByte = Serial.read();

      if (incomingByte == '\n') break;   // exit the while(1), we're done receiving
      if (incomingByte == -1) continue;  // if no characters are in the buffer read() returns -1

      integerValue *= 10;  // shift left 1 decimal place

      // convert ASCII to integer, add, and shift left 1 decimal place
      integerValue = ((incomingByte - 48) + integerValue);
    }
  }

  radio.write(&integerValue, sizeof(unsigned long));
}

// vim:cin:ai:sts=2 sw=2 ft=cpp
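The digit-accumulation loop in the sketch above (value = value × 10 + (byte − '0'), stopping at the newline) can be mirrored in a few lines of Python. This `parse_ascii_int` helper is hypothetical, written here only to make the parsing logic easy to test on a desktop machine:

```python
def parse_ascii_int(data):
    """Mirror the Arduino loop: accumulate ASCII digits until '\\n'.

    `data` is a bytes object as it would arrive over serial;
    iterating it yields integer byte values.
    """
    value = 0
    for byte in data:
        if byte == ord('\n'):
            break  # terminator reached, done receiving
        value = value * 10 + (byte - ord('0'))  # shift left one decimal place
    return value
```

For example, the serial message `b"1090\n"` parses to 1090, which the receiver interprets as 90 degrees on the second servo.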

And the receiving Arduino code (the one on the robot itself):

#include <SPI.h>
#include <RF24.h>
#include "printf.h"
#include <Servo.h>

#if defined(ENERGIA)

#if defined(__MSP430FR5739__)
  #define CE P1_2
  #define CS P1_3
  #define ROLE P2_5
#elif defined(__MSP430G2553__)
  #define CE P2_1
  #define CS P2_0
  #define ROLE P2_2
//#elif defined(__LM4F120H5QR__)
//  #define CE PA_6
//  #define CS PB_5
//  #define ROLE PA_5
#endif

#define BAUD 9600

#else

#define CE 9
#define CS 10
#define ROLE 7
#define BAUD 57600

#endif

RF24 radio(CE, CS);

const uint64_t pipes[2] = { 0xF0F0F0F0E1LL, 0xF0F0F0F0D2LL };

Servo myServo1;
Servo myServo2;

unsigned int leftLight = 0;
unsigned int rightLight = 0;

void setup(void)
{
  myServo1.attach(2);
  myServo1.write(0);
  myServo2.attach(3);
  myServo2.write(0);

  Serial.begin(BAUD);
  printf_begin();
  printf("RF24/examples/pingpair/\n\r");

  radio.begin();
  radio.openWritingPipe(pipes[1]);
  radio.openReadingPipe(1, pipes[0]);
  radio.enableDynamicPayloads();
  radio.setAutoAck(true);
  radio.powerUp();
  radio.startListening();
  radio.printDetails();
}

void loop(void)
{
  if (radio.available())
  {
    // Dump the payloads until we've gotten everything
    unsigned long got_time;
    bool done = false;

    while (!done)
    {
      // Fetch the payload, and see if this was the last one.
      done = radio.read(&got_time, sizeof(unsigned long));
    }

    Serial.println(got_time, DEC);

    // First, stop listening so we can talk
    radio.stopListening();

    // Values 0-180 drive the first servo (and its light)
    if (got_time < 181)
    {
      if (got_time == 0)
        myServo1.write(1);
      else if (got_time == 180)
        myServo1.write(179);
      else
        myServo1.write(got_time);

      analogWrite(5, (got_time * 255) / 180);
    }

    // Values 1000-1180 drive the second servo (and its light)
    if (got_time > 999 && got_time < 1181)
    {
      if (got_time == 1000)
        myServo2.write(1);
      else if (got_time == 1180)
        myServo2.write(179);
      else
        myServo2.write(got_time - 1000);

      analogWrite(6, ((got_time - 1000) * 255) / 180);
    }

    // Now, resume listening so we catch the next packets.
    radio.startListening();
  }
}

// vim:cin:ai:sts=2 sw=2 ft=cpp

 

1A – Basic Circuits Project – The Anti-Social Robot
https://courses.ideate.cmu.edu/16-223/f2014/the-anti-social-robot/
Wed, 10 Sep 2014

The Team

Group Members: Annabelle Lee, Jeff Houng, Ruben Markowitz

Roles: Annabelle Lee as the Scribe, Jeff Houng as the Integrator, Ruben Markowitz as the Designer

 

The Video

 

Introduction

Technology is presently a huge driving force behind progress and human development. Yet despite all that it has helped us accomplish, it also plays another role in our lives: it engrosses us with an endless barrage of information, wanted or not. The Anti-Social Robot serves as a statement about the dark side born when human behavior meets technology.

As each of us embraces our niche in technology (video games, online shopping, videos, social media), we spend less time interacting with others in real life. Even as we send each other sociable fragments over the Internet, we are slowly distancing ourselves from one another.

The Anti-Social Robot mimics and translates this distancing into a literal and observable form. As you approach the little man engulfed in technology, he invariably drifts away from you, trapped alone in his little bubble.

 A rangefinding sensor triggers two motors to move the man away from you.
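The retreat behavior amounts to a simple distance threshold. In the actual build this is implemented in hardware (the IR sensor switching the motors through an NPN transistor), not in code; the Python sketch below, with a made-up threshold value, only models the logic.

```python
# Hypothetical trigger distance; the real cutoff is set by the analog circuit.
RETREAT_THRESHOLD_CM = 30

def should_retreat(distance_cm):
    """Return True when someone is close enough to trigger the motors."""
    return distance_cm < RETREAT_THRESHOLD_CM
```

A reading of 10 cm (someone approaching) turns the motors on; 100 cm (nobody nearby) leaves them off.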

Technical Notes

The electronic components of the robot consist mainly of a Sharp 2Y0A21 IR rangefinding sensor and a general-purpose NPN transistor. A 3.7 V LiPo battery powers the electronics and two 5 V DC motors with built-in 143:1 gearboxes, which increase torque. The body of the robot is 1/8″ Masonite for the structure and 1/8″ clear PMMA plastic for the display; these parts were laser cut for precision.
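The effect of the 143:1 gearbox is easy to work out: output speed drops by the ratio while torque rises by the same factor (ignoring friction losses). The motor figures below are made-up round numbers for illustration only; the 143:1 ratio is the one from the build.

```python
def geared_output(motor_rpm, motor_torque, ratio=143):
    """Idealized gearbox: speed divided by the ratio, torque multiplied by it."""
    return motor_rpm / float(ratio), motor_torque * ratio
```

For instance, a hypothetical motor spinning at 14300 RPM with 1 unit of torque would deliver 100 RPM and 143 units of torque at the gearbox output.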

 

The Image Documentation

This is a photo of our circuit as we were setting up a previous version. From the left, counterclockwise: the sensor (black box), the motors (two small metal cylinders), and the relay (small black rectangle). We did not end up using the relay after altering our concept.


 

Here, we test the sensors.


 

Ruben tinkering with the circuit.


 

We experimented with various motors before finding the right ones.


 

A picture of our circuit at the final stage. We changed the motors to more powerful ones because the weaker ones would not propel the robot. On the left, we also have our battery.


 

A drawing of the dimensions of the robot. On the left are the dimensions of the bottom carriage, in wood; the semicircles and the circle are cutouts for the wheels. On the right are the dimensions of the clear shell.

 

Pasted onto the technology shell are transparencies with various technologies we access most commonly today. On this side is Facebook and Amazon.


 

On this side is a video game and the Apple page.


 

And here, you see the man multitasking, juggling video games, Facebook, and Amazon.


 

Here, you see the man and a video of a car's steering wheel.


 

The initial concept was to have the robot swerve away from the approaching sociable friend. However, due to technological limitations, we altered our idea to having the robot run away from the friend instead. This change has also slightly altered the meaning of the concept.

