Final Project – Trio of Drawing Bots https://courses.ideate.cmu.edu/16-223/f2014/trio-of-drawing-bots/ Sat, 13 Dec 2014 12:10:34 +0000 http://courses.ideate.cmu.edu/physcomp/f14/16-223/?p=3504

Group: Claire Hentschker

Introduction

I wanted to create another iteration of the previous drawing robot, this time with three smaller robots that react not only to their own lines but also to the lines created by the other robots. This interaction augments the drawings based on the movement of all three robots in space and how long they have been running. The more lines there are, the more frantic the drawings become.

Technical Notes

I used one gearhead motor driven by a DRV8833 motor driver wired to the Light Blue Bean's PWM pins to control direction. This allowed me to control the movement of the robots wirelessly. I also used a QTR-1RC reflectance sensor to check whether the bot passes over a place it has already drawn. I modeled the box and the arm in Rhino and cut the parts from colorful acrylic. A shaft collar on the hinge of the arm allowed for rotation, and a screw held the arm onto the motor.

Schematic and Code


int MB1 = 6; // motor B pin 1
int MB2 = 5; // motor B pin 2
int sensor1 = 2; // change "2" to whichever pin the sensor is hooked to; this one change allows testing on other pins
int reflectance;
int arc_size = 10; // intended to come from Pure Data: 10 is the small arc size, 100 could be the max, both settable in the pd scale object

void setup() {
  pinMode(MB1, OUTPUT);
  pinMode(MB2, OUTPUT);
  Serial.begin(9600); // start serial once here rather than on every pass through loop()
}

void loop() {
  reflectance = 1; // reset the counter at the beginning of each loop
  pinMode(sensor1, OUTPUT); // set pin as output
  digitalWrite(sensor1, HIGH); // set pin HIGH (5V)
  delayMicroseconds(15); // charge the sensor's capacitor for 15 microseconds

  pinMode(sensor1, INPUT); // set pin as input
  while ((reflectance < 900) && (digitalRead(sensor1) != LOW)) { // time out at 900
    // read the pin state, increment counter until state = LOW
    ++reflectance;
    // delayMicroseconds(4); // change value or comment out to adjust the value range
  }

  if (reflectance < 500) {
    Serial.println(reflectance); // send reflectance value to the serial display
  } else {
    Serial.println("T.O."); // reflectance values of 500 or more count as a "timeout"
  }

  doForward(MB1, MB2); // motor B forward
  delay(arc_size);
  doStop(MB1, MB2);

  if (reflectance > 200) { // over an already-drawn line: back up
    doBackward(MB1, MB2); // motor B backward
    delay(arc_size);
    doStop(MB1, MB2);
  }
}
void doForward(int pin1, int pin2) {
  digitalWrite(pin2, LOW);
  digitalWrite(pin1, HIGH);
}

void doStop(int pin1, int pin2) {
  digitalWrite(pin2, LOW);
  digitalWrite(pin1, LOW);
}

void doBackward(int pin1, int pin2) {
  digitalWrite(pin2, HIGH);
  digitalWrite(pin1, LOW);
}



Final Project – Columbina’s Companion https://courses.ideate.cmu.edu/16-223/f2014/final-project-columbinas-companion/ Thu, 11 Dec 2014 07:29:39 +0000 http://courses.ideate.cmu.edu/physcomp/f14/16-223/?p=3424

Group Members

Akiva Krauthamer – Tutor

Ruben Markowitz – Designer and Scribe

Bryan Gardiner – Integrator

Introduction

Most people are fairly familiar with the concept of an actor: a person who stands on the stage and delivers a story in some form. The same people are typically familiar with the concept of a robot: a mechanical object that performs a task autonomously. In both cases there is a certain set of rules regarding appearance, movement, tasks, shape, scale, and so on that generally define these objects. In recent years the theatre community has begun to accept robots into the theatrical setting, but these adaptations are rarely seamless. In many cases the actors act as we expect actors to act, and the robot behaves like a robot should. But what happens when we attempt to bring a robot into a theater as an actor? Can it still look like a robot? Can it act like a robot? Can the actors interact with it like a machine?

Columbina’s Companion was an experiment in how to seamlessly integrate a robot into a show. We attempted to merge certain aspects of a classical robot with certain aspects of a classical actor to create a true “robot actor”.

Several ideas for non-robotic form.


Video

Below is a video of Columbina’s Companion’s debut performance.

 

Technical Notes

The base of the robot is a Brookstone Rover 2.0.


This gave the robot easy mobility. The tank drive let the robot carry weight and move in a more organic way. It also provided a wireless (WiFi) platform, allowing the robot to roam freely around the theater. The robot's mobility is human controlled rather than autonomous. This is similar to an actor: the puppeteer (the director) can tell the robot (the actor) where to go in the physical space.


The arms of the robot are two 24″ bendy rulers attached to servos so they can bend at will and independently. These arms are one of the two main expressive components of the robot. They are also controlled wirelessly, Arduino to Arduino using nRF24L01+ radios, and are driven by the puppeteer. Future versions may make this autonomous in response to some sort of stimulus, similar to an actor developing his or her own emotional responses to the action on stage.


The lights on the side of the robot are tied in with the arms, but may have similar autonomy in the future.


The shell we created was also a significant design factor. We decided to make a geodesic dome out of paper. The many facets and faces of this shape, as well as its multidirectionality, create a mystique about the robot, and geodesics are not a common shape for traditional robots; it is about as far from traditional humanoid and sci-fi forms as you can get.

Wiring Diagram


CODE

All of the Arduino code was stock nRF24L01+ sending and receiving code. The two boards communicated via a numerical string value: the numbers 0-180 specified a position on one arm servo, and 1000-1180 specified degrees on the other servo. Stock nRF24L01+ code can be found here.

The control was based on a piece of Python software called RoverPylot. The changes came in the main script, “ps3Rover.py”. We changed the button configuration to listen to the Microsoft Xbox controller we had available. We then changed the code to send the values we wanted (the numerical strings) to the treads, and added code that mapped the toggle button values to the servos over the Arduinos.

The final Python code (requires OpenCV and PyGame) is below:

#!/usr/bin/env python

'''
whimsybotv2.py was edited by Ruben Markowitz, Bryan Gardiner, and Akiva Krauthamer.

ps3rover.py Drive the Brookstone Rover 2.0 via the PS3 Controller, displaying
the streaming video using OpenCV.

Copyright (C) 2014 Simon D. Levy

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Lesser General Public License as
published by the Free Software Foundation, either version 3 of the
License, or (at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
'''

# You may want to adjust these buttons for your own controller
BUTTON_GDRIVE = 8       # Select button toggles G-Drive
BUTTON_QUIT = 9         # Start button quits
BUTTON_LIGHTS = 0       # Square button toggles lights
BUTTON_INFRARED = 2     # Circle button toggles infrared
BUTTON_CAMERA_UP = 3    # Triangle button raises camera
BUTTON_CAMERA_DOWN = 1  # X button lowers camera
SERIAL_PORT = 8         # Arduino COM port

# Avoid button bounce by enforcing lag between button events
MIN_BUTTON_LAG_SEC = 0.5

# Avoid close-to-zero values on axis
MIN_AXIS_ABSVAL = 0.1

import rover
import cvutils
import time
import pygame
import sys
import signal
import serial

def _signal_handler(signal, frame):
    frame.f_locals['rover'].close()
    sys.exit(0)

serialSender = serial.Serial('COM5', 57600)

# Try to start OpenCV for video
try:
    import cv
except:
    cv = None

# Handler passed to Rover constructor
class PS3Rover(rover.Rover):

    """def processVideo(self, jpegbytes):

        try:
            if cv:
                image = cvutils.jpegbytes_to_cvimage(jpegbytes)
                wname = 'Rover 2.0'
                cv.NamedWindow(wname, cv.CV_WINDOW_AUTOSIZE)
                cv.ShowImage(wname, image)
                cv.WaitKey(5)
            else:
                pass
        except:
            pass
    """

# Converts Y coordinate of specified axis to +/-1 or 0
def _axis(index):
    value = -controller.get_axis(index)
    if value > MIN_AXIS_ABSVAL:
        return value
    elif value < -MIN_AXIS_ABSVAL:
        return value
    else:
        return 0

# Handles button bounce by waiting a specified time between button presses
"""def _checkButton(controller, lastButtonTime, flag, buttonID,
                    onRoutine=None, offRoutine=None):
    if controller.get_button(buttonID):
        if (time.time() - lastButtonTime) > MIN_BUTTON_LAG_SEC:
            lastButtonTime = time.time()
            if flag:
                if offRoutine:
                    offRoutine()
                flag = False
            else:
                if onRoutine:
                    onRoutine()
                flag = True
    return lastButtonTime, flag"""

# Set up controller using PyGame
pygame.display.init()
pygame.joystick.init()
controller = pygame.joystick.Joystick(0)
controller.init()

# Create a PS3 Rover object
rover = PS3Rover()

# Defaults on startup: lights off, ordinary camera
lightsAreOn = False
infraredIsOn = False

# Tracks button-press times for debouncing
lastButtonTime = 0

# Set up signal handler for CTRL-C
signal.signal(signal.SIGINT, _signal_handler)

# Loop till Quit hit
while True:

    # Force joystick polling
    pygame.event.pump()

    """ # Quit on Start button
    if controller.get_button(BUTTON_QUIT):
        break

    # Toggle lights
    lastButtonTime, lightsAreOn = \
        _checkButton(controller, lastButtonTime,
            lightsAreOn, BUTTON_LIGHTS, rover.turnLightsOn, rover.turnLightsOff)

    # Toggle night vision (infrared camera)
    lastButtonTime, infraredIsOn = \
        _checkButton(controller, lastButtonTime,
            infraredIsOn, BUTTON_INFRARED, rover.turnInfraredOn, rover.turnInfraredOff)

    # Move camera up/down
    if controller.get_button(BUTTON_CAMERA_UP):
        rover.moveCamera(1)
    elif controller.get_button(BUTTON_CAMERA_DOWN):
        rover.moveCamera(-1)
    else:
        rover.moveCamera(0)
    """

    # Set treads based on axes
    rover.setTreads(_axis(1), _axis(3))

    # Send arm servo positions to the Arduino: 0-180 for one servo,
    # 1000-1180 for the other
    serialSender.write(str(int(abs(_axis(4)) * 180)) + "\n")
    time.sleep(.005)
    serialSender.write(str(((180 - int(abs(_axis(0)) * 180)) + 1000)) + "\n")
    time.sleep(.005)

# Shut down Rover
rover.close()

The rest of the RoverPylot library can be downloaded from the above link.

The sending Arduino code is as follows:

#include <SPI.h>
#include <RF24.h>
#include "printf.h"

//
// Hardware configuration: first MSP430, then ATMega
//

#if defined(ENERGIA)

#if defined(__MSP430FR5739__)
# define CE P1_2
# define CS P1_3
# define ROLE P2_5
#elif defined(__MSP430G2553__)
# define CE P2_1
# define CS P2_0
# define ROLE P2_2
//#elif defined(__LM4F120H5QR__)
//# define CE PA_6
//# define CS PB_5
//# define ROLE PA_5
#endif

# define BAUD 9600

#else

# define CE 9
# define CS 10
# define ROLE 7
# define BAUD 57600

#endif

RF24 radio(CE, CS);

unsigned long integerValue;
unsigned long incomingByte;

//
// Topology
//

// Radio pipe addresses for the 2 nodes to communicate.
const uint64_t pipes[2] = { 0xF0F0F0F0E1LL, 0xF0F0F0F0D2LL };

void setup(void)
{
  Serial.begin(BAUD);
  printf_begin();

  //
  // Setup and configure rf radio
  //

  radio.begin();

  // This simple sketch opens two pipes for these two nodes to communicate
  // back and forth. Open 'our' pipe for writing, and the 'other' pipe for
  // reading in position #1 (we can have up to 5 pipes open for reading).
  radio.openWritingPipe(pipes[0]);
  radio.openReadingPipe(1, pipes[1]);

  radio.enableDynamicPayloads();
  radio.setAutoAck(true);
  radio.powerUp();
  radio.startListening();

  // Dump the configuration of the rf unit for debugging
  radio.printDetails();
}

void loop(void)
{
  // First, stop listening so we can talk.
  radio.stopListening();

  if (Serial.available() > 0) { // something came across serial
    integerValue = 0; // throw away previous integerValue
    while (1) { // force into a loop until '\n' is received
      incomingByte = Serial.read();
      if (incomingByte == '\n') break; // exit the while(1), we're done receiving
      if (incomingByte == -1) continue; // if no characters are in the buffer read() returns -1
      integerValue *= 10; // shift left 1 decimal place
      // convert ASCII to integer, add, and shift left 1 decimal place
      integerValue = ((incomingByte - 48) + integerValue);
    }
  }

  radio.write(&integerValue, sizeof(unsigned long));
}

// vim:cin:ai:sts=2 sw=2 ft=cpp

And the receiving Arduino code (the one on the robot itself):

#include <SPI.h>
#include <RF24.h>
#include "printf.h"
#include <Servo.h>

#if defined(ENERGIA)

#if defined(__MSP430FR5739__)
# define CE P1_2
# define CS P1_3
# define ROLE P2_5
#elif defined(__MSP430G2553__)
# define CE P2_1
# define CS P2_0
# define ROLE P2_2
//#elif defined(__LM4F120H5QR__)
//# define CE PA_6
//# define CS PB_5
//# define ROLE PA_5
#endif

# define BAUD 9600

#else

# define CE 9
# define CS 10
# define ROLE 7
# define BAUD 57600

#endif

RF24 radio(CE, CS);

const uint64_t pipes[2] = { 0xF0F0F0F0E1LL, 0xF0F0F0F0D2LL };

Servo myServo1;
Servo myServo2;

unsigned int leftLight = 0;
unsigned int rightLight = 0;

void setup(void)
{
  myServo1.attach(2);
  myServo1.write(0);
  myServo2.attach(3);
  myServo2.write(0);

  Serial.begin(BAUD);
  printf_begin();
  printf("RF24/examples/pingpair/\n\r");

  radio.begin();
  radio.openWritingPipe(pipes[1]);
  radio.openReadingPipe(1, pipes[0]);
  radio.enableDynamicPayloads();
  radio.setAutoAck(true);
  radio.powerUp();
  radio.startListening();
  radio.printDetails();
}

void loop(void)
{
  if (radio.available())
  {
    // Dump the payloads until we've gotten everything
    unsigned long got_time;
    bool done = false;
    while (!done)
    {
      // Fetch the payload, and see if this was the last one.
      done = radio.read(&got_time, sizeof(unsigned long));
    }

    Serial.println(got_time, DEC);

    // First, stop listening so we can talk
    radio.stopListening();

    // Values 0-180 drive the first servo and its light
    if (got_time < 181)
    {
      if (got_time == 0)
        myServo1.write(1);
      else if (got_time == 180)
        myServo1.write(179);
      else
        myServo1.write(got_time);
      analogWrite(5, (got_time * 255) / 180);
    }

    // Values 1000-1180 drive the second servo and its light
    if (got_time > 999 && got_time < 1181)
    {
      if (got_time == 1000)
        myServo2.write(1);
      else if (got_time == 1180)
        myServo2.write(179);
      else
        myServo2.write(got_time - 1000);
      analogWrite(6, ((got_time - 1000) * 255) / 180);
    }

    // Now, resume listening so we catch the next packets.
    radio.startListening();
  }
}

// vim:cin:ai:sts=2 sw=2 ft=cpp

 

Final Project – UltraSonic https://courses.ideate.cmu.edu/16-223/f2014/final-project-ultrasonic/ Thu, 11 Dec 2014 05:03:24 +0000 http://courses.ideate.cmu.edu/physcomp/f14/16-223/?p=3411

Nkinde Ambalo, Horace Hou

Introduction

For our project, UltraSonic, we created a device that reveals a range of sound we humans can't sense. Using a microphone and a Pure Data patch, we built a device that detects ultrasonic sound waves in the range of 18 kHz to 20 kHz and transposes them down into the range humans can perceive. The strength of the signal also changes the volume of the output. We also wanted to use this information to map the variety and sources of ultrasonic waves in a person's environment.

Technical Aspects

Our project consisted of a Raspberry Pi running Pure Data. The patch watched for the plugged-in microphone's input level to rise above the usual background noise. When it did, it played the input through the connected output device, pitched down six octaves. This let anyone hear sounds that could have been above their threshold of hearing, as the high-frequency tone would now fall below 10,000 Hz. The Raspberry Pi needed an external USB sound card so that it could both accept a recording device and output to a speaker. Other than power, no other cables were connected to the Raspberry Pi. The plan also called for a GPS module connected to the Pi's pins, or a Bluetooth adapter connected to a smartphone, pinging the location of the phone every time high-frequency sound was detected.

Photos


Video

Final Project – Like Me Harder https://courses.ideate.cmu.edu/16-223/f2014/like-me-harder/ Thu, 11 Dec 2014 00:00:37 +0000 http://courses.ideate.cmu.edu/physcomp/f14/16-223/?p=3268

What we did

We created a vibrating dildo that stimulates the user based on how many new Facebook likes (s)he has. Here’s how it works:

  1. Attach the dildo’s cord to the computer’s USB port
  2. Insert the dildo into desired orifice
  3. Run the program (dildo.py) on the computer
  4. Enjoy one pulse of vibration for every new like
  5. Then receive 2 seconds of stimulation for each like
  6. Remove the device from the sweet spot
  7. Wait for the next batch of likes* 😉

*Popular ways to build up like count include: posting a good profile picture, adding more friends, posting better status updates, getting a photo with a celebrity, liking more items on your News Feed (friends will likely reciprocate).

Why Facebook likes? Why a dildo?  

What does it mean to be social? With the continuously increasing popularity of social media, people have the opportunity to be more connected than they ever have been. But does hyperconnectivity directly translate to better social relationships?

Sites like Facebook have created a platform for people to share details of their lives with other users. “Liking” another user’s posts is one of the ways to respond to the information that is “shared” with us. This form of feedback has been shown to be of great importance to some users of Facebook. The number of likes on a particular Facebook post is sometimes seen as an indicator of that post's success. After all, the number of likes is a quantitative measure of how “liked” that post is. This number can often evoke either gratification or disappointment from the publisher of the post. We found this phenomenon very interesting and thought-provoking, which is why we isolated it and drew the sexual connection.

Video

[Video coming soon]

How the magic happens

There are three major components that went into the making of our dildo: hardware, software, and fabrication.

Hardware:

The circuit consists of a Teensy 2.0 microcontroller, an N-channel MOSFET, an 11 V LiPo battery, and a DC motor with a small weight attached to one side of the rotating shaft. When this off-balanced motor is powered, it produces vibration to stimulate the genitalia.


Software:

There are two programs running simultaneously. On a computer, a Python script checks a Gmail account for unread Facebook like notifications. The Gmail account used in this code is set to receive all “like” notifications from a Facebook account. The script logs into this Gmail account, counts the number of unread notification emails, sends that number to the microcontroller in the dildo, and then marks all of these new notifications as “Read” so that no likes will be double counted. This script uses the PySerial module to communicate with the microcontroller.

import imaplib
import serial

user = 'dildie69@gmail.com'
password = '**********' #real password not shown
host = imaplib.IMAP4_SSL('imap.gmail.com','993')

ser = serial.Serial('/dev/tty.usbmodem12341', 9600, timeout=0.25)

#sign in to gmail
def enterGmail():
    host.login(user, password)

#check email for notifications and mark as read
def extractNotifs():
    host.select()
    emailList = host.search(None, 'UnSeen')[1][0].split()
    notifs = len(emailList)
    #mark all 'unseen' as 'seen'
    for email_id in emailList: #comment out for debugging
        markAsRead(email_id)
    return notifs


def markAsRead(email_id):
    host.store(email_id, '+FLAGS', '\\Seen') # '\Seen' is the IMAP system flag for a read message


#send number of notifs to dildo as a string
def sendToDildo(notifs):
    if ser:
        ser.write(notifs)
        ser.flush()
        ser.close()

def executeAll():
    enterGmail()
    notifs = extractNotifs()
    if notifs > 0:
        sendToDildo(str(notifs))

executeAll()

The above code communicates with the Teensy 2.0, which runs the following code to execute the stimulation:

const int motorPin = PIN_B0;

int motorState = LOW; //motor is off
int notifs = 0; //notifications

void setup(){
    // initialize motor pin as output
    pinMode(motorPin, OUTPUT);
    Serial.begin(9600);
}

void motorOn(){
    digitalWrite(motorPin, HIGH);
}

void motorOff(){
    digitalWrite(motorPin, LOW);
}

void pulseVibe(int pulses){ 
    //pulse once for each notification
    for (int curPulse = 0; curPulse < pulses; curPulse++){
        // define pulse as 0.3 seconds on then 0.6 seconds off
        motorOn();
        delay(300);
        motorOff();
        delay(600);
    }
}

void steadyOn(int n){
    int timePerNotif = 2000; // 2 seconds of stimulation per notif
    int stimTime = (timePerNotif*n);
    motorOn();
    delay(stimTime);
}

void stimulate(){
    pulseVibe(notifs); //pulse once for each notification
    steadyOn(notifs); //stimulate steadily for specified time
    motorOff(); 
}

void updateNotifs(){
    // setup communication with computer
    // the next 6 lines are modified from an online tutorial at:
    // https://github.com/PunchThrough/FacebookFlagger/blob/master/Facebook_Flagger_Sketch/Facebook_Flagger_Sketch.ino
    char buffer[64];
    size_t readLength = 64;
    uint8_t length = 0;
    length = Serial.readBytes(buffer, readLength);
    if (length > 0){
        for (int i=0; i<length; i++){
            if (buffer[i] != 0){
                int rawSignal = int(buffer[i]);
                int offset = int('0');
                int newNotifs = rawSignal - offset;
                notifs = notifs + newNotifs;
            }
        }
    }
}

void loop(){
    // Once dildo is connected to computer, check for new likes
    // and stimulate accordingly.
    // Notifications will reset after first loop (see dildo.py)
    updateNotifs(); 
    stimulate();
}

Fabrication:

We thought it would be appropriate to create the physical form in the shape of the Facebook Like thumbs-up icon. We extruded this icon in SolidWorks to make a 3D model, filleting the edges of the thumb to avoid any sharp corners. We translated that positive model into a negative shell that we 3D-printed and used as a mold for casting our first physical prototype.

We had to pay close attention to the organization of our circuitry so that it would fit properly inside the mold while leaving enough room for the rubber to surround it. Each component of our circuitry except the rechargeable battery needed to be encased so that it would not become saturated with rubber. For this reason, we created an encasement for the protoboard out of InstaMorph moldable plastic. We used the same material to house the battery’s terminals, and we made sure this housing was accessible from the outside of the object for easy recharging. We encased the vibrating motor on one end with a 3D-printed case that extends slightly into the thumb of the dildo. We also enclosed the shaft end of the motor in a cylindrical wooden case to keep rubber and circuitry from interfering with the rotating weight.

We used Poly PT Flex 50 RTV liquid rubber for the exterior of the dildo. We mixed both parts of the rubber solution, poured a thin layer into the mold, and let it set before carefully placing the circuitry inside. We then covered the rest of the circuitry and filled the mold with the liquid rubber. We let this set for several hours before extracting the prototype.

 

Final Project – Computer Vision Sampler https://courses.ideate.cmu.edu/16-223/f2014/cvsampler/ Wed, 10 Dec 2014 23:46:37 +0000 http://courses.ideate.cmu.edu/physcomp/f14/16-223/?p=3245

The Team
Kaitlin Schaer & Marc-Daniel Julien

Introduction
In our computer vision music sampler, we sought to create a physical interface that lets the user explore a song in a new way. Using computer vision software, a webcam, a projector, and an interactive play surface, our sampler allows the user to move through a piece of music and make it their own.

Piece by piece, the user places acrylic pieces on the play surface. As in a traditional sampler, clips of music play according to the objects’ positioning. However, instead of a typical rectilinear layout, we chose a polar system, both stage-like and reminiscent of a record player.

In this prototype we are working with nine song clips, which are distinguished by the color and geometry of our pieces. We chose to explore the song Come Together by the Beatles for its strong beats, bassline, and vocal hooks, as well as its recognizability.

As a finished product, we imagine that the user would be able to draw from a larger number of clips, from any number of different songs.

Video

Technical Notes

Software

Github Link

Construction

The physical aspects of the computer vision sampler were realized mostly through computer-aided design. Modeled in Rhino and laser cut from 1/4 in. MDF, the structure was designed with several things in mind: accessibility from multiple angles, the need to mount both a camera and a projector, and the need to easily adjust the positioning of that equipment. The shapes used to control the samples, and the labels noting the different instruments available, are colored acrylic plexiglass.

Additional Photos


Final Project – Whereband https://courses.ideate.cmu.edu/16-223/f2014/final-project-whereband/ Wed, 10 Dec 2014 22:53:03 +0000 http://courses.ideate.cmu.edu/physcomp/f14/16-223/?p=3296

Jeffrey Houng

Jesse Klein

Zac Mau

 

We developed the Whereband to help active individuals come together for a better exercise experience. The Whereband uses a Light Blue Bean to broadcast its GPS coordinates to a database; from there, the coordinates are transmitted to the nearest tracked phone. On the band itself are several servo motors that apply light pressure in the direction of the closest user. This gives runners, cyclists, and other athletes a physical sensation that guides them to other users, creating a more motivating and social exercise session.

 

Web Code:

django_project

 

Video:

 

Photos:


Autonomous Robot Part 3: Ghosty https://courses.ideate.cmu.edu/16-223/f2014/autonomous-robot-part-3-ghosty/ Wed, 10 Dec 2014 20:38:51 +0000 http://courses.ideate.cmu.edu/physcomp/f14/16-223/?p=3292

Introduction

Ghosty is an autonomous little robot who is adventurous and loves to explore his surroundings. However, unlike a normal ghost, he doesn’t like to scare people. Instead, they actually scare him! In the most current version of Ghosty, his vision is based on readings from two proximity sensors. The difference between the two readings determines whether the motors that control Ghosty’s wheels go forward or backward, and at what speed (a bigger difference means greater speed).

Technical Details


Ghosty was set up according to the circuit diagram. There are two proximity sensors, both powered by the 5-volt pin on the Arduino board. These send readings to the Arduino, which checks the difference between the values and turns the robot left or right if there is a significant difference between them. That difference causes a certain combination of high and low values to be sent to the motor pins, controlled through an integrated circuit that can operate two motors simultaneously.

 

This circuit diagram is a modified version of the diagram from the prior iteration.


Motor Control:

After reaching out to Dr. Ali, who worked on the prior iteration (Ghosty II), it was decided that the DRV8833 motor controller (datasheet: http://www.pololu.com/file/0J534/drv8833.pdf) would be best for driving Ghosty’s motors. Working with Dr. Ali, a program for the DRV8833 was written. The Arduino sends signals through the DRV8833, which can drive two motors simultaneously.

The motor driver reads two inputs per motor. Sending HIGH or LOW to the pin labeled BIN2 tells the connected motor to use slow decay (HIGH) or fast decay (LOW) while moving forward at a rate determined by the difference between the two proximity-sensor readings. BIN1 works the same way but drives the motor in reverse; the same conditions apply to pins AIN1 and AIN2. The sign of the value decides which input is used: positive if the proximity sensor connected to A1 reads an object farther away than the sensor connected to A0, and negative if A0 reads the farther object. A negative value moves the motor in reverse; a positive value moves it forward. The combination of the two motors determines which direction Ghosty goes. The motors themselves (carried over from Ghosty II), Solarbotics GM9 gear motors (http://www.pololu.com/product/188/resources), each required around 9 V to show any noticeable change, which in the current iteration was supplied by battery. Code from the prior iteration (Ghosty II) is attached: ArduinoDistance

 

Servo:

In the prior iteration (Ghosty II), the servos were driven from pins 9 and 10 on the Arduino, so that if a proximity sensor read that an object was close, Ghosty would move his arms (servos) upward. It was decided it would be more efficient to use these pins for the motors instead. The servos were originally powered by a 5 V step-up regulator. Code shown below: ArduinoDistanceWithServo

Pure Data:

Research was done into integrating Pure Data into this third edition of Ghosty using the sensors and Arduino; however, given the time constraints, it was decided not to. http://forum.pdpatchrepo.info/search/arduino-sensors

 

Photos


Video:

Final Project – Non-Newtonian Composition https://courses.ideate.cmu.edu/16-223/f2014/final-project-non-newtonian-composition/ Wed, 10 Dec 2014 17:04:43 +0000 http://courses.ideate.cmu.edu/physcomp/f14/16-223/?p=3249

Group members/Roles: Aditi Sarkar and Becca Epstein as Tutors, Integrators, Designers, and Scribes

Introduction

Our project explores the properties of non-Newtonian fluids, specifically oobleck. We created a theater and manipulated the stage to see the different forms and motions this created in the oobleck. Hand gestures vary the vibrations and act as a conductor for the performance. Ideally a mechanism for adding color based on gesture would have been included as well, but color was added manually to track the movement of the oobleck.

Video

Fabrication Notes

After experimenting with the oobleck at different frequencies, we found that it “danced” the most at very low frequencies (20-40Hz). We used a subwoofer speaker so that we could get the most movement output at these frequencies. We fastened a tin box lid to the mouth of the speaker with a single screw through the center of both. The outer structure was made with acrylic and wooden dowels, with a stocking stretched across the top to allow for colored powder to drift down.

Technical Notes

Our project used a subwoofer, an amplifier, and a Leap Motion sensor. We used the gesture data from the Leap Motion, routed through Pure Data, to control the frequency and amplitude of the speaker, and a Leap Motion app called ManosOSC (https://apps.leapmotion.com/apps/manososc) to get xyz coordinates for each finger joint. Only one finger controlled frequency and amplitude in this iteration, but ideally we would have richer control with more natural, conductor-like gestures. We actually found a Leap Motion pd external that calculates useful gesture data beyond spatial coordinates at http://puredatajapan.info/?page_id=1514, but didn’t have enough time to explore it. Our Pure Data file can be found here: https://github.com/aditisar/oobleck.


Future Iterations

A future iteration of this project could include several speakers, with oobleck dancing from stage to stage in response to more complex gestures from the conductor. We would like to explore the full capabilities of the Leap Motion: this version only used movement in the xy plane to dictate frequency and amplitude. Although we read pinching and slamming gestures, we didn’t get to map them to other parts of the performance. We would also like to add a controlled mechanism for releasing color into the oobleck.

Photos


 

Final Project Sketch – Clip on drawing bot https://courses.ideate.cmu.edu/16-223/f2014/final-project-3c-clip-on-drawing-bot-sketch/ Mon, 01 Dec 2014 22:30:40 +0000 http://courses.ideate.cmu.edu/physcomp/f14/16-223/?p=3174

Team: Claire Hentschker

Plan: For this project I plan to make a set of clip-on drawing apparatuses that activate whatever object they are attached to, turning anything into a drawing machine. Each set of clips will have a sensor and will react both to its own lines and to the lines created by other active drawing robots. The user can then experiment and create drawings by attaching the clamps to various objects and observing the effect different shapes and weights have on the marks created.

 


Final Project Sketch – Columbina’s Companion https://courses.ideate.cmu.edu/16-223/f2014/columbinascompanion/ Mon, 01 Dec 2014 18:41:02 +0000 http://courses.ideate.cmu.edu/physcomp/f14/16-223/?p=3162

Our actor robot works as follows: the base is a Rover 2.0 tank RC robot. The head is a helium balloon. The center arm can bend down when the motor at the base tightens a string running up the arm. Everything is controlled from a game controller connected to a laptop.

 

These are a few alternative design ideas that we’ve been toying with.

 
