Physical Computing, Carnegie Mellon University IDeATe
https://courses.ideate.cmu.edu/16-223/f2014

Tutorial: Android OSC Communication (Wed, 17 Dec 2014)
https://courses.ideate.cmu.edu/16-223/f2014/tutorial-android-osc-communication/

Introduction

Often an existing app can already send readily available sensor data from your phone to your computer via OSC. But what if another device connects to your phone and you want to use it as an extra sensor? You might not find an app that sends exactly the information you want; for example, you might want your phone to relay data from multiple Bluetooth devices to your computer (GitHub repository at the bottom). The following steps show how to write an Android app that lets you send whatever you want using OSC.

Note: The following tutorial assumes that you have already downloaded Android Studio, and have a basic project set up and running. For more information on how to use Android Studio, follow the tutorials in the training section of the Android developer’s website.

OSC (Open Sound Control) is a protocol for communication between devices. The specification can be found here: OSC specification.

In order to use OSC with Android, we need an implementation in Java. The one we will be using in this tutorial can be found here: JavaOSC.

Including the Library

First, create a new project with the default settings. In your project view, open the build.gradle file in the app folder. You want to add the following lines of code to it:

repositories {
    mavenCentral()
}

These lines tell Gradle that we will be pulling libraries from the Maven Central repository. To actually include the library, add the following line to the dependencies block:

compile 'com.illposed.osc:javaosc-core:0.3'

When you get a message telling you to sync your project, do so, and Android Studio will automatically download the library for you.

The final build.gradle file can be found here.

Using the Library

In the MainActivity.java file, insert these lines with the rest of your imports:

import java.net.*;
import java.util.*;

import com.illposed.osc.*;

Note: At this point, if you get an error about not finding “illposed,” you haven’t included the libraries properly yet.

Since networking and communication tasks can't run on the main thread of an Android app, we need to create a new thread, put all of the OSC code there, and start the thread in the onCreate() method. The example code can be found here.

Tutorial Github Repository

Github Repository for Kinecontrol Modular Music

Tutorial: Leap Motion to PureData (Tue, 16 Dec 2014)
https://courses.ideate.cmu.edu/16-223/f2014/tutorial-leap-motion-to-puredata/

Introduction

The Leap Motion is a sensor that reads hand gestures, providing spatial coordinates for each joint of each finger. If you're interested in using the Leap Motion with PureData to control sound through gesture, this tutorial provides a guide for doing so.

Leap Motion Setup: 

Follow the instructions here: https://www.leapmotion.com/setup. You don’t need to get the developer version if you are just using it with puredata, but it could be useful for other projects.

Mac users: 

There is a pd external, found here: http://puredatajapan.info/?page_id=1514, that reads the Leap Motion data and also provides some useful calculations beyond spatial coordinates (velocities, palm normals, etc.). It can be used in any of your own patches. Once it is compiled on your computer, copy the leapmotion-help.pd file, along with gesture.pd, hand.pd, and point.pd, into the same directory as your patch. Create a leapmotion-help object in your patch and open it up so it is running at the same time. You'll notice that the leapmotion-help patch uses the PureData send objects to send data about hands, gestures, tools, and general info. In your own patch, you can receive these messages using the receive object with the name of what you want. Print these messages to see how they are formatted, then use several route and unpack objects to pull out the exact values you want. Examples for getting the velocity of a hand, and for getting the "first" finger's velocity, are shown below.

lmexternal

Alternatives: 

I used the ManosOSC app available on the Leap Motion app store (https://apps.leapmotion.com/apps/manososc). This app streams OpenSoundControl data that pd can receive through a UDP connection, using the mrpeach library just like the tutorials on using smartphone apps to send OSC data. Just make sure to change the port to the same one the ManosOSC app is sending on; the default is 7110. The example below gets the xyz coordinates of the tips of the first two fingers the Leap Motion recognizes.

manososc

 

The actual pd file with the code in the screenshots can be found here: https://github.com/aditisar/oobleck/blob/master/leappdtutorial.pd

My final project used ManosOSC and tried to track a finger’s direction to control frequency/amplitude of a speaker. The code for that can be found here: https://github.com/aditisar/oobleck/blob/master/finalproject.pd

Tutorial: How to Solder Properly (Mon, 15 Dec 2014)
https://courses.ideate.cmu.edu/16-223/f2014/how-to-solder-properly/

Introduction

In Intro to Physical Computing, many projects require soldering. For those who don't know, soldering is the practice of joining two metal parts by melting a third material (solder) that bridges them together. Soldering is used in almost every project in the class; examples include joining wires to a circuit board, wires to a module, or wires to other wires. However, since this class has no prerequisites, it is not guaranteed that everyone knows how to solder, or how to solder properly. This tutorial shows a beginner how, by soldering two wires together.

Materials

The first thing is to make sure you have all the materials needed to solder. This includes:

Safety Goggles:

20141214_212506

Soldering Iron:

20141214_212329

60/40 Rosin Core Solder Spool:

20141214_212517

Fan:

20141214_224209

Third Hand Soldering Stand with Magnifying Glass & Sponge:

20141214_212456

Wire Stripper:

20141215_013240

 

Wires:

20141215_015121

Safety goggles are important: soldering involves melting metal at high temperature, so it is imperative to protect your eyes while working. Solder comes in various shapes, lengths, and sizes for different applications. It is very important that the solder you are using is rosin core solder; the rosin flux helps keep the metals from oxidizing, which tends to happen at high temperatures, and with oxidation it is almost impossible to solder properly. It is also ideal to use 60/40 solder, which is common for the kind of electronics work done in Physical Computing. The one I am using is rosin core solder from RadioShack®.

There are many soldering irons out there with different power requirements for different applications. The iron shown, from Weller®, is a 25 watt single-temperature soldering iron that heats up to 750 degrees Fahrenheit. This is typical of the power needed to melt solder for electronics and circuitry work, and although it only reaches a single temperature, it is effective at melting rosin core solder. Offered in class is the Hakko® FX-888D soldering station, which gives the user a wider range of temperatures, up to almost 900 degrees, although going that high isn't recommended.

hakko_fx888d_solder_station

 

The third hand station can be very useful when soldering. It can hold the soldering iron as well as up to two parts (wires or circuit boards) in its two alligator clips. (Note: be careful when using the alligator clips to hold a circuit board, as it is possible to damage the board.) The sponge at the bottom is needed to clean off the soldering iron after soldering so it can be used again, and the magnifying glass gives a close-up view for better accuracy. I got mine from RadioShack® as well. Some stations, such as the Hakko® used in Physical Computing, have flux built into the station; after soldering, the user can simply dip the soldering iron tip into the flux so it can be used again with less fear of the tip oxidizing. It is also recommended to have either a fume extractor (provided in Physical Computing) or a fan when soldering, since the fumes from melting solder can be dangerous if inhaled for extended periods of time.

Soldering Tip(s):

Almost all soldering irons have removable tips so the user can choose whichever tip best suits them. I personally use the round cone tip, but especially for beginners it is heavily recommended to use a flat head tip, which gives more surface area on the soldering iron while soldering.

20141214_212339

 

20141214_215904

Setting Up:

It is important to remember to wet the sponge on the soldering base. It's easy to forget, and placing the soldering iron on a dry sponge will burn the sponge and shorten its lifespan.

No:

20141214_220843

Yes!:

20141214_220941

 

The next step is to take the two wires we want to solder together and strip them using a wire stripper. This gives us more exposed wire to put solder on, creating a larger bond between the two wires.

Now simply place the wires into the two alligator clips of the third hand stand, one wire to each clip, and adjust the stand so the wires are touching each other. Before soldering, an important step is the process called tinning. Contrary to the name, this is the only time solder is actually placed on the soldering iron itself. To tin the iron, turn on your soldering iron (in this case, plug it in) and wait about 60 seconds for it to reach temperature. Then take some solder from your spool and gently rub it onto the soldering iron, just enough to coat the tip; you do not want globs of solder to form on the iron. Tinning reduces the chance of oxidation while soldering and helps make the soldering process quicker. Once this is done, place the soldering iron below the exposed section of the wires and the solder above. This creates a heat bridge: the soldering iron heats the wires, and the hot wires melt the solder onto them. Then move both the soldering iron and the solder slowly from left to right to completely cover the exposed section in solder. Use the magnifying glass for a close-up view if necessary. It should go as seen in the video below:

If done correctly, the result will look like this:

20141214_230026

As an added bonus, I also show how to solder on a circuit board. This circuit board is the Personal Portable Server prototype from my father's and my startup, Toyz Electronics. The video below demonstrates me (re)soldering the battery onto the board using the steps above. (For more information on the Personal Portable Server: https://www.youtube.com/watch?v=hbpiUyZQ3nM or go to tngl.biz.)

Final Product:

20141214_233414

Thank you:
Video recorded on my Wifi Smart Glasses, a product of Toyz Electronics

Tutorial: Pixy (CMUcam5) (Mon, 15 Dec 2014)
https://courses.ideate.cmu.edu/16-223/f2014/tutorial-pixy-cmucam5/

This tutorial covers the correct way to set up and receive information from a Pixy using an Arduino Uno. Note that cmucam.org also provides instructions for the Pixy in the Pixy Quick Start Guide on its wiki.

Set-Up

Powering the Pixy

Ideal:

Pixy can be powered through the provided Arduino cable: just plug in the cable and your Pixy should be good to go!

Alternatives:

If for some reason you decide to power the Pixy through the Vin and GND pins, be careful: these are not reverse-polarity protected, so you can kill the Pixy if you connect the pins backwards.

Other options for powering Pixy are detailed here.

Starting up the Pixy

Apply power to the Pixy to start it up. After this, one of two things will happen:

If the pixy is configured to “recognize” an object, it will flash red every time that object is recognized and the pixy will send this data to the Arduino.

If the pixy wasn’t configured to recognize any object it will just go into an idle state. If you want to set a new signature, see “Programming” the Pixy.

“Programming” the Pixy

Ideal:

Unless you’re trying to recognize extremely complex objects and patterns, I recommend just configuring the pixy without the use of a computer.

Once the Pixy is on, press and hold the button until the RGB LED starts blinking in different colors. Once the rapid color cycling stops, place the object that you want the Pixy to recognize right in front of the camera (the LED should be the color of the object at this point) and press the button (don't hold it down). If the LED stops flashing, the Pixy has recognized your object and is now configured to recognize it! If not, try again.

Alternative:

The Pixy can also be loaded with signatures to recognize using the CMUcam5 software, PixyMon. This method may or may not work properly, depending on the firmware installed on your Pixy. Open up PixyMon, plug in the Pixy, and highlight the object you want it to recognize. If you want to clear or set new signatures, you can do that all through PixyMon.

Configuring the Arduino

Once you have hooked up the pixy to the Arduino and configured the pixy to recognize an object, all you have to do is write the proper code to interpret the data from the pixy.

Install the Pixy Library

Download the pixy library from this link:

http://www.cmucam.org/attachments/download/1054/arduino_pixy-0.1.3.zip

In the Arduino IDE, import the newly downloaded library from Sketch->Import Library.

Writing Code

We will be using the pixy API to work with the data sent over from the pixy. The pixy API makes it extremely convenient to parse the data sent over from the pixy to the Arduino.

Data is received in “blocks,” where each block represents the properties of a detected object: how big it appears on screen, where it is, and other useful information.

Here’s a quick guide to get you started:

Start by including the headers for pixy and SPI libraries:

#include <SPI.h>

#include <Pixy.h>

Declare a pixy global like so:

Pixy pixy;

In your setup function, initialize the pixy global:

pixy.init();

In your loop function you’re going to want to poll the pixy for recognized objects like so:

  uint16_t blocks;  // number of objects the Pixy reports this frame
  uint32_t prod, maxProd = 0;
  int j, maxJ = 0;

  blocks = pixy.getBlocks();

  if (blocks) // if there were any recognized objects
  {
    // find the largest object that fits the signature
    for (j = 0; j < blocks; j++) {
      prod = (uint32_t)pixy.blocks[j].width * pixy.blocks[j].height;
      if (prod > maxProd) { // save the new largest object
        maxProd = prod;
        maxJ = j;
      }
    }

This sample code goes through all the “blocks” (which represent the objects that were found) and finds the largest block that fits the signature, computing each block's area as width times height. Blocks also hold more information that you can check; here's a full list:

  • pixy.blocks[i].signature
    • signature of the object, from 1 to 7
  • pixy.blocks[i].x
    • x coordinate of the object, from 0 to 319
  • pixy.blocks[i].y
    • y coordinate of the object, from 0 to 199
  • pixy.blocks[i].width
    • width of the object, from 1 to 320
  • pixy.blocks[i].height
    • height of the object, from 1 to 200
  • pixy.blocks[i].print()
    • prints the block's data to the serial port, which makes debugging easy
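
Putting the pieces above together, a minimal complete sketch might look like the following. (The Serial reporting at the end is only for illustration and is not part of the original tutorial; the variable names match the fragment above.)

#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

void setup()
{
  Serial.begin(9600);
  pixy.init();
}

void loop()
{
  uint16_t blocks = pixy.getBlocks(); // number of objects detected this frame

  if (blocks)
  {
    // find the largest detected object by on-screen area
    uint32_t maxProd = 0;
    int maxJ = 0;
    for (int j = 0; j < blocks; j++) {
      uint32_t prod = (uint32_t)pixy.blocks[j].width * pixy.blocks[j].height;
      if (prod > maxProd) {
        maxProd = prod;
        maxJ = j;
      }
    }
    // report the largest block's position and size over serial
    Serial.print("largest block at (");
    Serial.print(pixy.blocks[maxJ].x);
    Serial.print(", ");
    Serial.print(pixy.blocks[maxJ].y);
    Serial.print("), area ");
    Serial.println(maxProd);
  }
}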

If all went well, your Arduino should now be finding the largest object picked up by the Pixy. To look at the code that was written for Bull, follow this link:

https://github.com/kaandog/Bull/blob/master/Bull.ino

Tutorial: nrf24l01+ (Mon, 15 Dec 2014)
https://courses.ideate.cmu.edu/16-223/f2014/tutorial-nrf24l01/

Introduction

The nrf24l01+ is often the best solution for close-range communication between Arduino-based devices. The modules are extraordinarily cheap, do not require a wireless network, and any number of them can talk to each other at once. However, there are limitations to consider. They have a range of around 300 feet and should always be used well within that to ensure reliable communication. They can send more complex data, but there is no high-level API for doing so, so anything beyond sending numbers back and forth will take more time. If these limitations fit your project, however, the nrf24l01+ is a great option.
For more information about this component, here is a datasheet.
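
They can, however, carry arbitrary bytes, so one way to go beyond plain numbers is to pack your values into a struct and transmit its raw bytes. Below is a minimal sketch of that idea (the struct fields are hypothetical, the CE/CSN pins match the wiring described in the next section, and the receiving radio must read into an identical struct):

#include <SPI.h>
#include <RF24.h>

// Hypothetical payload; both radios must agree on this exact layout.
struct Packet {
  uint8_t  servoAngle; // 0-180
  uint16_t lightLevel; // raw analogRead value
};

RF24 radio(9, 10); // CE on pin 9, CSN on pin 10
const uint64_t pipes[2] = { 0xF0F0F0F0E1LL, 0xF0F0F0F0D2LL };

void setup(void)
{
  radio.begin();
  radio.openWritingPipe(pipes[0]);
  radio.openReadingPipe(1, pipes[1]);
  radio.setAutoAck(true);
  radio.powerUp();
}

void loop(void)
{
  Packet p;
  p.servoAngle = 90;
  p.lightLevel = analogRead(A0);
  radio.write(&p, sizeof(p)); // the receiver calls radio.read(&p, sizeof(p))
  delay(100);
}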

Set Up

To connect the nrf24l01+ to an Arduino, specific pins are required because the module talks to the Arduino over SPI. Fortunately, most of the SPI details are abstracted away in code, but the MOSI, SCK, and MISO pins must be hooked up to the corresponding SPI pins on your Arduino. Vcc and Gnd are connected to 3.3V and ground, CE and CSN can be attached to any pin (we will default to 9 and 10 respectively), and IRQ is not used when communicating with Arduinos. For this tutorial we will be using an Arduino Uno, so the pin numbers below are based on that; remember to look at your specific Arduino's pinout if you wish to use a different board.

Gnd-Gnd

Vcc-3.3v

CE-d9

CS-d10

SCK-d13

MOSI-d11

MISO-d12

You will also need to download and install the following library: https://github.com/jscrane/RF24
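
Once the library is installed, it can help to verify the wiring with a minimal sketch like the one below (a sketch of the idea, not one of the library's examples). It does nothing but initialize the radio and dump its configuration; if the printed register values are all 0x00 or 0xFF, the SPI wiring is usually wrong. Note that printf.h comes from the library's examples and must be copied next to the sketch.

#include <SPI.h>
#include <RF24.h>
#include "printf.h" // from the RF24 examples; routes printf to Serial

RF24 radio(9, 10); // CE on pin 9, CSN on pin 10, as wired above

void setup(void)
{
  Serial.begin(57600);
  printf_begin();
  radio.begin();
  radio.printDetails(); // dumps the radio's registers for a quick sanity check
}

void loop(void)
{
}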

 

 

Test it out

To test the setup, simply set up two Arduinos with nrf24l01+ modules and run the pingpair code found in the GitHub repository. You will need two computers to read the two serial lines; if everything is wired correctly, one board should be pinging and the other ponging. (One of the Arduinos should have pin 7 grounded to put it in the opposite role from the other.)

Go further

Included below is more sample code, created for a final project for Physical Computing. This code is easier to edit and has two parts. The ServoIn code waits for a number between 0 and 180 and sets a servo to that value, and also waits for a number between 1000 and 1180 and sets a second servo to that value minus 1000. The ServoOut code waits for a number to be written to its serial port and sends that number to the other Arduino. This allows remote control of an Arduino from a computer (which can be expanded to, for example, Python code controlling a remote Arduino).

Note: This code also requires printf.h, found in the pingpong example used earlier

ServoIn

#include <SPI.h>
#include <RF24.h>
#include "printf.h"
#include <Servo.h>
#define CE 9
#define CS 10
#define ROLE 7
#define BAUD 57600
RF24 radio(CE, CS);
const uint64_t pipes[2] = { 0xF0F0F0F0E1LL, 0xF0F0F0F0D2LL };
Servo myServo1;
Servo myServo2;
unsigned int leftLight=0;
unsigned int rightLight=0;
void setup(void)
{
  myServo1.attach(2);
  myServo1.write(0);
  myServo2.attach(3);
  myServo2.write(0);
  Serial.begin(BAUD);
  printf_begin();
  printf("ServoIn\n\r");
  radio.begin();
  radio.openWritingPipe(pipes[1]);
  radio.openReadingPipe(1,pipes[0]);
  radio.enableDynamicPayloads() ;
  radio.setAutoAck( true ) ;
  radio.powerUp() ;
  radio.startListening();
  radio.printDetails();
}
void loop(void)
{
  if ( radio.available() )
  {
    unsigned long got_time;
    bool done = false;
    while (!done)
    {
      done = radio.read( &got_time, sizeof(unsigned long) );
    }
    Serial.println(got_time,DEC);
    radio.stopListening();
    if(got_time<181)
    {
      if(got_time==0)
        myServo1.write(1);
      else if(got_time==180)
        myServo1.write(179);
      else
        myServo1.write(got_time);
      analogWrite(5,(got_time*255)/180); // also output the angle as a PWM duty cycle on pin 5
    }
    if(got_time>999 && got_time<1181)
    {
      if(got_time==1000)
        myServo2.write(1);
      else if(got_time==1180)
        myServo2.write(179);
      else
        myServo2.write(got_time-1000);
      analogWrite(6,((got_time-1000)*255)/180); // and the second angle on pin 6
    }
  radio.startListening();
  }
}
// vim:cin:ai:sts=2 sw=2 ft=cpp

ServoOut

#include <SPI.h>
#include <RF24.h>
#include "printf.h"
#define CE	9
#define CS	10
#define ROLE	7
#define BAUD	57600
RF24 radio(CE, CS);
unsigned long integerValue;
unsigned long incomingByte;
const uint64_t pipes[2] = { 0xF0F0F0F0E1LL, 0xF0F0F0F0D2LL };
void setup(void)
{
  Serial.begin(BAUD);
  printf_begin();
  radio.begin();
  radio.openWritingPipe(pipes[0]);
  radio.openReadingPipe(1,pipes[1]);
  radio.enableDynamicPayloads() ;
  radio.setAutoAck( true ) ;
  radio.powerUp() ;
  radio.startListening();
  radio.printDetails();
}
void loop(void)
{ 
    radio.stopListening();
    if (Serial.available() > 0) {   
    integerValue = 0;         
    while(1) {            
      incomingByte = Serial.read();
      if (incomingByte == '\n') break;   
      if (incomingByte == -1) continue;  
      integerValue *= 10;  
      integerValue = ((incomingByte - 48) + integerValue);
    }
    }
   radio.write( &integerValue, sizeof(unsigned long) );
}
// vim:cin:ai:sts=2 sw=2 ft=cpp

Final Project – Trio of Drawing Bots (Sat, 13 Dec 2014)
https://courses.ideate.cmu.edu/16-223/f2014/trio-of-drawing-bots/

Group: Claire Hentschker

Introduction

I wanted to create another iteration of the previous drawing robot, this time with three smaller robots that would react not only to their own lines but also to the lines created by the other robots. This interaction would augment the drawings based on the movement of all three robots in space and on how long they have been running: the more lines there are, the more frantic the drawing becomes.

Technical Notes

I used one gearhead motor, driven by the DRV8833 motor driver on the LightBlue Bean's PWM pins to control direction. This allowed me to wirelessly control the movement of the robots. I also used the QTR-1RC reflectance sensor to check whether the bot passes over a place it has already drawn. I used Rhino to model the box and the arm, and colorful acrylic for the parts. A shaft collar on the hinge of the arm allowed for the rotation, and a screw held the arm onto the motor.

Schematic and Code

IMG.png

int MB1 = 6; // motor b pin 1
int MB2 = 5; // motor b pin 2
int sensor1 = 2; // change "2" to whatever pin you are hooking the sensor to; just this one change allows you to test operation on other pins
int reflectance;
int arc_size = 10; // this should eventually come from Pure Data; 10 is our small arc size, 100 could be our max (both can be set via the Pure Data scale object, where 10 is the second-to-last argument and 100 is the last)

void setup() {
  Serial.begin(9600);
  pinMode(MB1, OUTPUT);
  pinMode(MB2, OUTPUT);
}

void loop() {
  reflectance = 1; // initialize value to 1 at the beginning of each loop
  pinMode(sensor1, OUTPUT); // set pin as output
  digitalWrite(sensor1, HIGH); // set pin HIGH (5V)
  delayMicroseconds(15); // charge capacitor for 15 microseconds

  pinMode(sensor1, INPUT); // set pin as input
  while((reflectance < 900) && (digitalRead(sensor1) != LOW)){ // timeout at 900
    // read the pin state, increment counter until state = LOW
    ++reflectance; // increment value to be displayed via serial port
    // delayMicroseconds(4); // change value or comment out to adjust value range
  }

  if (reflectance < 500){
    Serial.println(reflectance); // send reflectance value to serial display
  } else {
    Serial.println("T.O."); // if reflectance value is over 500 then it's a "timeout"
  }

  doForward(MB1, MB2); // motor B forward
  delay(arc_size);
  doStop(MB1, MB2);

  if (reflectance > 200) {
    doBackward(MB1, MB2); // motor B backward
    delay(arc_size);
    doStop(MB1, MB2);
  }
}

void doForward(int pin1, int pin2) {
  digitalWrite(pin2, LOW);
  digitalWrite(pin1, HIGH);
}

void doStop(int pin1, int pin2) {
  digitalWrite(pin2, LOW);
  digitalWrite(pin1, LOW);
}

void doBackward(int pin1, int pin2) {
  digitalWrite(pin2, HIGH);
  digitalWrite(pin1, LOW);
}



Final Project – Tech Tunnel Vision (Thu, 11 Dec 2014)
https://courses.ideate.cmu.edu/16-223/f2014/final-project-tech-tunnel-vision/

Introduction

Rachel Ciavarella & Joe Mallonee

People love technology, and people love hearing about how successful people got to where they are: a perfect inspirational match is found in successful technology leaders. Everyone's journey and path is different, but it can be incredibly tempting to try to follow the advice and direct life experiences of successful people. We wanted to embody this by playing with the relationship between the audience, the participant, and people who have achieved a perceived level of immense success. We saw Bill Gates, Elon Musk, and Mark Zuckerberg as modern-day oracles of tech nonsense, and coupled that with the tunnel vision Millennials have toward “tech.”

Our project was designed for a dark and quiet room. We planned to have a group of people walk in and see a singular, floating, telescopic helmet glaring directly into a wall. A lone participant would ascend the steps and peer into a future as told by Bill, Elon, and Mark. When they entered the helmet, their Newsfeed (a generic one for the purposes of our demonstration) would begin to scroll, illuminating the audience while the participant was unaware. We wanted the audience to be confused about whether something was supposed to be happening, to wonder what the singular person was seeing, and, in all honesty, to be underwhelmed and somewhat unwilling participants. For the individual in the helmet, we wanted them to feel confused as well, struggling in a humorous way to decipher what was being said by the three people. We aimed to place the speeches of famous tech figures in a different context so the viewer could notice the absurdity in their stories, at least in relation to the viewer's own life.

Technology 

We began the project by using facial recognition, based on OpenCV and run through Processing, to assess the viewer's focus: the more directly they studied the screen, the faster the Newsfeed outside would scroll, and the audio would begin to warp if the viewer was not intently centered on the screen. Due to time constraints we decided to simplify this part of the experience and used an IR sensor to detect the presence of the viewer in the helmet and begin playing the video. We added a second IR sensor to control the floor projection at the same time.
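
As a sketch of the simplified presence-detection approach (the pin, threshold, and serial protocol here are assumptions for illustration, not our exact code), the Arduino side can be as small as reading the IR sensor and reporting a present/absent flag that the Processing sketch listens for:

// Minimal presence-detection sketch: an analog IR distance sensor on A0
// (pin and threshold are assumptions) reports whether a viewer is in the
// helmet; a Processing sketch reads the serial flag and starts the video.

const int IR_PIN = A0;
const int THRESHOLD = 400; // tune for the actual sensor and mounting distance

void setup()
{
  Serial.begin(9600);
}

void loop()
{
  int reading = analogRead(IR_PIN);
  Serial.println(reading > THRESHOLD ? '1' : '0'); // '1' = viewer present
  delay(100);
}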

The helmet itself is constructed over a bike helmet with sheets of styrene. Because of its weight and the need to align it with the projection, we suspended the helmet from a series of ropes. We ended up using two computers, two Arduinos, and two Processing sketches. If we created a second iteration we would condense all of this into one computer, one Arduino, and one Processing sketch. We'd also build custom steps and find a more subtle, less visible way to support the helmet, and we would insist on the ideal room and conditions for the project.


Content

tech_tunnelvision_schem

 

IR_Helmet On_Off_Processing

 

Final Project Sketch – Tech Tunnel Vision (Thu, 11 Dec 2014)
https://courses.ideate.cmu.edu/16-223/f2014/final-project-sketch-tech-tunnel-vision/

sketch

Final Project – Columbina’s Companion (Thu, 11 Dec 2014)
https://courses.ideate.cmu.edu/16-223/f2014/final-project-columbinas-companion/

Group Members

Akiva Krauthamer – Tutor

Ruben Markowitz – Designer and Scribe

Bryan Gardiner – Integrator

Introduction

Most people in the world are fairly familiar with the concept of an actor: a person who stands on the stage and delivers a story in some form. The same group of people is typically familiar with the concept of a robot: a mechanical object that performs a task autonomously. In both cases, there is a certain set of rules regarding appearance, movement, tasks, shape, scale, and so on that is generally used to define these objects. In recent years the theatre community has begun to accept robots into the theatrical setting; however, these adaptations are rarely seamless. In many cases the actors act as we expect actors to act, and the robot behaves like a robot should. But what happens when we attempt to bring a robot into a theater as an actor? Can it still look like a robot? Can it act like a robot? Can the actors interact with it like a machine?

Columbina’s Companion was an experiment in how to seamlessly integrate a robot into a show. We attempted to merge certain aspects of a classical robot with certain aspects of a classical actor to create a true “robot actor”.

Several ideas for non-robotic form.

progress2

Video

Below is a video of Columbina’s Companion’s debut performance

 

Technical Notes

The base of the robot is a Brookstone Rover 2.0.

2014-12-08 19.56.30

This gave us easy mobility for the robot. The tank drive gave the robot the ability to carry weight and move in a more organic way. It also provided a wireless (WiFi) platform, allowing the robot to roam freely around the theater. The mobility of the robot is human controlled rather than autonomous. This is similar to an actor: the puppeteer (the director) can tell the robot (actor) where to go in the physical space.

2014-12-08 19.56.22

The arms of the robot are two 24″ bendy rulers attached to servos so they can bend independently and at will. These arms are one of the two main expressive components of the robot. They were also controlled wirelessly, Arduino to Arduino using nRF24L01+ radios, and were driven by the puppeteer. Future versions may make this autonomous in response to some sort of stimulus, similar to an actor developing his or her own emotional responses to the action on stage.

2014-12-08 18.18.41

The lights on the side of the robot are tied in with the arms, but may have similar autonomy in the future.

shell

The shell we created was also a significant design factor. We decided to make a geodesic dome out of paper for the shell. The many facets and faces of this shape, as well as its multidirectionality, created a mystique about the robot, and geodesic domes are not a common shape for traditional robots; it is about as far from traditional humanoid and sci-fi forms as you can get.

Wiring Diagram

diagram

CODE

All of the Arduino code was stock nRF24L01+ sending and receiving code for Arduino. The two boards communicated via a numerical value sent as a string: the numbers 0-180 specified the angle of one arm servo, and 1000-1180 specified the angle of the other. Stock nRF24L01+ example code can be found here.

The control was based on a piece of Python software called RoverPylot; the changes were made in the main script, ps3Rover.py. We changed the button configuration to listen to the Microsoft Xbox controller we had available, changed the code to send the values we wanted (the numeric strings) to the treads, and added code that mapped the controller's stick values to the servos via the Arduinos.

The final Python code (it requires OpenCV and PyGame) is here:

#!/usr/bin/env python

'''
whimsybotv2.py was edited by Ruben Markowitz, Bryan Gardiner, and Akiva Krauthamer.

ps3rover.py Drive the Brookstone Rover 2.0 via the P3 Controller, displaying
the streaming video using OpenCV.

Copyright (C) 2014 Simon D. Levy

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Lesser General Public License as
published by the Free Software Foundation, either version 3 of the
License, or (at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
'''

# You may want to adjust these buttons for your own controller
BUTTON_GDRIVE = 8      # Select button toggle G-Drive
BUTTON_QUIT = 9        # Start button quits
BUTTON_LIGHTS = 0      # Square button toggles lights
BUTTON_INFRARED = 2    # Circle button toggles infrared
BUTTON_CAMERA_UP = 3   # Triangle button raises camera
BUTTON_CAMERA_DOWN = 1 # X button lowers camera
SERIAL_PORT = 8        # Arduino Com Port

# Avoid button bounce by enforcing lag between button events
MIN_BUTTON_LAG_SEC = 0.5

# Avoid close-to-zero values on axis
MIN_AXIS_ABSVAL = 0.1

import rover
import cvutils
import time
import pygame
import sys
import signal
import serial

def _signal_handler(signal, frame):
    frame.f_locals['rover'].close()
    sys.exit(0)

serialSender = serial.Serial('COM5', 57600)

# Try to start OpenCV for video
try:
    import cv
except:
    cv = None

# Handler passed to Rover constructor
class PS3Rover(rover.Rover):

    """def processVideo(self, jpegbytes):

        try:
            if cv:
                image = cvutils.jpegbytes_to_cvimage(jpegbytes)
                wname = 'Rover 2.0'
                cv.NamedWindow(wname, cv.CV_WINDOW_AUTOSIZE)
                cv.ShowImage(wname, image)
                cv.WaitKey(5)
            else:
                pass
        except:
            pass
    """

# Converts Y coordinate of specified axis to +/-1 or 0
def _axis(index):

    value = -controller.get_axis(index)

    if value > MIN_AXIS_ABSVAL:
        return value
    elif value < -MIN_AXIS_ABSVAL:
        return value
    else:
        return 0

# Handles button bounce by waiting a specified time between button presses
"""def _checkButton(controller, lastButtonTime, flag, buttonID, \
        onRoutine=None, offRoutine=None):
    if controller.get_button(buttonID):
        if (time.time() - lastButtonTime) > MIN_BUTTON_LAG_SEC:
            lastButtonTime = time.time()
            if flag:
                if offRoutine:
                    offRoutine()
                flag = False
            else:
                if onRoutine:
                    onRoutine()
                flag = True
    return lastButtonTime, flag"""

# Set up controller using PyGame
pygame.display.init()
pygame.joystick.init()
controller = pygame.joystick.Joystick(0)
controller.init()

# Create a PS3 Rover object
rover = PS3Rover()

# Defaults on startup: lights off, ordinary camera
lightsAreOn = False
infraredIsOn = False

# Tracks button-press times for debouncing
lastButtonTime = 0

# Set up signal handler for CTRL-C
signal.signal(signal.SIGINT, _signal_handler)

# Loop till Quit hit
while True:

    # Force joystick polling
    pygame.event.pump()

    """ # Quit on Start button
    if controller.get_button(BUTTON_QUIT):
        break

    # Toggle lights
    lastButtonTime, lightsAreOn = \
        _checkButton(controller, lastButtonTime, \
        lightsAreOn, BUTTON_LIGHTS, rover.turnLightsOn, rover.turnLightsOff)

    # Toggle night vision (infrared camera)
    lastButtonTime, infraredIsOn = \
        _checkButton(controller, lastButtonTime, \
        infraredIsOn, BUTTON_INFRARED, rover.turnInfraredOn, rover.turnInfraredOff)

    # Move camera up/down
    if controller.get_button(BUTTON_CAMERA_UP):
        rover.moveCamera(1)
    elif controller.get_button(BUTTON_CAMERA_DOWN):
        rover.moveCamera(-1)
    else:
        rover.moveCamera(0)
    """

    # Set treads based on axes
    rover.setTreads(_axis(1), _axis(3))
    serialSender.write(str(int(abs(_axis(4))*180))+"\n")
    time.sleep(.005)
    serialSender.write(str(((180-int(abs(_axis(0))*180))+1000))+"\n")
    time.sleep(.005)

# Shut down Rover
rover.close()

The rest of the RoverPylot library can be downloaded from the above link.

The Send Arduino code used is as follows:

#include <SPI.h>
#include <RF24.h>
#include "printf.h"

//
// Hardware configuration: first MSP430, then ATMega
//
#if defined(ENERGIA)
#if defined(__MSP430FR5739__)
# define CE P1_2
# define CS P1_3
# define ROLE P2_5
#elif defined(__MSP430G2553__)
# define CE P2_1
# define CS P2_0
# define ROLE P2_2
//#elif defined(__LM4F120H5QR__)
//# define CE PA_6
//# define CS PB_5
//# define ROLE PA_5
#endif
# define BAUD 9600
#else
# define CE 9
# define CS 10
# define ROLE 7
# define BAUD 57600
#endif

RF24 radio(CE, CS);

unsigned long integerValue;
unsigned long incomingByte;

//
// Topology
//
// Radio pipe addresses for the 2 nodes to communicate.
const uint64_t pipes[2] = { 0xF0F0F0F0E1LL, 0xF0F0F0F0D2LL };

void setup(void)
{
  Serial.begin(BAUD);
  printf_begin();

  //
  // Setup and configure rf radio
  //
  radio.begin();

  // This simple sketch opens two pipes for these two nodes to communicate
  // back and forth.
  // Open 'our' pipe for writing
  // Open the 'other' pipe for reading, in position #1 (we can have up to 5 pipes open for reading)
  radio.openWritingPipe(pipes[0]);
  radio.openReadingPipe(1,pipes[1]);
  radio.enableDynamicPayloads() ;
  radio.setAutoAck( true ) ;
  radio.powerUp() ;
  radio.startListening();

  //
  // Dump the configuration of the rf unit for debugging
  //
  radio.printDetails();
}

void loop(void)
{
  //
  // Ping out role. Repeatedly send the current value
  //
  // First, stop listening so we can talk.
  radio.stopListening();

  if (Serial.available() > 0) {        // something came across serial
    integerValue = 0;                  // throw away previous integerValue
    while(1) {                         // force into a loop until '\n' is received
      incomingByte = Serial.read();
      if (incomingByte == '\n') break; // exit the while(1), we're done receiving
      if (incomingByte == -1) continue; // if no characters are in the buffer read() returns -1
      integerValue *= 10;              // shift left 1 decimal place
      // convert ASCII to integer, add, and shift left 1 decimal place
      integerValue = ((incomingByte - 48) + integerValue);
    }
  }

  radio.write( &integerValue, sizeof(unsigned long) );
}

// vim:cin:ai:sts=2 sw=2 ft=cpp

And the receiving Arduino code (the one on the robot itself):

#include <SPI.h>
#include <RF24.h>
#include "printf.h"
#include <Servo.h>

#if defined(ENERGIA)
#if defined(__MSP430FR5739__)
# define CE P1_2
# define CS P1_3
# define ROLE P2_5
#elif defined(__MSP430G2553__)
# define CE P2_1
# define CS P2_0
# define ROLE P2_2
//#elif defined(__LM4F120H5QR__)
//# define CE PA_6
//# define CS PB_5
//# define ROLE PA_5
#endif
# define BAUD 9600
#else
# define CE 9
# define CS 10
# define ROLE 7
# define BAUD 57600
#endif

RF24 radio(CE, CS);
const uint64_t pipes[2] = { 0xF0F0F0F0E1LL, 0xF0F0F0F0D2LL };

Servo myServo1;
Servo myServo2;
unsigned int leftLight=0;
unsigned int rightLight=0;

void setup(void)
{
  myServo1.attach(2);
  myServo1.write(0);
  myServo2.attach(3);
  myServo2.write(0);
  Serial.begin(BAUD);
  printf_begin();
  printf("RF24/examples/pingpair/\n\r");
  radio.begin();
  radio.openWritingPipe(pipes[1]);
  radio.openReadingPipe(1,pipes[0]);
  radio.enableDynamicPayloads() ;
  radio.setAutoAck( true ) ;
  radio.powerUp() ;
  radio.startListening();
  radio.printDetails();
}

void loop(void)
{
  if ( radio.available() )
  {
    // Dump the payloads until we've gotten everything
    unsigned long got_time;
    bool done = false;
    while (!done)
    {
      // Fetch the payload, and see if this was the last one.
      done = radio.read( &got_time, sizeof(unsigned long) );
    }
    Serial.println(got_time,DEC);

    // First, stop listening so we can talk
    radio.stopListening();

    // Values 0-180 drive the first arm servo; 1000-1180 drive the second.
    if(got_time<181)
    {
      if(got_time==0)
        myServo1.write(1);
      else if(got_time==180)
        myServo1.write(179);
      else
        myServo1.write(got_time);
      analogWrite(5,(got_time*255)/180);
    }
    if(got_time>999 && got_time<1181)
    {
      if(got_time==1000)
        myServo2.write(1);
      else if(got_time==1180)
        myServo2.write(179);
      else
        myServo2.write(got_time-1000);
      analogWrite(6,((got_time-1000)*255)/180);
    }

    // Now, resume listening so we catch the next packets.
    radio.startListening();
  }
}

// vim:cin:ai:sts=2 sw=2 ft=cpp

 

Final Project – UltraSonic (Thu, 11 Dec 2014)
https://courses.ideate.cmu.edu/16-223/f2014/final-project-ultrasonic/

Nkinde Ambalo, Horace Hou

Introduction

For our project, UltraSonic, we created a device that tries to quantify something we humans can't sense. Using a microphone and a PureData patch, we built a device that detects ultrasonic sound in the range of 18 kHz to 20 kHz and shifts it down to frequencies humans can perceive; the strength of the signal changes the volume of the sound that is output. Using this information we also wanted to map the variety and sources of ultrasonic waves in a person's environment.

Technical Aspects

Our project consisted of a Raspberry Pi running Pure Data. The patch watched for the plugged-in microphone's input level to rise above the usual background noise. When it did, it played sound through the connected output device, pitched down six octaves from the input. Six octaves is a factor of 2^6 = 64, so an 18-20 kHz tone comes out around 280-310 Hz, well within the range of hearing; this let anyone hear sounds that would otherwise be above their threshold of hearing. The Raspberry Pi needed an external USB sound card so that it could accept both a recording device and a speaker output. Other than power, no other cables were connected to the Raspberry Pi. The plan for this device also included a GPS module connected to the Pi's pins, or a Bluetooth adapter paired with a smartphone, to log the phone's location every time a high-frequency sound was detected.

Photos

20141210_232619

20141210_232534

20141210_232512

Video
