Graded Projects – Physical Computing
Carnegie Mellon University, IDeATe
https://courses.ideate.cmu.edu/16-223/f2014

Final Project – Trio of Drawing Bots
https://courses.ideate.cmu.edu/16-223/f2014/trio-of-drawing-bots/
Sat, 13 Dec 2014

Group: Claire Hentschker

Introduction

I wanted to create another iteration of the previous drawing robot, this time with three smaller robots that would react not only to their own lines, but also to the lines created by the other robots. This interaction would augment the drawings, which emerge from the movement of all three robots in space and the length of time they have been running. The more lines there are, the more frantic the drawings become.

Technical Notes

I used one gearhead motor, driven by a DRV8833 motor driver connected to the LightBlue Bean's PWM pins to control direction. This allowed me to control the movement of the robots wirelessly. I also used a QTR-1RC reflectance sensor to check whether the bot passes over a place where it has already drawn. I used Rhino to model the box and the arm, and cut the parts from colorful acrylic. A shaft collar on the hinge of the arm allowed for the rotation, and a screw held the arm onto the motor.

Schematic and Code


int MB1 = 6;     // motor B pin 1
int MB2 = 5;     // motor B pin 2
int sensor1 = 2; // QTR-1RC sensor pin; change this to test operation on other pins
int reflectance;
int arc_size = 10; // arc duration; meant to come from Pure Data, where 10 is our small arc size and 100 our max (both set in the pd scale object)

void setup() {
  Serial.begin(9600);
  pinMode(MB1, OUTPUT);
  pinMode(MB2, OUTPUT);
}

void loop() {
  reflectance = 1;             // initialize value to 1 at the beginning of each loop
  pinMode(sensor1, OUTPUT);    // set pin as output
  digitalWrite(sensor1, HIGH); // set pin HIGH (5V)
  delayMicroseconds(15);       // charge capacitor for 15 microseconds

  pinMode(sensor1, INPUT);     // set pin as input
  while ((reflectance < 900) && (digitalRead(sensor1) != LOW)) { // time out at 900
    // read the pin state, increment counter until state = LOW
    ++reflectance;             // increment value to be displayed via serial port
    // delayMicroseconds(4);   // change value or comment out to adjust value range
  }

  if (reflectance < 500) {
    Serial.println(reflectance); // send reflectance value to serial display
  } else {
    Serial.println("T.O.");      // a reflectance value over 500 is a "timeout"
  }

  doForward(MB1, MB2); // motor B forward
  delay(arc_size);
  doStop(MB1, MB2);

  if (reflectance > 200) { // already-drawn line detected: back up
    doBackward(MB1, MB2);  // motor B backward
    delay(arc_size);
    doStop(MB1, MB2);
  }
}

void doForward(int pin1, int pin2) {
  digitalWrite(pin2, LOW);
  digitalWrite(pin1, HIGH);
}

void doStop(int pin1, int pin2) {
  digitalWrite(pin2, LOW);
  digitalWrite(pin1, LOW);
}

void doBackward(int pin1, int pin2) {
  digitalWrite(pin2, HIGH);
  digitalWrite(pin1, LOW);
}



Final Project – Tech Tunnel Vision
https://courses.ideate.cmu.edu/16-223/f2014/final-project-tech-tunnel-vision/
Thu, 11 Dec 2014

Introduction

Rachel Ciavarella & Joe Mallonee

People love technology, and people love hearing about how successful people got to where they are: a perfect inspirational match is found in successful technology leaders. Everyone's journey and path is different, but it can be incredibly tempting to try to follow the advice and direct life experiences of successful people. We wanted to embody this by playing with the relationship between the audience, the participant, and people who have achieved a perceived level of immense success. We saw Bill Gates, Elon Musk, and Mark Zuckerberg as modern-day oracles of tech nonsense, and coupled that with the tunnel vision Millennials have toward “tech.”

Our project was designed for a dark and quiet room. We planned to have a group of people walk in and see a singular, floating, telescopic helmet glaring directly into a wall. A lone participant would ascend the steps and peer into a future as told by Bill, Elon, and Mark. When they entered the helmet, their Newsfeed (a generic one for the purposes of our demonstration) would begin to scroll, illuminating the audience while the participant remained unaware. We wanted the audience to be confused about whether something was supposed to be happening, to wonder what the singular person was seeing, and, in all honesty, to be underwhelmed and somewhat unwilling participants. We wanted the individual in the helmet to feel confused as well, struggling, in a humorous way, to decipher what was being said by the three people. We aimed to place speeches of famous tech figures in a different context so viewers could experience the detectable absurdity in these stories, at least in relation to their own lives.

Technology 

We began the project by using facial recognition, based on OpenCV and run through Processing, to assess the viewer's focus: the more directly they studied the screen, the faster the Newsfeed outside would scroll, and the audio would begin to warp if the viewer was not intently centered on the screen. Due to time constraints we simplified this part of the experience and used an IR sensor to detect the presence of the viewer in the helmet and begin playing the video. We added a second IR sensor to control the floor projection at the same time.
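As a rough illustration of that presence-detection logic, here is a minimal host-side sketch in Python (our installation used Processing sketches instead); the serial port name and threshold are placeholder assumptions, and the Arduino is assumed to print one raw IR reading per line.

import serial

PORT = 'COM5'             # placeholder serial port for the Arduino
PRESENCE_THRESHOLD = 400  # placeholder raw IR value meaning "head in helmet"

arduino = serial.Serial(PORT, 9600)
playing = False

while True:
    reading = int(arduino.readline().strip())
    if reading > PRESENCE_THRESHOLD and not playing:
        playing = True
        print('viewer present: start scrolling the Newsfeed and play audio')
    elif reading <= PRESENCE_THRESHOLD and playing:
        playing = False
        print('viewer gone: stop playback')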

The helmet itself is constructed from sheets of styrene over a bike helmet. Because of its weight and the need to align it with the projection, we suspended the helmet from a series of ropes. We ended up using two computers, two Arduinos, and two Processing sketches; in a second iteration we would condense all of this into one computer, one Arduino, and one Processing sketch. We would also build custom steps, support the helmet in a less visible, more subtle way, and insist on the ideal room and conditions for our project.


Content

Schematic: tech_tunnelvision_schem

 

Code: IR_Helmet, On_Off_Processing

 

Final Project Sketch – Tech Tunnel Vision
https://courses.ideate.cmu.edu/16-223/f2014/final-project-sketch-tech-tunnel-vision/
Thu, 11 Dec 2014

sketch

Final Project – Columbina’s Companion
https://courses.ideate.cmu.edu/16-223/f2014/final-project-columbinas-companion/
Thu, 11 Dec 2014

Group Members

Akiva Krauthamer – Tutor

Ruben Markowitz – Designer and Scribe

Bryan Gardiner – Integrator

Introduction

Most people in the world are fairly familiar with the concept of an actor: a person who stands on the stage and delivers a story in some form. The same people are typically familiar with the concept of a robot: a mechanical object that performs a task autonomously. In both cases, there is a certain set of rules regarding appearance, movement, tasks, shape, scale, and so on that generally defines these objects. In recent years the theatre community has begun to accept robots into the theatrical setting; however, these integrations are rarely seamless. In many cases, the actors act as we expect actors to act, and the robot behaves like a robot should. But what happens when we attempt to bring a robot into a theater as an actor? Can it still look like a robot? Can it act like a robot? Can the actors interact with it like a machine?

Columbina’s Companion was an experiment in how to seamlessly integrate a robot into a show. We attempted to merge certain aspects of a classical robot with certain aspects of a classical actor to create a true “robot actor”.

Several ideas for non-robotic form.

Video

Below is a video of Columbina’s Companion’s debut performance.

 

Technical Notes

The base of the robot is a Brookstone Rover 2.0.

This gave the robot easy mobility. The tank drive gave it the ability to carry weight and move in a more organic way, and the wireless (WiFi) platform allowed the robot to roam freely around the theater. The mobility of the robot is human-controlled rather than autonomous. This is similar to an actor: the puppeteer (the director) can tell the robot (actor) where to go in the physical space.


The arms of the robot are two 24″ bendy rulers attached to servos so they can bend independently and at will. These arms are one of the two main expressive components of the robot. They were also controlled wirelessly by the puppeteer, Arduino to Arduino, using nRF24L01+ radios. Future versions may make this autonomous in response to some sort of stimulus, similar to an actor developing his or her own emotional responses to the action on stage.


The lights on the side of the robot are tied in with the arms, but may have similar autonomy in the future.


The shell we created was also a significant design factor. We decided to make a geodesic dome out of paper. The many facets and faces of this shape, as well as its multidirectionality, create a mystique about the robot, and geodesics are not a common shape for traditional robots; it is about as far from traditional humanoid and sci-fi forms as you can get.

Wiring Diagram


CODE

All of the Arduino code was stock nRF24L01 example code (sending and receiving). The boards communicated via a numerical value: 0-180 specified the angle of one arm servo, and 1000-1180 the angle of the other. Stock nRF24L01 code can be found here.
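A distilled sketch of that numbering scheme, with hypothetical helper names in Python for illustration: one integer stream carries both servos, with the second servo's angle offset by 1000.

def encode(servo, angle):
    # Pack a servo index (0 or 1) and an angle (0-180) into one integer.
    return angle if servo == 0 else 1000 + angle

def decode(value):
    # Unpack a received integer back into (servo, angle).
    if value <= 180:
        return 0, value
    return 1, value - 1000

assert decode(encode(1, 90)) == (1, 90)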

The control was based on a piece of Python software called RoverPylot. Our changes were in the main script, ps3Rover.py. We changed the button configuration to listen to the Microsoft Xbox controller we had available, changed the code to send the values we wanted (the numeric strings) to the treads, and added code that mapped the toggle-button values to the servos via the Arduinos.

The final Python code (requires OpenCV and PyGame) is below:

#!/usr/bin/env python

'''
whimsybotv2.py was edited by Ruben Markowitz, Bryan Gardiner, and Akiva Krauthamer.

ps3rover.py: Drive the Brookstone Rover 2.0 via the PS3 controller, displaying
the streaming video using OpenCV.

Copyright (C) 2014 Simon D. Levy

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Lesser General Public License as
published by the Free Software Foundation, either version 3 of the
License, or (at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
'''

# You may want to adjust these buttons for your own controller
BUTTON_GDRIVE = 8      # Select button toggles G-Drive
BUTTON_QUIT = 9        # Start button quits
BUTTON_LIGHTS = 0      # Square button toggles lights
BUTTON_INFRARED = 2    # Circle button toggles infrared
BUTTON_CAMERA_UP = 3   # Triangle button raises camera
BUTTON_CAMERA_DOWN = 1 # X button lowers camera
SERIAL_PORT = 8        # Arduino COM port

# Avoid button bounce by enforcing lag between button events
MIN_BUTTON_LAG_SEC = 0.5

# Avoid close-to-zero values on axis
MIN_AXIS_ABSVAL = 0.1

import rover
import cvutils
import time
import pygame
import sys
import signal
import serial

def _signal_handler(signal, frame):
    frame.f_locals['rover'].close()
    sys.exit(0)

serialSender = serial.Serial('COM5', 57600)

# Try to start OpenCV for video
try:
    import cv
except:
    cv = None

# Handler passed to Rover constructor
class PS3Rover(rover.Rover):

    """def processVideo(self, jpegbytes):

        try:
            if cv:
                image = cvutils.jpegbytes_to_cvimage(jpegbytes)
                wname = 'Rover 2.0'
                cv.NamedWindow(wname, cv.CV_WINDOW_AUTOSIZE)
                cv.ShowImage(wname, image)
                cv.WaitKey(5)
            else:
                pass
        except:
            pass
    """

# Converts Y coordinate of specified axis to +/-1 or 0
def _axis(index):

    value = -controller.get_axis(index)

    if value > MIN_AXIS_ABSVAL:
        return value
    elif value < -MIN_AXIS_ABSVAL:
        return value
    else:
        return 0

# Handles button bounce by waiting a specified time between button presses
"""def _checkButton(controller, lastButtonTime, flag, buttonID,
                 onRoutine=None, offRoutine=None):
    if controller.get_button(buttonID):
        if (time.time() - lastButtonTime) > MIN_BUTTON_LAG_SEC:
            lastButtonTime = time.time()
            if flag:
                if offRoutine:
                    offRoutine()
                flag = False
            else:
                if onRoutine:
                    onRoutine()
                flag = True
    return lastButtonTime, flag"""

# Set up controller using PyGame
pygame.display.init()
pygame.joystick.init()
controller = pygame.joystick.Joystick(0)
controller.init()

# Create a PS3 Rover object
rover = PS3Rover()

# Defaults on startup: lights off, ordinary camera
lightsAreOn = False
infraredIsOn = False

# Tracks button-press times for debouncing
lastButtonTime = 0

# Set up signal handler for CTRL-C
signal.signal(signal.SIGINT, _signal_handler)

# Loop till Quit hit
while True:

    # Force joystick polling
    pygame.event.pump()

    """ # Quit on Start button
    if controller.get_button(BUTTON_QUIT):
        break

    # Toggle lights
    lastButtonTime, lightsAreOn = \
        _checkButton(controller, lastButtonTime,
                     lightsAreOn, BUTTON_LIGHTS, rover.turnLightsOn, rover.turnLightsOff)

    # Toggle night vision (infrared camera)
    lastButtonTime, infraredIsOn = \
        _checkButton(controller, lastButtonTime,
                     infraredIsOn, BUTTON_INFRARED, rover.turnInfraredOn, rover.turnInfraredOff)

    # Move camera up/down
    if controller.get_button(BUTTON_CAMERA_UP):
        rover.moveCamera(1)
    elif controller.get_button(BUTTON_CAMERA_DOWN):
        rover.moveCamera(-1)
    else:
        rover.moveCamera(0)
    """

    # Set treads based on axes
    rover.setTreads(_axis(1), _axis(3))

    # Send servo angles to the Arduino: 0-180 for one servo, 1000-1180 for the other
    serialSender.write(str(int(abs(_axis(4)) * 180)) + "\n")
    time.sleep(.005)
    serialSender.write(str(((180 - int(abs(_axis(0)) * 180)) + 1000)) + "\n")
    time.sleep(.005)

# Shut down Rover
rover.close()

The rest of the RoverPylot library can be downloaded from the above link.

The sending Arduino code used is as follows:

#include <SPI.h>
#include <RF24.h>
#include "printf.h"

//
// Hardware configuration: first MSP430, then ATMega
//
#if defined(ENERGIA)

#if defined(__MSP430FR5739__)
# define CE P1_2
# define CS P1_3
# define ROLE P2_5
#elif defined(__MSP430G2553__)
# define CE P2_1
# define CS P2_0
# define ROLE P2_2
//#elif defined(__LM4F120H5QR__)
//# define CE PA_6
//# define CS PB_5
//# define ROLE PA_5
#endif

# define BAUD 9600

#else

# define CE 9
# define CS 10
# define ROLE 7
# define BAUD 57600

#endif

RF24 radio(CE, CS);

unsigned long integerValue;
unsigned long incomingByte;

//
// Topology
//

// Radio pipe addresses for the 2 nodes to communicate.
const uint64_t pipes[2] = { 0xF0F0F0F0E1LL, 0xF0F0F0F0D2LL };

void setup(void)
{
  Serial.begin(BAUD);
  printf_begin();

  //
  // Setup and configure rf radio
  //
  radio.begin();

  // This simple sketch opens two pipes for these two nodes to communicate
  // back and forth.
  // Open 'our' pipe for writing
  // Open the 'other' pipe for reading, in position #1 (we can have up to 5 pipes open for reading)
  radio.openWritingPipe(pipes[0]);
  radio.openReadingPipe(1, pipes[1]);

  radio.enableDynamicPayloads();
  radio.setAutoAck(true);
  radio.powerUp();
  radio.startListening();

  //
  // Dump the configuration of the rf unit for debugging
  //
  radio.printDetails();
}

void loop(void)
{
  // First, stop listening so we can talk.
  radio.stopListening();

  if (Serial.available() > 0) {   // something came across serial
    integerValue = 0;             // throw away previous integerValue
    while (1) {                   // force into a loop until '\n' is received
      incomingByte = Serial.read();
      if (incomingByte == '\n') break;  // exit the while(1), we're done receiving
      if (incomingByte == -1) continue; // if no characters are in the buffer read() returns -1
      integerValue *= 10;         // shift left 1 decimal place
      // convert ASCII to integer, add, and shift left 1 decimal place
      integerValue = ((incomingByte - 48) + integerValue);
    }
  }

  radio.write(&integerValue, sizeof(unsigned long));
}

// vim:cin:ai:sts=2 sw=2 ft=cpp

And the receiving Arduino code (the one on the robot itself):

#include <SPI.h>
#include <RF24.h>
#include "printf.h"
#include <Servo.h>

#if defined(ENERGIA)

#if defined(__MSP430FR5739__)
# define CE P1_2
# define CS P1_3
# define ROLE P2_5
#elif defined(__MSP430G2553__)
# define CE P2_1
# define CS P2_0
# define ROLE P2_2
//#elif defined(__LM4F120H5QR__)
//# define CE PA_6
//# define CS PB_5
//# define ROLE PA_5
#endif

# define BAUD 9600

#else

# define CE 9
# define CS 10
# define ROLE 7
# define BAUD 57600

#endif

RF24 radio(CE, CS);

const uint64_t pipes[2] = { 0xF0F0F0F0E1LL, 0xF0F0F0F0D2LL };

Servo myServo1;
Servo myServo2;

unsigned int leftLight = 0;
unsigned int rightLight = 0;

void setup(void)
{
  myServo1.attach(2);
  myServo1.write(0);
  myServo2.attach(3);
  myServo2.write(0);

  Serial.begin(BAUD);
  printf_begin();
  printf("RF24/examples/pingpair/\n\r");

  radio.begin();
  radio.openWritingPipe(pipes[1]);
  radio.openReadingPipe(1, pipes[0]);
  radio.enableDynamicPayloads();
  radio.setAutoAck(true);
  radio.powerUp();
  radio.startListening();
  radio.printDetails();
}

void loop(void)
{
  if (radio.available())
  {
    // Dump the payloads until we've gotten everything
    unsigned long got_time;
    bool done = false;
    while (!done)
    {
      // Fetch the payload, and see if this was the last one.
      done = radio.read(&got_time, sizeof(unsigned long));
    }
    Serial.println(got_time, DEC);

    // First, stop listening so we can talk
    radio.stopListening();

    // Values 0-180 drive the first servo; 1000-1180 drive the second.
    if (got_time < 181)
    {
      if (got_time == 0)
        myServo1.write(1);
      else if (got_time == 180)
        myServo1.write(179);
      else
        myServo1.write(got_time);
      analogWrite(5, (got_time * 255) / 180);
    }
    if (got_time > 999 && got_time < 1181)
    {
      if (got_time == 1000)
        myServo2.write(1);
      else if (got_time == 1180)
        myServo2.write(179);
      else
        myServo2.write(got_time - 1000);
      analogWrite(6, ((got_time - 1000) * 255) / 180);
    }

    // Now, resume listening so we catch the next packets.
    radio.startListening();
  }
}

// vim:cin:ai:sts=2 sw=2 ft=cpp

 

Final Project – UltraSonic
https://courses.ideate.cmu.edu/16-223/f2014/final-project-ultrasonic/
Thu, 11 Dec 2014

Nkinde Ambalo, Horace Hou

Introduction

For our project, UltraSonic, we created a device that tries to quantify a sense that we humans don't have. Using a microphone and a Pure Data patch, we built a device that can sense ultrasonic sound waves in the range of 18 kHz to 20 kHz and shift their frequency down into the range humans can perceive; the strength of the signal changes the volume of the sound output. Using this information, we also wanted to map the variety and sources of ultrasonic waves that exist in a person's environment.

Technical Aspects

Our project consisted of a Raspberry Pi running Pure Data. The patch watched for the plugged-in microphone's input level to rise above the usual background noise. When it did, it played sound through the connected output device, pitched down six octaves from the input. This allowed anyone to hear sounds that could have been above their threshold of hearing, as the shifted tone falls well below 10,000 Hz. The Raspberry Pi needed an external USB sound card so that it could accept both a recording device and output to a speaker. Other than power, no other cables were connected to the Raspberry Pi. We also planned to add a GPS module connected to the Pi's pins, or a Bluetooth link to a smartphone, pinging the phone's location every time a high-frequency sound was detected.
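The octave arithmetic behind that claim, as a quick sketch: dropping a tone one octave halves its frequency, so six octaves divides it by 2^6 = 64.

def pitch_down(freq_hz, octaves):
    # Each octave down halves the frequency.
    return freq_hz / 2 ** octaves

print(pitch_down(18000, 6))  # 281.25 Hz
print(pitch_down(20000, 6))  # 312.5 Hz; both are well within human hearing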

Photos


Video

Final Project – Acrobotic
https://courses.ideate.cmu.edu/16-223/f2014/acrobotic/
Thu, 11 Dec 2014

Group Members: Brian Yang and Luke Hottinger
Roles: Brian Yang and Luke Hottinger as Scribes, Designers, Integrators, and Tutors

Introduction

Acrobotic gives you the power of telekinesis: you control a full-size acrobatic robot using your own body movements. The robot tracks your movements using a Microsoft Kinect v2 and uses that data to dance and sway with you. With specific hand gestures and the sway of your body, you can control how Acrobotic moves. You can try to rotate it as fast as you can or try to balance it; it's your choice. The best part: you can do all of this without even touching the robot.

Our previous flipping robots had a very interesting behavior that was difficult to interact with: as they flipped, you were compelled to touch and play with them, but there was no real way to do so. With Acrobotic, you can interact with this intriguing motion without actually touching it.

Video

Technical Notes

Acrobotic is constructed out of beams of 80/20 extruded aluminum and stands roughly four and a half feet tall at rest.  A smaller robot is mounted on one of its arms and is able to rotate and flip independently of the rest of the structure.  The main structure’s rotation is controlled by a mass of plate steel that is able to move towards and away from the axis of rotation.  As the mass passes the center of balance, it is able to move in either direction.

The structure’s motion is controlled wirelessly from a computer with a Kinect sensor. The Kinect tracks the Y coordinates of the person’s shoulders and infers the level of the person’s sway. Holding up two fingers on your left hand extends the mass away from the center, while two fingers held up on the right hand pulls it in closer. Two Arduinos are required, one for each independent body of rotation, and the data is streamed from the Kinect through a computer to two Wixel modules, one for each axis of rotation.
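A minimal sketch of the sway-to-command mapping, assuming shoulder coordinates have already been pulled from the Kinect skeleton and that a Wixel appears as a serial port; all names and scale factors here are illustrative, not our production code.

import serial

wixel = serial.Serial('COM4', 9600)  # placeholder port for one Wixel module

def sway_level(left_shoulder_y, right_shoulder_y):
    # Infer lean from the height difference between the shoulders.
    return left_shoulder_y - right_shoulder_y

def send_sway(level, max_tilt=0.5):
    # Clamp the tilt and map it onto a 0-180 command, centered at 90.
    clamped = max(-1.0, min(1.0, level / max_tilt))
    wixel.write((str(int(90 + 90 * clamped)) + '\n').encode())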

Images


View Our Previous Iterations

Schematic and Code

Independent Rotating Robot

Physical Computing Autonomous Robot Schematic

Kinect Source Code

Independent Rotating Robot Arduino Source

Main Arm Arduino Source

Final Project – Mood Sweater
https://courses.ideate.cmu.edu/16-223/f2014/mood-sweater/
Thu, 11 Dec 2014

MOOD SWEATER

Group members: Alice Borie and Riya Savla

Introduction

Our mood sweater aims to visualize the wearer's emotions. We created this sweater because we realized that emotion is something that cannot always be truly seen: it can be easy to mask and hide our true emotions, and in other situations individuals may be unable to show emotion expressively because of physical impairments such as muscle degeneration. Our sweater addresses these problems by detecting emotion from biometrics and displaying it through color.

 

 

Technical Details

The sweater works from the inputs of a pulse sensor and a Galvanic Skin Response (GSR) sensor to detect different emotions. The GSR provides data about skin conductance and maps micro-perspiration levels to a more active or passive emotion. It works on the simple principle that the wearer's skin completes the path of the circuit: the voltage difference across the two ends of the GSR is measured, and values in different ranges correspond to different emotional states.

The Pulse Sensor, as the name implies, measures heart rate. We used example code available online to get BPM values and infer emotional state accordingly, and we used an Arduino UNO to process the data and control the LED display.

Currently, we are able to distinguish four different emotions – happiness, sadness, anger and a neutral state. Given more time, we would have liked to include temperature sensing of different parts of the body to create a heat map and widen the range of emotions the sweater can detect.

 

Making the product

We packaged the circuitry neatly, added a switch so the user has control over whether their emotions are on display, and stitched the circuit onto a jacket. The sensors (GSR and Pulse) come out through the jacket's right sleeve for the user to slip their fingers through, the LED display sits between the shoulder and the chest, and the main circuit sits in the jacket's pocket.

Arduino Code

//  VARIABLES
// Heart Rate
int pulsePin = 0;                 // Pulse Sensor purple wire connected to analog pin 0
int blinkPin = 13;                // pin to blink led at each beat
int fadePin = 5;                  // pin to do fancy classy fading blink at each beat
int fadeRate = 0;                 // used to fade LED on with PWM on fadePin
// GSR
int redPin = 9;
int greenPin = 10;
int bluePin = 6;
int potPin = 1;
int sensorPin = 2;
long red = 0xFF0000;
long green = 0x00FF00;
long blue = 0x000080;
long white = 0xFFFFFF;
int band = 20;
// these variables are volatile because they are used during the interrupt service routine!
volatile int BPM;                   // used to hold the pulse rate
volatile int Signal;                // holds the incoming raw data
volatile int IBI = 600;             // holds the time between beats, must be seeded!
volatile boolean Pulse = false;     // true when pulse wave is high, false when it's low
volatile boolean QS = false;        // becomes true when Arduino finds a beat.
volatile int rate[10];                    // array to hold last ten IBI values
volatile unsigned long sampleCounter = 0;          // used to determine pulse timing
volatile unsigned long lastBeatTime = 0;           // used to find IBI
volatile int P =512;                      // used to find peak in pulse wave, seeded
volatile int T = 512;                     // used to find trough in pulse wave, seeded
volatile int thresh = 525;                // used to find instant moment of heart beat, seeded
volatile int amp = 100;                   // used to hold amplitude of pulse waveform, seeded
volatile boolean firstBeat = true;        // used to seed rate array so we startup with reasonable BPM
volatile boolean secondBeat = false;      // used to seed rate array so we startup with reasonable BPM
void setup(){
  pinMode(blinkPin,OUTPUT);         // pin that will blink to your heartbeat!
  pinMode(fadePin,OUTPUT);          // pin that will fade to your heartbeat!
  Serial.begin(9600);             // we agree to talk fast!
  interruptSetup();                 // sets up to read Pulse Sensor signal every 2mS
   // UN-COMMENT THE NEXT LINE IF YOU ARE POWERING The Pulse Sensor AT LOW VOLTAGE,
   // AND APPLY THAT VOLTAGE TO THE A-REF PIN
//   analogReference(EXTERNAL);
  pinMode(potPin, INPUT);    // input from the potentiometer on A1
  pinMode(sensorPin, INPUT); // input from the skin
  pinMode(redPin, OUTPUT);   // output for the red LED
  pinMode(greenPin, OUTPUT); // output for the green LED
  pinMode(bluePin, OUTPUT);  // output for the blue (or transparent) LED
}
void loop(){
  Serial.println(BPM);
  int gsr = analogRead(sensorPin);
  int pot = analogRead(potPin);
  boolean GSRhigh = (gsr > pot + band);
  boolean angry = (BPM > 100);
  boolean sad = ((BPM > 80) && (BPM < 100) && !GSRhigh);
  boolean happy = (GSRhigh && (BPM > 80) && (BPM < 100));
  if (angry) // high heart rate reads as anger
  {
    Serial.println("angry");
    Serial.print("GSR = ");
    Serial.println(gsr);
    setColor(red);
  }
  /*
  else if (gsr < pot - band)
  // this condition, if true, indicates the need to adjust the resistance
  {
    setColor(blue);
  }*/
  else if (sad) // lower heart rate with low skin conductance reads as sadness
  {
    Serial.println("sad");
    Serial.print("GSR = ");
    Serial.println(gsr);
    setColor(blue);
  }
  else if (happy) // lower heart rate with high skin conductance reads as happiness
  {
    Serial.println("happy");
    Serial.println(gsr);
    setColor(green);
  }
  else // everything else reads as neutral
  {
    Serial.println("neutral");
    Serial.print("GSR = ");
    Serial.println(gsr);
    setColor(white);
  }
}
void setColor(long rgb) // sets the LED color from a packed 0xRRGGBB value
{
  int red = rgb >> 16;
  int green = (rgb >> 8) & 0xFF;
  int blue = rgb & 0xFF;
  analogWrite(redPin, 255 - red);   // inverted PWM duty cycle
  analogWrite(greenPin, 255 - green);
  analogWrite(bluePin, 255 - blue);
}
void interruptSetup(){
  // Initializes Timer2 to throw an interrupt every 2mS.
  TCCR2A = 0x02;     // DISABLE PWM ON DIGITAL PINS 3 AND 11, AND GO INTO CTC MODE
  TCCR2B = 0x06;     // DON’T FORCE COMPARE, 256 PRESCALER
  OCR2A = 0X7C;      // SET THE TOP OF THE COUNT TO 124 FOR 500Hz SAMPLE RATE
  TIMSK2 = 0x02;     // ENABLE INTERRUPT ON MATCH BETWEEN TIMER2 AND OCR2A
  sei();             // MAKE SURE GLOBAL INTERRUPTS ARE ENABLED
}
// THIS IS THE TIMER 2 INTERRUPT SERVICE ROUTINE.
// Timer 2 makes sure that we take a reading every 2 miliseconds
ISR(TIMER2_COMPA_vect){                         // triggered when Timer2 counts to 124
  cli();                                      // disable interrupts while we do this
  Signal = analogRead(pulsePin);              // read the Pulse Sensor
  sampleCounter += 2;                         // keep track of the time in mS with this variable
  int N = sampleCounter - lastBeatTime;       // monitor the time since the last beat to avoid noise
    //  find the peak and trough of the pulse wave
  if(Signal < thresh && N > (IBI/5)*3){       // avoid dichrotic noise by waiting 3/5 of last IBI
    if (Signal < T){                        // T is the trough
      T = Signal;                         // keep track of lowest point in pulse wave
    }
  }
  if(Signal > thresh && Signal > P){          // thresh condition helps avoid noise
    P = Signal;                             // P is the peak
  }                                        // keep track of highest point in pulse wave
  //  NOW IT’S TIME TO LOOK FOR THE HEART BEAT
  // signal surges up in value every time there is a pulse
  if (N > 250){                                   // avoid high frequency noise
    if ( (Signal > thresh) && (Pulse == false) && (N > (IBI/5)*3) ){
      Pulse = true;                               // set the Pulse flag when we think there is a pulse
      digitalWrite(blinkPin,HIGH);                // turn on pin 13 LED
      IBI = sampleCounter - lastBeatTime;         // measure time between beats in mS
      lastBeatTime = sampleCounter;               // keep track of time for next pulse
      if(secondBeat){                        // if this is the second beat, if secondBeat == TRUE
        secondBeat = false;                  // clear secondBeat flag
        for(int i=0; i<=9; i++){             // seed the running total to get a realistic BPM at startup
          rate[i] = IBI;
        }
      }
      if(firstBeat){                         // if it’s the first time we found a beat, if firstBeat == TRUE
        firstBeat = false;                   // clear firstBeat flag
        secondBeat = true;                   // set the second beat flag
        sei();                               // enable interrupts again
        return;                              // IBI value is unreliable so discard it
      }
      // keep a running total of the last 10 IBI values
      word runningTotal = 0;                  // clear the runningTotal variable
      for(int i=0; i<=8; i++){                // shift data in the rate array
        rate[i] = rate[i+1];                  // and drop the oldest IBI value
        runningTotal += rate[i];              // add up the 9 oldest IBI values
      }
      rate[9] = IBI;                          // add the latest IBI to the rate array
      runningTotal += rate[9];                // add the latest IBI to runningTotal
      runningTotal /= 10;                     // average the last 10 IBI values
      BPM = 60000/runningTotal;               // how many beats can fit into a minute? that’s BPM!
      QS = true;                              // set Quantified Self flag
      // QS FLAG IS NOT CLEARED INSIDE THIS ISR
    }
  }
  if (Signal < thresh && Pulse == true){   // when the values are going down, the beat is over
    digitalWrite(blinkPin,LOW);            // turn off pin 13 LED
    Pulse = false;                         // reset the Pulse flag so we can do it again
    amp = P - T;                           // get amplitude of the pulse wave
    thresh = amp/2 + T;                    // set thresh at 50% of the amplitude
    P = thresh;                            // reset these for next time
    T = thresh;
  }
  if (N > 2500){                           // if 2.5 seconds go by without a beat
    thresh = 512;                          // set thresh default
    P = 512;                               // set P default
    T = 512;                               // set T default
    lastBeatTime = sampleCounter;          // bring the lastBeatTime up to date
    firstBeat = true;                      // set these to avoid noise
    secondBeat = false;                    // when we get the heartbeat back
  }
  sei();                                   // enable interrupts when you're done!
}// end isr

 

 

Final Project – Kinecontrol Modular Music
https://courses.ideate.cmu.edu/16-223/f2014/final-project-kinecontrol-modular-music/
Thu, 11 Dec 2014

The Team:

Maggie Burke – Fabrication, LightBlue programming, performer

Daniel Hua – Android programmer extraordinaire, camera man

The Project:

The Kinecontrol is a modular music device intended for use in dance and other live performance. The performer wears an Android phone in a standard theatrical mic belt, and between one and four bands, each containing a LightBlue Bean, around their wrists and ankles. The Kinecontrol uses the accelerometer and gyroscope built into the phone, and the accelerometer built into each LightBlue Bean, to control music in MAX/MSP. Unlike similar music-control systems for dance, the Kinecontrol is entirely wireless, can be scaled to a larger or smaller number of controllers, and uses devices performance companies either already own or can easily obtain for under $35.

The Tech:

The Kinecontrol wrist and ankle bands each contain a LightBlue Bean that communicates its accelerometer data to the Android phone worn at the waist. The phone then sends strings of data from its own accelerometer and the Bluetooth accessories' accelerometers to the computer running MAX/MSP over the Open Sound Control (OSC) wireless protocol. Daniel Hua created the custom app that runs on the Android phone to manage the accelerometer data.
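A minimal sketch of the OSC side of that link, with the python-osc library standing in for the custom Android app; the host, port, and address names are assumptions for illustration.

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient('192.168.1.10', 8000)  # placeholder MAX/MSP host and port

def send_accelerometer(source, x, y, z):
    # Forward one accelerometer sample, tagged by the device it came from.
    client.send_message('/kinecontrol/' + source, [x, y, z])

send_accelerometer('phone', 0.02, -0.98, 0.11)
send_accelerometer('bean1', 0.40, 0.10, -0.91)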

Kinecontrol Video

Original Concept Sketch

Right Wristband

Left Wristband


Wristbands also fit ankles


Wristband outside with LightBlue Bean

 

Kinecontrol Photos

Wristband inside with LightBlue Bean

 


Mic Belt holding Android phone


Proof of concept – Android phone data to MAX/MSP


Final MAX/MSP Program


 

Final Project – Mobile Distress
https://courses.ideate.cmu.edu/16-223/f2014/mobile-distress/
Thu, 11 Dec 2014

Mobile Distress is a machine for cosmetic fracture design. It imprints devices with semi-random fracture patterns – permanent visual histories – in a controlled and safe manner. This process introduces the aesthetics of distress and decay to consumer electronics, and in doing so resists narratives of progress and newness embodied in product design.


The base of the machine is a Pioneer B20FU20-51FW 8″ full-range driver, powered by a SURE AA-AB009 amp board; the driver's coil moves a threaded rod with an attached Dremel bit. These parts are housed in a custom acrylic structure. The bit rests above the phone, which the user manually positions to produce the desired fracturing.
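One way to produce a drive signal for the coil is simply to play a low-frequency tone through the amp; the sketch below writes such a tone to a WAV file using only the Python standard library, with the frequency chosen as a placeholder rather than the project's actual setting.

import math, struct, wave

RATE = 44100
FREQ = 40      # placeholder drive frequency in Hz
SECONDS = 5

with wave.open('drive_tone.wav', 'w') as w:
    w.setnchannels(1)
    w.setsampwidth(2)  # 16-bit samples
    w.setframerate(RATE)
    for i in range(RATE * SECONDS):
        sample = int(32767 * math.sin(2 * math.pi * FREQ * i / RATE))
        w.writeframesraw(struct.pack('<h', sample))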


Final Project – RF Sensor
https://courses.ideate.cmu.edu/16-223/f2014/rf-sensor/
Thu, 11 Dec 2014

 

Team:

Sara Johnson as designer, integrator, and tutor

Annabelle Lee as scribe, designer, and integrator

 

Introduction:

There are thousands of invisible cell phone signals bouncing around us. Our team set out to visualize and experience the invisible data flying through the air that consumes our attention. We narrowed our focus to the RF signals emitted by cellphones and created a light display that fades and flickers in response to the background noise created by our constant use of cellphones, and that dramatically flashes and changes color when a phone sends or receives data nearby.

 

 

Technical Specs:

We used a 16 cm antenna to sense the RF signals emitted by phones, which transmit at a frequency of about 1800 MHz (standard GSM, Global System for Mobile Communications). 1 cm of the antenna wire was stripped and exposed to collect the signal. It can detect any cell phone activity, such as incoming or outgoing calls, texting, and data usage. The sensitivity is adjusted with a potentiometer.

 

The sensed signal is amplified through an op-amp, and the sensitivity is adjusted with the potentiometer so that the circuit ignores background noise and reacts to large packets of data. The signal is then read on the Arduino's analog input, and its frequency is determined using Arduino timer interrupts.

 

The two analog LED strips are powered by 12 V and switched by six N-channel MOSFETs, one for each color channel.

 

Circuit Schematic:


http://www.jerome-bernard.com/blog/2013/01/12/rgb-led-strip-controlled-by-an-arduino/

 

Pictures:

 


 

Arduino Code:

The below code uses Amanda Ghassaei’s “sine wave freq detection with 38.5kHz sampling rate and interrupts” code available here:

http://www.instructables.com/id/Arduino-Frequency-Detection/

 

 

//clipping indicator variables
boolean clipping = 0;

//data storage variables
byte newData = 0;
byte prevData = 0;

//freq variables
unsigned int timer = 0; //counts period of wave
unsigned int period;
int frequency;

void setup(){

  Serial.begin(9600);

  pinMode(13, OUTPUT); //led indicator pin

  cli(); //disable interrupts

  //set up continuous sampling of analog pin 0

  //clear ADCSRA and ADCSRB registers
  ADCSRA = 0;
  ADCSRB = 0;

  ADMUX |= (1 << REFS0); //set reference voltage
  ADMUX |= (1 << ADLAR); //left align the ADC value so we can read the highest 8 bits from the ADCH register only

  ADCSRA |= (1 << ADPS2) | (1 << ADPS0); //set ADC clock with 32 prescaler: 16MHz/32 = 500kHz
  ADCSRA |= (1 << ADATE); //enable auto trigger
  ADCSRA |= (1 << ADIE);  //enable interrupts when measurement complete
  ADCSRA |= (1 << ADEN);  //enable ADC
  ADCSRA |= (1 << ADSC);  //start ADC measurements

  sei(); //enable interrupts
}

ISR(ADC_vect) { //when new ADC value ready

  prevData = newData; //store previous value
  newData = ADCH;     //get value from A0

  if (prevData < 127 && newData >= 127){ //if increasing and crossing midpoint
    period = timer; //get period
    timer = 0;      //reset timer
  }

  if (newData == 0 || newData == 1023){ //if clipping
    PORTB |= B00100000; //set pin 13 high: turn on clipping indicator led
    clipping = 1;       //currently clipping
  }

  timer++; //increment timer at rate of 38.5kHz
}

void loop(){

  if (clipping){ //if currently clipping
    PORTB &= B11011111; //turn off clipping indicator led
    clipping = 0;
  }

  frequency = 38462/period; //timer rate/period

  //print results
  Serial.print(frequency);
  Serial.println(" hz");

  delay(100);
}

 

 

Antenna:

You can switch out antenna lengths according to which frequency of signals you’re looking to capture. Our antenna is 16cm.

2.4 GHz = 12.5 cm  / Bluetooth, WLAN

1800MHz = 16 cm / E Netz GSM

900MHz = 33 cm / D Netz GSM

500 MHz = 60 cm / DVBT K24 Television

100 MHz = 150 cm / FM Broadcast

 

Or, you can calculate using the below equation.

λ = c/f = (300,000 km/s) / (900 MHz) = 33.3 cm

Then: antenna length = λ/2 = 16.6 cm
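The same arithmetic as a small Python helper (c is the speed of light, 3 × 10^8 m/s):

C = 3.0e8  # speed of light in m/s

def wavelength_cm(freq_hz):
    # Full wavelength in centimeters.
    return 100 * C / freq_hz

lam = wavelength_cm(900e6)               # 33.3 cm
print(round(lam, 1), round(lam / 2, 1))  # full wave and the 16.6 cm half-wave element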

 

 
