16-375 Work – Robotics for Creative Practice: Student Work
https://courses.ideate.cmu.edu/16-375/f2018/work
Tue, 18 Dec 2018 22:46:33 +0000

The Chairs – Final
Tue, 18 Dec 2018 20:10:07 +0000
https://courses.ideate.cmu.edu/16-375/f2018/work/2018/12/18/the-chairs-final/

The Chairs
Kevin Thies, Nick Richardson, Marisa Lu

The goal of our project was to take a mundane object and give it a slice of humanity. Love is a powerful feeling, so powerful that two everyday objects might try to come together despite how daft the prospect sounds. We're used to chairs moving, but usually by and for humans' sake; these chairs have finally taken charge of their own abilities.

The second video is reminiscent of two dogs meeting each other while their owners talk. Eventually their wonky, awkwardly eager interactions butt into the people’s conversation.

[Could someone add a reflection on the course and its themes as it relates to the project here?]

The project was very involved on both the hardware and software sides. Taking an existing object and thinking about how to augment it to hold specific parts, while remaining as streamlined as possible, is very different from designing a system from scratch. We aren't carpenters; constructing chairs is neither in our skill set nor aligned with the goals of the project. As a result, we had to be considerate of how each chair was constructed, so that as we added to it, the essence and abilities of the chair remained intact.

There were also a lot of choices we had to go back on. On the hardware side, our connectors for the nodding hinge at the backrest of the chair kept breaking: the metal wire would fatigue as it bent in the same place, and the fishing line would come untied. Given another chance, we'd definitely try zip ties there, as they're more rugged and less likely to snap. We also would have used epoxy rather than Gorilla Glue to attach the wooden motor box to the plastic undercarriage, since the one chair glued with epoxy held up well. We also learned to be careful about where we glued the Arduinos to the chair: if one touched a metal screw, it could short-circuit. Making the chairs battery-powered would have solved some issues, and we wouldn't have had to make wheel bumpers (which should have been epoxied on). The hardware lesson is to build with more extensive use in mind: testing moving parts tends to degrade them.

On the software side, we started with a system meant to enable parallel workflows. We thought that was important because we didn't want to leave the software until after the hardware was done; ideally both would develop at the same time, so that as soon as the hardware was ready, the software could be connected, and the rest of the semester could go toward finessing the interaction. The system we initially developed was two JavaScript sites, one running each robot, communicating both with each other and with the hardware (though communication with the physical prototype was never achieved). The JavaScript visualized each robot with simple graphics, so we could test different behaviors without working hardware. The problem with the visualized system was that it kept its own estimates of hard coordinates for each chair, and given the drift we knew would happen no matter how much we calibrated, the system ultimately needed rethinking. The fact that we couldn't get the WiFi module to work was also a huge motivator. In hindsight, designing a system that hinged on a module no one had experience with was not the best decision (though it's a good lesson for the future!).

As soon as we transitioned away from the JavaScript/shiftr system, we needed a new way for the chairs to be aware of each other. That's where the cardinal IR sensor setup on each chair came in: the receptors and beacons let each chair estimate the angle to the other. For the most part, that mutual orientation drove the chairs' respective behaviors. For example, the servo motor script that nodded the 'head' (the backs of the chairs could bob) triggered whenever the amorous chair was looking straight at the other, as if trying to converse (its connecting wire broke the day of the show). The coy chair would often try to spin away from the amorous one.
Without distance sensing, some degree of surprise was built into the behavior: in some instances the chairs would bounce off each other or 'nudge' the other, because the scripted behavior didn't stop just because of a 'collision'.
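The cardinal-IR idea above can be sketched as a small helper. This is our illustration, not the shipped firmware: the function names and the weighted-vector-sum approach are assumptions, but they show how four directional IR intensity readings can be turned into one bearing estimate.

```python
import math

# Hypothetical helper: estimate the bearing to the other chair from four
# cardinal IR receptor readings (front, right, back, left intensities).
# The vector-sum scheme is illustrative, not the project's exact code.
def estimate_bearing(front, right, back, left):
    """Return an angle estimate in degrees (0 = dead ahead, 90 = to the
    right), or None when no beacon signal is detected."""
    x = right - left   # left/right component, weighted by IR intensity
    y = front - back   # front/back component
    if x == 0 and y == 0:
        return None    # other chair not visible to any receptor
    return math.degrees(math.atan2(x, y))
```

A reading dominated by the front receptor yields a bearing near 0, so a "face the other chair" behavior can simply steer this value toward zero.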

On a more specific note about how the final software was structured: it had a series of basic, fundamental moves that took parameters, plus a live global variable taking in sensor info; all of the more complicated moves called on those primitives and were generated according to the orientation global.

Some other aspects of the original JavaScript system were harder than others to carry over. For example, to keep general movement qualities like speed and acceleration smooth while still variable, all of the move functions referenced a global 'mood' value that fluctuated smoothly according to Perlin noise. Choosing among moves also depended on the 'mood' variable: the higher the 'mood', the more the program was supposed to favor moves we deemed more 'eager'. An implementation of Perlin noise was commented out, to be dealt with after an MVP was achieved, but after reaching the MVP with switch-case statements, it didn't seem like the Perlin noise would change much of the noticeable experience.

 

The code:

https://drive.google.com/open?id=1u2O42gcCeZU0H-b6jabw-6z3_A3YqOO0

Corporate Machine – Final
Tue, 18 Dec 2018 19:27:50 +0000
https://courses.ideate.cmu.edu/16-375/f2018/work/2018/12/18/corporate-machine-final/

The Corporate Machine

Cindy Deng, Bolaji Bankole, Alan Turner, and Lucy Scherrer

The artistic goal of “The Corporate Machine” is to provoke thoughts from the viewer regarding the institution of capitalism and the effects it has on the average consumer. The image of an arcade-style claw machine has connotations of a rigged system and something that looks easy to win but is secretly very difficult to do successfully. We used this familiar imagery as a way to set the scene for a conversation about capitalism, since it has similar associations of small returns for many. The constant motion of the robot claw evokes the “rat race” mentality that traps many participants in capitalism, and the piling up of coins to no true end despite a false door promising a “prize” shows how there is no real end or escape to the constant cycle of consumerism. As the machine keeps running, there is also a real possibility (as demonstrated during the class showcase) that it will eventually shake itself off the belts or the shafts will come out of alignment, a conclusion that we believe is fitting to depict an economic system that will someday (and has before) cease to function.

We believe that by building a machine that runs in an endless cycle, never stopping until it itself breaks down yet never reaping the benefit of its efforts, we can get the audience to think about the role of capitalism in our society and its effect on participants.

Link to code:

CorporateMachine

Click here for video

Head in the Clouds – Final
Mon, 17 Dec 2018 21:19:42 +0000
https://courses.ideate.cmu.edu/16-375/f2018/work/2018/12/17/head-in-the-clouds-final/

Head in the Clouds

– Final Documentation –

By Evan Hill, Wade Lacey, and Amber Paige

Inspiration and Course Connection:

This piece was inspired by several Zimoun pieces; we enjoyed the expressive movements derived from simple mechanics. Our goal was to create a unique, individual experience for each user. We aimed to convey a feeling of mesmerizing isolation, similar to how you would feel lying on the ground and looking up at the clouds.

The theme of this course revolved around using software and hardware techniques to create expressive dynamic behaviors, centered on the question: what does it mean to be surprisingly animate? We believe we contributed to this theme by creating a space that is far more animate within the confines of the enclosure than you would assume from looking at it outside. By using simple materials, such as motors and paper, to generate movement, and by creating an immersive environment drawing on sight, sound, touch, and even smell, our piece offered an intriguing and unique user experience.

 

Construction:

To create the robot, we built an enclosure of plywood and 2×4 lumber that encompassed the top half of the viewer and was supported on four legs, also made from 2×4s. Inside, crumpled brown paper was actuated by 40 servo motors. Our actuation system comprised these hobby servos and multiple microcontrollers. Each wall of the enclosure housed 9 panels, each with its own servo, all wired to a Mini Maestro mounted to the side of the corresponding wall. The ceiling also had 4 panels, each connected to the 10th channel of one wall's Mini Maestro. The four Mini Maestros were controlled by a single laptop through a USB port splitter; the Maestro Control Center application was used to write servo positions and loop them throughout the performance. Above the viewer, we mounted LED lights in the corners of the structure to add depth and vary color during the performance. To keep the experience private and surprising, we stapled black canvas to the exterior of the structure. This prevented onlookers from seeing what was happening inside without going in themselves, and kept them focused on their personal experience once inside. Additional crumpled paper was hot-glued between panels to keep the modes of actuation hidden from the user.
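We drove the servos from the Maestro Control Center GUI, but the same Mini Maestro channels can also be set programmatically over the Maestro's serial "Compact Protocol" (command 0x84 = Set Target, with targets in quarter-microseconds). The sketch below is a hedged illustration of that protocol; the port object, port name, and channel numbers are assumptions.

```python
# Hedged sketch: one Maestro "Set Target" command per servo channel.
# Targets are in quarter-microsecond units per the Compact Protocol.
def set_servo_target(port, channel, microseconds):
    """Write a Set Target command for `channel` to any writable serial port."""
    target = int(microseconds * 4)            # Maestro units: 0.25 us
    port.write(bytes([0x84, channel & 0x7F,
                      target & 0x7F, (target >> 7) & 0x7F]))

# Typical use with pySerial (port name is a guess):
#   import serial
#   maestro = serial.Serial("/dev/ttyACM0", 9600)
#   set_servo_target(maestro, 3, 1500)   # center the panel servo on channel 3
```

Scripting the channels this way would make it possible to generate panel motion algorithmically instead of hand-authoring position loops in the GUI.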

The Experience:

The experience began with the viewer approaching the piece as a large, plain enclosure. To enter, they had to crouch down and walk or crawl into the enclosure, or sit below it. Once inside, they were fully surrounded by abstract moving crumpled-paper forms and color-shifting lights. The paper forms twisted and crumpled at differing rates over the course of the viewer's interaction. The space was designed to evoke a different feeling in each user; some felt suspense or were overwhelmed by the constant motion, while others were calmed by the experience. The depth of the space felt greater than the physical dimensions of the enclosure.

Outside view:

Inside view:

 

Feedback:

The feedback we got was varied and fun to hear. While most people had something different to say coming out of the enclosure, almost everyone had the same question before going in: "What's in the box?" Our goal was for the experience to be unique to each user, and the feedback we received shows that we achieved that. Some people were overwhelmed by the sensory stimuli: the sight of the paper, the sound of it crumpling, the feel of the wood and paper, and the smell of laser-cut wood. Others found the experience calming, and some even spent prolonged periods lying beneath the enclosure to look up at it. We heard one user say, "I wish I could sleep in here."

It was really cool to watch people spend leisure time in the enclosure, just having conversations with their friends, during lulls in the show. Additionally, while the experience was initially meant to be private, we found that people enjoyed the shared experience as well. Strangers and friends would get in alongside each other and talk about the piece together, comparing their initial reactions.

The comments we got from people walking out of the box were really fun to hear, ranging from it looking like "Oogie Boogie from The Nightmare Before Christmas" to "the inside of an alien womb," and many more. People were also befuddled by, and interested in, how the piece worked. All in all, we believe we succeeded in creating a mesmerizing experience unique to each user.

 

Kite: Final Documentation
Sun, 16 Dec 2018 23:47:38 +0000
https://courses.ideate.cmu.edu/16-375/f2018/work/2018/12/16/kite-final-documentation/

Martha Cryan, Xin Hui Lim, Tara Molesworth

Our project explored the dynamics of fabric together with wind, using kites as inspiration. Chaotic as well as floaty gestures could arise from small parameter changes, showcasing the stretchiness and lightness of the material. Altogether, the effect of the performance was a dynamic, creature-like kite, flying and diving through the air.

To fly the fabric, 4 motors were fitted with aluminum arms. Each arm was tied at the end with fishing wire, which held a small piece of fabric (like a four-stringed kite) over a horizontal fan. The fan was controlled over DMX, allowing control of the fan speed alongside the arm movement parameters.

The setup:

We used a micro-stepper CNC shield fitted over an Arduino Uno. The CNC shield held 4 stepper drivers, just enough for our purposes. Each stepper motor, with extended wiring, was connected to one of the four ports: X, Y, Z, and A. The Arduino Uno was powered by a 12 V supply. We also connected the fan to a DMX box, which was connected to an ENTTEC DMX/USB interface and plugged into a power source. The USB ports from both the Arduino and the DMX interface were then connected to a Raspberry Pi. Finally, the Raspberry Pi was connected to the laptop which ran the code.

 

We cut 7/32” aluminum tubes into 15” lengths and drilled three 3/32” holes in each “arm”: two for screwing the arm onto the hub connector, which was then secured onto the stepper motor with a small set screw, and one for tying a fishing line whose other end attached to one corner of the fabric.

 

We also laser-cut two wooden stands, painted with black acrylic paint, onto which the fan could be secured.

 

The process:
We had previously experimented with the type of mechanism (spooling, a shared string between two corners), the length of the arms, and fabric sizes. Considerations included whether the arms created enough force, the sound of the stepper motors and fan, and the tone of the piece. Ultimately, we went for a more playful piece with a smaller fabric that had a larger range of motion, rather than a big fabric with more subtle movements, because the single-fan setup made it harder for the viewer to distinguish between different states.

During the experimental phase, we had used 4’-long wooden planks to keep the stepper motors at a fixed spacing, but for the final presentation we secured them onto linoleum blocks, minimizing distraction from the kinetic fabric piece.

 

We had also previously coded a way to quickly change the stepper motor target positions, the stepper motor speeds, and the fan speed by connecting a MIDI/Alias controller to our laptop. It was useful to learn about a hardware interface that could be implemented for experimentation, although we ultimately found it more useful to code the actual performances directly, since it was not hard to translate what we conceptualized for each performance state/behavior into code.

 

The performances:
At the start, all arms move inwards, hitting the ground, then return to the vertical position while the fan speed slowly increases. (State 0)

The code then runs itself in a loop (States 1–10).

State 1: Warming up – Opposite corners take turns moving slowly and slightly.

State 2: Breathing – The fabric is held up vertically, and the arms move slightly inwards and outwards at the same time.

State 3: Walking – All four arms move, with one opposite pair moving inwards and the other moving outwards.

State 4: Parabola – Opposite corners move inwards at the same time, or outwards.

State 5: Jumping – Quick movements, randomly generated.

State 6: Swaying – Opposite corners move slowly in a choreographed manner such that the fabric jumps between corners.

State 7: Resting – The fan speed slowly decreases until just before the fabric falls, then quickly increases.

State 8: Jumping – Quick movements, randomly generated.

State 9: Spiral – Each arm takes turns waving/jerking at high speed.

State 10: Flailing – The arms are held in a fixed position such that the fabric tilts upwards on one side, with only the fan speed changing to create movement in the fabric.

Source Code:


#!/usr/bin/env python

"""\
test_client.py : sample code in Python to communicate with an Arduino running CNC_Shield_Server

Copyright (c) 2015, Garth Zeglin. All rights reserved. Licensed under the terms
of the BSD 3-clause license.

"""

#================================================================
from __future__ import print_function
import argparse
import time

# This requires a pySerial installation.
# Package details: https://pypi.python.org/pypi/pyserial
# Documentation: http://pythonhosted.org/pyserial/
import serial
import numpy as np

# from rtmidi.midiutil import open_midiinput

class MidiInputHandler(object):
    def __init__(self, port, dmx, motors):
        self.port = port
        self._wallclock = time.time()
        self.dmx = dmx
        self.motors = motors
        self.pX = 0
        self.pY = 0
        self.pZ = 0
        self.pA = 0
        self.eventQueue = []
        self.updateTime = time.time()

    ## TO EDIT THE SLIDERS
    ## message is a 3 element array:
    ##   first element doesn't matter
    ##   second element is which slider
    ##   third element is the value
    def __call__(self, event, data=None):
        message, deltatime = event
        self._wallclock += deltatime
        # if deltatime < 0.2:
        #     self.eventQueue.append(event)
        #     return
        # print("[%s] @%0.6f %r" % (self.port, self._wallclock, message))
        # self.updateTime = self._wallclock
        if message[1] == 8:
            self.motors.state = 0
        elif message[1] == 9:
            self.motors.state = 1
        elif message[1] == 10:
            self.motors.state = 2
        elif message[1] == 11:
            self.motors.state = 3
        elif message[1] == 12:  # was a duplicate '== 11', which made this branch unreachable
            self.motors.state = 4

#================================================================
class DMXUSBPro(object):
    """Class to manage a connection to a serial-connected Enttec DMXUSB Pro
    interface. This only supports output.

    :param port: the name of the serial port device
    :param verbose: flag to increase console output
    :param debug: flag to print raw inputs on console
    :param kwargs: collect any unused keyword arguments
    """

    def __init__(self, port=None, verbose=False, debug=False, universe_size=25, **kwargs):
        # Initialize a default universe. This is publicly readable and writable.
        # The Enttec requires a minimum universe size of 25.
        self.universe = np.zeros((universe_size), dtype=np.uint8)

        # Initialize internal state.
        self.verbose = verbose
        self.debug = debug
        self.portname = port
        self.port = None
        self.output = None
        self.input = None
        return

    def is_connected(self):
        """Return true if the serial port device is open."""
        return self.port is not None

    def set_serial_port_name(self, name):
        """Set the name of the serial port device."""
        self.portname = name
        return

    def open_serial_port(self, port):
        """Open the serial connection to the controller."""
        self.port = serial.Serial(port, 115200)
        if self.verbose:
            print("Opened serial port named", self.port.name)

        # save separate copies of the file object; this will ease simulation using other sources
        self.output = self.port
        self.input = self.port
        return

    def flush_serial_input(self):
        """Clear the input buffer."""
        if self.input is not None:
            self.input.flushInput()

    def close_serial_port(self):
        """Shut down the serial connection, after which this object may no longer be used."""
        self.port.close()
        self.port = None
        return

    def send_universe(self):
        """Issue a DMX universe update."""
        if self.output is None:
            print("Port not open for output.")
        else:
            message = np.ndarray((6 + self.universe.size), dtype=np.uint8)
            message[0:2] = [126, 6]                    # Send DMX Packet header
            message[2] = (self.universe.size+1) % 256  # data length LSB
            message[3] = (self.universe.size+1) >> 8   # data length MSB
            message[4] = 0                             # zero 'start code' in first universe position
            message[5:5+self.universe.size] = self.universe
            message[-1] = 231                          # end of message delimiter

            if self.debug:
                print("Sending: '%s'" % message)
            self.output.write(message)
        return

    def speed_change(self, speed):
        print("dmx changing speed to " + str(speed))
        self.universe[0] = speed       # was 'dmx.universe', which relied on the global instance
        self.universe[2] = speed - 50
        self.send_universe()

#================================================================
class CncShieldClient(object):
    """Class to manage a connection to a CNC_Shield_Server running on a
    serial-connected Arduino.

    :param port: the name of the serial port device
    :param verbose: flag to increase console output
    :param debug: flag to print raw inputs on console
    :param kwargs: collect any unused keyword arguments
    """

    def __init__(self, port=None, verbose=False, debug=False, **kwargs):
        # initialize the client state
        self.arduino_time = 0
        self.position = [0, 0, 0, 0]
        self.target = [0, 0, 0, 0]
        self.verbose = verbose
        self.debug = debug
        self.awake = False

        # open the serial port, which should also reset the Arduino
        self.port = serial.Serial("/dev/ttyACM0", 115200, timeout=5)
        # self.port = serial.Serial("/dev/tty.usbmodem1421", 115200, timeout=5)
        # self.port = serial.Serial("COM4", 115200, timeout=5)

        if self.verbose:
            print("Opened serial port named", self.port.name)
            print("Sleeping briefly while Arduino boots...")

        # wait briefly for the Arduino to finish booting
        time.sleep(2)  # units are seconds

        # throw away any extraneous input
        self.port.flushInput()
        return

    def close(self):
        """Shut down the serial connection to the Arduino, after which this object may no longer be used."""
        self.port.close()
        self.port = None
        return

    def _wait_for_input(self):
        line = self.port.readline().rstrip().decode('utf-8')

        if line:
            elements = line.split(' ')
            if self.debug:
                print("Received: ")
                print(elements)
                print("Position:")
                print(self.position)

            if elements[0] == 'txyz':
                self.arduino_time = int(elements[1])
                self.position = [int(s) for s in elements[2:]]

            elif elements[0] == 'awake':
                self.awake = True

            elif elements[0] == 'dbg':
                print("Received debugging message:", line)

            else:
                if self.debug:
                    print("Unknown status message: ", line)

        return

    def _send_command(self, string):
        if self.verbose:
            print("Sending: ", string)
        self.port.write(str.encode(string + '\n'))
        self.port.flushOutput()
        self.port.flushInput()
        return

    def motor_enable(self, value=True):
        """Issue a command to enable or disable the stepper motor drivers."""
        self._send_command("enable 1" if value is True else "enable 0")
        return

    def wait_for_wakeup(self):
        """Issue a status query and wait until an 'awake' status has been received."""
        while self.awake is False:
            self._send_command("ping")
            self._wait_for_input()

    def move_to(self, position):
        """Issue a command to move to an [x, y, z, a] absolute position (specified
        in microsteps) and return immediately; the wait-for-completion loop is
        commented out.

        :param position: a list or tuple with four elements
        """
        self._send_command("goto %d %d %d %d" % tuple(position))
        # self.target = position
        # while self.position[0] != position[0] or self.position[1] != position[1] or self.position[2] != position[2] or self.position[3] != position[3]:
        #     try:
        #         self._wait_for_input()
        #     except:
        #         print("Error reading!!")
        #     if self.verbose:
        #         print("Position:", self.position)
        # self.moving = False
        return

    def speed_change(self, speed):
        self._send_command("sc %d %d %d %d" % (speed, speed, speed, speed))

#================================================================

# The following section is run when this is loaded as a script.
if __name__ == "__main__":

    # Initialize the command parser.
    parser = argparse.ArgumentParser(description="""Simple test client to send data to the CNC_Shield_Server on an Arduino.""")
    parser.add_argument('-v', '--verbose', action='store_true', help='Enable more detailed output.')
    parser.add_argument('--debug', action='store_true', help='Enable debugging output.')

    # Parse the command line, returning a Namespace.
    args = parser.parse_args()

    dmx = DMXUSBPro(**vars(args))
    dmx.open_serial_port("/dev/ttyUSB0")
    # dmx.open_serial_port("/dev/tty.usbserial-EN199298")

    client = CncShieldClient(**vars(args))
    client.moving = False

    print("Waiting for wakeup.")
    client.wait_for_wakeup()

    print("Beginning movement sequence.")
    client.motor_enable()

    # Begin the lighting sequence. This may be safely interrupted by the user pressing Control-C.
    try:
        print("Beginning lighting sequence.")

        speed = 150
        direction = 1
        motorspeed = 100
        posX = posY = posZ = posA = 0
        client.state = 9
        new = [0, 0, 0, 0]
        count = 0
        x = 200
        client.move_to([0, 0, 0, 0])

        # reset position and slowly increase fan speed
        seq0 = [[0,0,0,0],[-80,80,80,-80],[0,0,0,0],[0,0,0,0],[0,0,0,0],[0,0,0,0],[0,0,0,0],[0,0,0,0],[0,0,0,0],[0,0,0,0]]
        fan0 = [100,100,100,120,140,150,160,170,160,140]

        # breathing low
        seq1 = [[-20,20,20,-20],[-20,20,20,-20],[-10,10,10,-10],[-10,10,10,-10],[10,-10,-10,10]]
        speed1 = [20,20,2,2,2]
        fan1 = [100,100,100,120,120]

        # breathing high
        seq4 = [[-50,50,50,-10],[0,0,0,15],[30,-30,-30,15],[0,0,0,-15],[30,-10,-10,30],[0,0,0,-15],[30,-30,-30,15],[0,0,0,-15],[30,-10,-10,15],[0,0,0,0],[30,-10,-10,30]]
        speed4 = [50,50,25,25,25,25,25,25,25,25,25,25]
        fan4 = [160,160,150,150,140,140,130,130,120,120,140,140]

        # walking
        seq3 = [[40,60,60,30],[-20,-10,-10,-30],[40,60,60,30],[-20,-10,-10,-30],[40,60,60,30],[-20,-10,-10,-30]]
        speed3 = [10,30,30,30,10]
        # fan3 = [150,150,140,140,120,180]
        time3 = [5,3,2,3,5]
        fan3 = [140,180,140,180,140,180]

        # x corner
        seq2 = [[30,0,0,0],[0,0,0,0],[-30,0,0,0],[0,0,0,0],[30,0,0,-30],[0,0,0,-30],[-30,0,0,-30],[0,0,0,-30],[20,0,0,-30],[0,0,0,-30],[-10,0,0,-30],[0,0,0,-30]]
        # speed4 = [20,20,20,20,20,20,20,20,20,20,20,20]
        speed2 = [50,30,30]
        fan2 = [120,120,120,120,120,120,120,120,120,120,120,120]

        # moving quickly between corners, x & a
        seq5 = [[-20,0,0,20],[0,0,0,0],[40,0,0,-40],[0,0,0,0],[-40,0,0,40],[0,0,0,0],[60,0,0,-60],[0,0,0,0],[60,0,0,-60],[0,0,0,0],[60,0,0,-60],[20,0,0,20],[40,-20,0,40],[40,0,0,40],[40,-40,-10,40],[40,0,-20,40],[40,40,40,40],[40,0,0,40],[40,60,-60,40],[40,0,0,40],[40,60,-60,40],[40,0,0,40],[40,60,-60,40],[40,0,0,40]]
        speed5 = [40,100,40]
        fan5 = [140,140,140,140,140,140,120,120,120,120,120,120,140,140,140,140,140,140,120,120,120,120,120,120]

        # falling and getting up
        seq7 = [[-80,80,80,-80],[0,0,0,0],[0,0,0,0],[0,0,0,0],[0,0,0,0],[0,0,0,0]]
        speed7 = [50,5,5,5,5,5]
        fan7 = [200,200,160,120,120,100]

        # fan speed change only
        seq10 = [[-20,20,60,-60],[-20,20,60,-60],[-20,20,60,-60],[-20,20,60,-60],[-20,20,60,-60],[-20,20,60,-60]]
        fan10 = [80,110,140,80,110,140]

        # motion directly dependent on fan speed
        while True:

            if client.state == 0:
                print("-------------state0-------------")
                client.speed_change(150)

                for i in range(len(seq0)):
                    dmx.speed_change(fan0[i])
                    client.move_to(seq0[i])
                    time.sleep(i*0.5)
                    # maybe time.sleep can vary with sensor input

                client.state = 6

            if client.state == 1:
                print("-------------state1-------------")
                client.speed_change(20)
                # falling

                for j in range(3):
                    for i in range(len(seq1)):
                        dmx.speed_change(fan1[i])
                        client.speed_change(speed1[i])
                        client.move_to(seq1[i])
                        time.sleep(i*2)

                client.state = 2

            # moving between corners
            if client.state == 2:
                print("-------------state2-------------")
                for j in range(len(speed2)):
                    client.speed_change(speed2[j])
                    for i in range(len(seq2)):
                        dmx.speed_change(fan2[i])
                        client.move_to(seq2[i])
                        time.sleep(2)

                client.state = 3

            # walking
            if client.state == 3:
                print("-------------state3-------------")
                for j in range(len(speed3)):
                    client.speed_change(speed3[j])
                    for i in range(len(seq3)):
                        dmx.speed_change(fan3[i])
                        client.move_to(seq3[i])
                        time.sleep(time3[j])

                client.state = 4

            # all moving in slightly / breathing
            if client.state == 4:
                print("-------------state4-------------")
                for j in range(3):
                    for i in range(len(seq4)):
                        dmx.speed_change(fan4[i])
                        client.speed_change(speed4[i])
                        client.move_to(seq4[i])
                        time.sleep(1)
                client.state = 8

            if client.state == 5:
                print("-------------state5-------------")

                client.move_to([0,0,0,0])
                for i in range(50):
                    client.speed_change(200)
                    client.move_to([-np.random.randint(50),np.random.randint(80),np.random.randint(80),-np.random.randint(50)])
                    time.sleep(0.5)

                client.move_to([0,0,0,0])
                client.state = 6

            # low fan, quick movement
            if client.state == 6:
                print("-------------state6-------------")
                for j in range(len(speed5)):
                    client.speed_change(speed5[j])

                    for i in range(len(seq5)):
                        dmx.speed_change(fan5[i])
                        client.move_to(seq5[i])
                        time.sleep(0.5)
                        # time.sleep(np.random.random_sample()*3)
                client.state = 7

            if client.state == 7:
                print("-------------state7-------------")

                client.move_to([0,0,0,0])
                for i in range(50):
                    client.speed_change(200)
                    client.move_to([-np.random.randint(50),np.random.randint(80),np.random.randint(80),-np.random.randint(50)])
                    time.sleep(0.5)

                client.move_to([0,0,0,0])
                client.state = 8

            if client.state == 8:
                print("-------------state8-------------")
                for i in range(len(seq7)):
                    dmx.speed_change(fan7[i])
                    client.move_to(seq7[i])
                    time.sleep(4)
                client.state = 9

            if client.state == 9:
                print("-------------state9-------------")
                # get it to fall
                dmx.speed_change(200)
                client.speed_change(200)
                client.move_to([0,0,0,0])
                time.sleep(0.5)
                client.move_to([-80,80,80,-80])
                time.sleep(1)
                dmx.speed_change(10)
                time.sleep(3)
                fanSpeed = 10
                # increase fan speed
                while fanSpeed < 200:
                    fanSpeed += 10
                    dmx.speed_change(fanSpeed)
                    time.sleep(0.5)

                # spiral movements progressively get wider
                pos = 0
                dmx.speed_change(140)
                for i in range(15):
                    # X moves
                    client.move_to([-5*i,0,0,0])
                    time.sleep(0.25)

                    # Y moves
                    client.move_to([0,5*i,0,0])
                    time.sleep(0.25)

                    # A moves
                    client.move_to([0,0,0,-5*i])
                    time.sleep(0.25)

                    # Z moves
                    client.move_to([0,0,5*i,0])
                    time.sleep(0.25)

                client.state = 10

            if client.state == 10:
                print("-------------state10-------------")
                # client.speed_change(100)
                # client.move_to([-80,80,80,-80])
                time.sleep(1.5)
                client.speed_change(20)

                for j in range(3):
                    for i in range(len(seq10)):
                        dmx.speed_change(fan10[i])
                        client.move_to(seq10[i])
                        time.sleep(2.5)
                client.state = 1

    except KeyboardInterrupt:
        client.move_to([0,0,0,0])
        print("User interrupted motion.")

    # Close the port. This will not stop the fan if it is still running.
    dmx.close_serial_port()

    # Issue a command to turn off the drivers, then shut down the connection.
    client.motor_enable(False)
    client.close()

Anti-Drawing Machine Final
Thu, 13 Dec 2018 20:44:29 +0000
https://courses.ideate.cmu.edu/16-375/f2018/work/2018/12/13/anti-drawing-machine-final/

The Anti-Drawing Machine

Harsh, Akshat, Soonho

Video

Final Setup

For our final exhibition, we perfected our movement and sealed everything inside of a clean box. We wanted to have paper available for people to take and draw on, so we cut 30 sheets of 18″ x 18″ paper. The lighting was set so that all the focus was on the paper.

We wanted to embrace this “child-like” aesthetic because of how the drawings looked, so we chose to use Crayola markers!

We also chose to stick up all the drawings on the back. There was an interesting effect where the back wall started to have that “refrigerator drawings” feeling.

Observation

It was really interesting to observe how different people used the machine differently. Most people made very gestural and pattern-based drawings, but other people treated it like a game where they were determined to finish their original drawings.

There were some collaborative drawings

And some innovation/experimentation with different marker combinations.

There were many strokes and effects that could not have been created without this machine!

Feedback

There was a spectrum of feedback, from mild frustration to wild enjoyment. Most participants did not recognize any sort of logic in the robot's motion unless they specifically asked about it, largely because their focus was on the drawing itself.

Ultimately, everyone who tried the machine told us they had a fun time and were extremely pleased to have created a drawing that is not necessarily perfect. The machine seemed to relieve the expectation of drawing with extreme craft, and it "leveled the playing field" for all participants.

Code

]]>
To Infinity and Beyond: Mini Project 2 https://courses.ideate.cmu.edu/16-375/f2018/work/2018/11/18/to-infinity-and-beyond-mini-project-2/ Sun, 18 Nov 2018 18:41:25 +0000 https://courses.ideate.cmu.edu/16-375/f2018/work/?p=961 To Infinity And Beyond

Goals:

My goal for the robot is to have the arm repeatedly trace an infinity symbol with the tip of its elbow. I chose this movement to represent infinity, a concept that I personally find interesting. Using the arm and the endless motion itself, I intended it to echo René Magritte's famous painting The Treachery of Images, the one with a pipe and a caption claiming that it is not a pipe. The meta message here is that the arm is trying to draw the infinity symbol, but at the same time the motion itself is already a representation of the concept of infinity.

Outcomes

I went through the process of finding the right angle between the arm and the elbow so that I could control the shape traced by its endpoint. I also made some modifications to the target to balance out the sizes of the two lobes of the infinity sign.
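The figure-eight path itself can be sampled as a lemniscate of Bernoulli. A sketch of how target points might be generated for the controller (the function name, parameters, and choice of parametrization are my own, not taken from the attached double_pendulum files):

```python
import math

def lemniscate_targets(a=1.0, n=100):
    """Sample n points along a lemniscate of Bernoulli (the infinity
    symbol), using x = a*cos(t)/(1+sin^2 t), y = a*sin(t)*cos(t)/(1+sin^2 t).
    The half-width `a` would be chosen to fit the arm's workspace."""
    points = []
    for k in range(n):
        t = 2 * math.pi * k / n
        d = 1 + math.sin(t) ** 2
        points.append((a * math.cos(t) / d,
                       a * math.sin(t) * math.cos(t) / d))
    return points
```

Scaling `a` differently on each side is one way to rebalance the two lobes, as described above.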

Files:

double_pendulum

]]>
Project Work Plan—Anti-Drawing Machine* https://courses.ideate.cmu.edu/16-375/f2018/work/2018/10/09/project-work-plan-anti-drawing-machine/ Tue, 09 Oct 2018 19:42:36 +0000 https://courses.ideate.cmu.edu/16-375/f2018/work/?p=937 1.  Concept Statement

Title:
Anti-Drawing Machine / Haunted Drawing Board

Artistic Motivation: 
The practice of drawing itself is an act of human creativity and expression—it can be highly linked to one’s unique identity. We want to intervene and add another character in the drawing experience by disrupting it with an unexpected, seemingly mischievous character that reduces the precise control that the artist has over their canvas. We are extremely curious to explore how dynamic surfaces can lead to different manifestations of art.

Narrative Description:

The robot itself is a drawing table, and it would have a stack of ordinary 8.5″ x 11″ paper sitting nearby. The participant would be invited to take a single sheet and place it on the machine, then prompted to draw ___

Justification:

To us, a big part of this class is gaining an understanding of how the characteristics of robotics and embodied behavior can be used to intrigue and alter human behavior. We believe that by taking a paradigm that is already established (the practice of drawing) and interfering with it, we can highlight and ascribe character to this robot.
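As a concrete illustration of "reducing the precise control that the artist has over their canvas": one possible disruption policy, written as a hedged sketch rather than the group's actual algorithm, shifts the paper whenever the camera sees the pen stray from the center of the sheet:

```python
def evasive_step(pen_xy, center=(0.0, 0.0), gain=0.5):
    """Given a pen position detected by the camera (in paper coordinates),
    return a table move that shifts the paper so the stroke lands
    off-target. The gain and the move-along-the-offset policy are
    illustrative guesses, not the project's implementation."""
    dx = pen_xy[0] - center[0]
    dy = pen_xy[1] - center[1]
    # Shift the paper in the same direction as the pen's offset,
    # exaggerating whatever stroke the artist is attempting.
    return (gain * dx, gain * dy)
```

The returned offsets would then be converted into step counts for the two stepper motors.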

2. Sketches

3. Critical Path Analysis

4. Schedule of Milestones

1. Detailed CAD drawings for part fabrication: This weekend [10/12 – 10/14]

2. Software architectural design: Through next week [10/15 – 10/20]

3. Purchasing orders: ASAP [10/21-10/22]

4. Proof-of-concept demonstration: After next weekend [10/26-10/28]

5. Prototype. The weekend after [11/9 – 11/11]

6. Behavior demonstration. Two weekends after [11/30]

7. Artifact. ASAP after 11/30, working around finals

 

5. Bill of Materials

-Stepper motor (2) [link]
-CMUcam (2) [link]
-Garth’s CV-enabled Camera
-Stepper Motor Anchor (2) [Laser Cut 6mm Plywood]
-Drawing Base (1) [Laser Cut 6mm Plywood]
-CV Camera Holder (1) [Laser Cut 6mm Plywood]
-Miscellaneous Nuts and Bolts
-Wood Primer (2)
-Spray Paint (2)
-Raspberry-pi Stepper and DC Hat [link]
-Raspberry-pi

6. Draft Budget – $200

Plywood Sheets – $30
2 * CMUCAM – $100
Raspberry Pi – $40
Stepper Motor – $20
Pi Hat – $30
Misc – $30

7. Proof of Concept

]]>
Project Plan: The Corporate Machine™ https://courses.ideate.cmu.edu/16-375/f2018/work/2018/10/09/project-plan-the-corporate-machine/ Tue, 09 Oct 2018 18:18:27 +0000 https://courses.ideate.cmu.edu/16-375/f2018/work/?p=929 Robotics for Creative Practice: Final Project

The Corporate Machine™ : Alan Turner, Cindy Deng, Bolaji Bankole, Lucy Scherrer

We started brainstorming by examining the concept of an imperfect robot. We were interested in exploring what it means if the completion of the robot’s task is intrinsic to its own destruction, or if the task itself seems pointless. We felt there were many layers of meaning to a robot that was clearly programmed by someone to do a task, but whose task was either detrimental to the robot’s physical well-being, physically impossible for the robot to complete, or so pointless that it makes the viewer question the reasoning behind the robot’s conception.

In thinking about the form of the robot, we also started to think about what message we could share by creating a pointless or self-destructive robot. This concept lends itself to talking about things that are cyclical, self-inflicted, and oppressive. We discussed the idea of voluntarily buying into a destructive process, as the robot would be “willingly” (or under the illusion of free will) performing this task. This made us think of the art piece “Capitalism Works for Me” displayed at CMU last year by artist Steve Lambert.

Inspiration:

We drew our inspiration from the concept of machines that either can’t perform the required task or that comment on the empty or meaningless nature of their task in its execution or termination. An example is a recent work by the artist Banksy: a painting fitted with a shredder in its frame so that it would self-destruct upon being bought, which it was for $1.4 million.

When we talked about self-destructive cycles, an idea that quickly came up was carnival claw games. They are enticing and seem potentially very rewarding, but most of them are either scams or simply very difficult. Nevertheless, most of us have at least once put money in the machine only to predictably walk away with nothing. We would like to use the familiar image of a claw machine, as a representation of something that is tempting but ultimately a waste, to talk about the vicious cycle of capitalism.

User story / experience description

The user will first notice the familiar form and visual aesthetic of an arcade claw machine. They will then realize that the machine is filled with money tokens, and that the claw is automated with the preset task of winning one of those tokens. Picking up a token will likely be somewhat difficult, producing a performance of the machine working hard to accomplish its given task. Once the claw finally picks up a coin, it will drop it into the chute that usually delivers the prize to the player. From there, the coin will roll down a curved ramp that wraps around from the back to the front of the machine and will eventually be deposited right back into the original money slot, starting the machine’s cycle over again. The way the coin drops into the chute and rolls down the slide to be inserted back into the slot is a big part of the machine’s performance, as it is the transition between what will probably be long, tedious attempts to pick up a coin.
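The cycle described above can be summarized as a small state machine. The state names and transition rule below are an illustrative reduction, not the group's implementation:

```python
# The claw's self-feeding loop: seek a token, grab it (possibly failing),
# lift it, drop it down the chute, let it roll down the ramp, and reinsert
# it into the money slot, which restarts the cycle.
CYCLE = ["seek", "grab", "lift", "drop", "ramp", "reinsert"]

def next_state(state, grab_succeeded=True):
    """Advance the claw through its loop; a failed grab returns to
    seeking, mirroring how difficult the pickup is meant to look."""
    if state == "grab" and not grab_succeeded:
        return "seek"
    i = CYCLE.index(state)
    return CYCLE[(i + 1) % len(CYCLE)]
```

Because "reinsert" wraps back to "seek", the machine has no terminal state, which is the point.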

Justification

By using the familiar concept of a claw machine and copying its brightly colored arcade aesthetic, we believe we can create a piece that facilitates a conversation about capitalism and everyone’s role in it. The performance created by the machine’s difficulty in picking up a coin, and by its persistence in completing a task that only sets up the next repetition of the same task, gives us an interesting motion that could generate important conversations.

 

Required Resources

We will build the box from wood and acrylic. We will program the claw controller with an Arduino, and use pneumatics to drive the movement of the claw and gantry.

We have found tutorials that detail how to build a functioning claw machine.

Online Reference Tutorials

1) All cardboard: https://www.youtube.com/watch?v=16MVPbX2D1M

  • Uses only cardboard
  • Uses colored liquid and a syringe
  • Simple and cheap
  • Human operated
  • We could copy its claw design to save money

2) Instructable Arduino machine:

 

3) Instructable claw machine:

Complete Inspiration/Concept Doc: https://docs.google.com/document/d/1IaAgcEvQSAPElhGsyK4iajJoTBz6xqqf5CcQ7bHVdUs/edit?usp=sharing

 

Tentative Bill of Materials: https://docs.google.com/spreadsheets/d/1eVIVccV6ddtWG9GdTNytcZu_d2M-6q04A1epIoKyB3g/edit?usp=sharing

Tentative Schedule: https://docs.google.com/spreadsheets/d/1a-DKMc-WEbsHTBH2toEKTikKpcTGsSJdaxbFZQ8czFw/edit?usp=sharing

 

]]>
Project Work Plan: Voice Air Balloon https://courses.ideate.cmu.edu/16-375/f2018/work/2018/10/09/project-work-plan-voice-air-balloon/ Tue, 09 Oct 2018 16:11:00 +0000 https://courses.ideate.cmu.edu/16-375/f2018/work/?p=915 An artistic motivation for the piece
We are interested in creating an interactive and immersive experience for the user. We want to encourage any user to make sounds and control their experience on the inside, while viewers on the outside get to watch as the balloon with human feet expands, falls, twists, and emanates sound.

The user will crawl into the relaxed balloon (at “rest” the balloon spins slowly in one direction for some distance x, and once it reaches x it slows to zero and turns back in the opposite direction toward negative x) and find a small microphone hanging from the top of the fabric. When the microphone picks up sound from the user, the balloon will be pulled up by a string. When the fabric is fully extended upward, it will be released and fall like a parachute. The voice can also control the speed of rotation, and depending on where the balloon is on its path of travel, the user may see it slow to zero and then start spinning the other way.
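One way to realize the mapping described above, sketched with made-up calibration constants (the amplitude and frequency ceilings, maximum winch height, and maximum rotation speed are placeholders, not measured values):

```python
def balloon_commands(amplitude, frequency,
                     max_height=1.0, max_rpm=30.0,
                     amp_ceiling=100.0, freq_ceiling=2000.0):
    """Map microphone readings to a winch height (0..max_height, in
    arbitrary units) and a rotation speed (0..max_rpm). Louder sounds
    pull the balloon higher; higher-pitched sounds spin it faster.
    Both mappings are clamped linear ramps for illustration."""
    height = max_height * min(amplitude / amp_ceiling, 1.0)
    rpm = max_rpm * min(frequency / freq_ceiling, 1.0)
    return height, rpm
```

A real controller would also smooth these commands over time so the fabric does not jerk on every transient.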

A narrative description of what the viewer or participant might experience
A rotating curtain-like suspended fabric catches the viewer’s attention – there seems to be a certain rhythm to its twisting, giving it an animate quality. He watches for a minute or so – it twists itself by rotating continually in one direction, before untwisting, and twisting in the other direction, repeating over time but the untwisting seems to be different each time. He decides to move closer to the piece. Someone invites him and his friend to interact with the piece by going under the suspended fabric – his friend dares him to do it. As the fabric’s movement slows, he bends down and crawls under it. On the inside of the fabric, he feels safe. He touches the fabric. He sees a sort of microphone or sound sensor hanging above him. Letting out basic sounds, he slowly increases the amplitude of his voice. The fabric seems to bounce up and down more dramatically when he’s louder, and rotates more quickly when he lets out higher frequency sounds, creating a “ballooning” effect as it continues to rotate clockwise/anticlockwise. His friend talks to him from outside, prompting him to clap and sing, taking a video in the meantime. As the fabric slows down, he pushes himself out, telling his friend to try it as well.

A brief justification how the project relates to the course themes

We are interested in exploring the idea of imprecision in relation to kinetic movement. The piece has a number of controlled inputs and variables (rotation, vertical translation) such that it moves mechanically and rhythmically with the application and non-application of forces, but it also interacts with “natural” forces like gravity and the stretchiness and springiness of its materials, such that the performance is never the same.

The material properties of fabric allow it to be manipulated in many ways by the human body, but limiting it to fewer degrees of freedom of a machine could emphasize and further investigate a certain subset of behaviors. The interaction of the participants will further enhance the “animating” of the piece, where fabric interprets the participant’s interaction into a performance.

The machine could learn to maximize the “ballooning” effect (finding the optimal height and velocity of the top suspension point for the corresponding rotational velocity/state), with a handful of accelerometers built into the fabric.
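A learning loop like the one imagined above could be as simple as random hill-climbing over the suspension parameters, with the accelerometer readings supplying the score. A toy sketch, with all names, step sizes, and iteration counts invented for illustration:

```python
import random

def optimize_ballooning(score, init, step=0.1, iters=200, seed=0):
    """Random hill-climbing over a parameter vector such as
    (suspension height, angular velocity). `score` would, in practice,
    be a ballooning measure computed from the fabric accelerometers."""
    rng = random.Random(seed)
    best, best_score = list(init), score(init)
    for _ in range(iters):
        # Perturb the current best parameters and keep any improvement.
        cand = [v + rng.uniform(-step, step) for v in best]
        s = score(cand)
        if s > best_score:
            best, best_score = cand, s
    return best, best_score
```

Each candidate would need a settling period on the real hardware before its score is read, so iterations would be slow but the logic stays this simple.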

A description of the physical resources required: props, materials, mechanisms, electronics, computation
Fabric – opaque, maybe patterned. Soft, smooth, reflects sound, light – able to “balloon”.
If top-suspended only, include decorations along the fabric to vary the weight.
If rotated from the bottom,
Sound sensor
Rope
Gear / servo motor

We have a few ideas for the mechanics of how the fabric is supported and manipulated.

  1. Suspended from the top
  2. Rotating hula hoop at the bottom
  3. Orbital pendulum https://www.youtube.com/watch?v=QVquCtrCdq

Critical Path Analysis / Project Graph

Bill of Materials

Proposed schedule 

https://docs.google.com/spreadsheets/d/1rfH8pVIE8v9PF2JGKrPwE4ix5qlqeFfJzs4Sn9cF5a0/edit#gid=17306459

]]>
Project Work Plan: Head in the Clouds https://courses.ideate.cmu.edu/16-375/f2018/work/2018/10/07/project-work-plan-head-in-the-clouds/ Mon, 08 Oct 2018 00:30:17 +0000 https://courses.ideate.cmu.edu/16-375/f2018/work/?p=906 Project Work Plan: Head in the Clouds
Wade Lacey, Amber Paige, Evan Hill

Our final project is titled “Head in the Clouds”. This piece was inspired by a work by Zimoun; we enjoyed the expressive movements derived from simple mechanics. Our goal for the piece is to create a unique and individual experience for each user. We hope to convey a feeling of mesmerizing isolation, similar to how you would feel lying on the ground and looking up at the clouds.

The viewer will first see the piece as a large plain enclosure from the outside. As they begin their interaction, they will crouch down and walk/crawl into the enclosure. Once inside they will be fully surrounded by abstract paper forms and lights that are still and seemingly inanimate. The user will either grab a sensing handle or put on a torso strap so that their breathing can be monitored. Once the user is registered by the device, the paper forms and diffuse lights will begin to expand and glow in time with the user’s breathing. The space will be reactive in a way that creates a feeling of deep connection with the user. The surroundings will almost seem like an extension of the user’s own body, expanding and contracting as the user breathes and creating a space with a constantly changing shape, size, and color. The depth of the space will almost seem greater than the physical dimensions of the enclosure.

The theme of this year’s course revolves around using software and hardware techniques to create expressive dynamic behaviours, while encompassing the following question: what does it mean to be surprisingly animate? We believe we contribute to this theme by creating a space that is more animate within the confines of the enclosure than you would assume by looking at it from the outside. By using simple materials, such as motors and paper, to create movement, and by linking that movement to the user, our piece aims to develop a relationship between the user and the space.

To create this robot we will build a plywood enclosure that will surround the top half of the viewer and be supported on 4 legs. Inside we will have lots of paper (typical white paper as well as tissue paper) that will be actuated. For this we need the paper and an actuation system composed of hobby servo motors and multiple microcontrollers. To actuate the paper in a more animate way, we will create structures that are moved by the motors and themselves push/pull on the paper in different ways. To build this component we will need plywood laser cut into the desired designs. Behind the paper, along some areas of the enclosure walls, we will have LED strip lights to add more depth. The sensor system in our robot will be a breath sensor: a chest strap worn by the person experiencing our piece, whose input will feed into our microcontrollers to affect the movement of the paper.
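A hedged sketch of how a raw breath-sensor reading might be turned into servo and LED commands on the microcontrollers; the baseline, span, and servo range below are placeholder values, not measurements from the actual chest strap:

```python
def breathing_drive(sensor_value, baseline=512, span=200,
                    servo_min=30, servo_max=150):
    """Turn a raw chest-strap ADC reading into a servo angle (degrees)
    and an LED brightness (0..255). Calibration constants are
    illustrative placeholders."""
    # Normalize breath depth to 0..1 around the resting baseline,
    # clamping so noise below baseline or past full inhale is ignored.
    depth = max(0.0, min((sensor_value - baseline) / span, 1.0))
    angle = servo_min + depth * (servo_max - servo_min)
    led = int(255 * depth)
    return angle, led
```

Driving every paper mechanism from the same `depth` value is what would make the whole space appear to breathe with the user.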

The two most important parts of our project, the ones that will determine its overall success, are sensing the user’s breathing and the mechanism that actuates the paper forms. Once these are designed, built, and tested, we simply need to scale the mechanism to fill the entire enclosure. We plan to first experiment with paper to determine how we will make our forms and what mechanism will create the expanding and contracting movement. Next, we will integrate a single mechanism with the sensor to ensure we can create a sense of connection by monitoring the user’s breathing. The next step will be to design the enclosure structure so that it is easy to attach the paper actuation mechanisms throughout and to run the wiring. Our structure also needs to be easy to disassemble so that we can move it to the final gallery show. Once the structure is built and we have a working actuation design and sensing system, we will implement the actuation mechanism at scale.

A more detailed path analysis can be seen in the schedule linked below. Our B.O.M. and budget is linked below as well.

Schedule

B.O.M. and Budget

 

]]>