Robotics for Creative Practice – Surprisingly animate machines
https://courses.ideate.cmu.edu/16-375/f2022

Dinner Partner – Final Report – Matt & Karen
https://courses.ideate.cmu.edu/16-375/f2022/2022/12/11/dinner-partner-final-report-matt-karen/
Mon, 12 Dec 2022

Project Narrative

Our performance centers on a person who has used their knowledge of robotics to build a dinner companion, a machine that fills the void left by loneliness and by their inability to sit alone with their own thoughts while they share a meal. The objective was for the audience to see how occupied by, and content with, this fictitious partner the performer is. Their reliance on each other builds sympathy for the performer: the audience recognizes how lonely they are and that they would rather eat with an invisible companion than eat alone. We wanted the audience to reflect on their own lives and ask what they would do if they were truly alone. Would they be able to sit with their own thoughts? Or are distractions such as TV, social media, or family essential to occupy them during an otherwise lonely experience? People who struggle with mental health or stress may find distractions comforting in lonely times, and the piece invites the audience to wonder what else is going on in the performer's life that they cannot see.

Relationship to Course Themes

Our project connects with the course themes of using autonomy as a narrative medium and exploring the relationship between human and robot in a hybrid performance. In terms of autonomy, the fork and knife behavior is completely autonomous: based on a random value, either the fork, the knife, or both move back and forth at a constant interval. This autonomous movement keeps the show going and eliminates downtime between the elements that the performer actuates. Lifting the chalice and rotating the Lazy Susan were triggered by foot pedals that the performer pressed under the tablecloth. This created a sense of mystery, since it was not always clear what was automated and what was not.

The robot and the human performer interacted by 'passing food' via the Lazy Susan, by toasting each other's chalices in a "cheers," and through the performer smiling at the invisible person across the table, showing the comfort the robot brought them.

By using motors and the clear acrylic as part of the illusion, we aimed to convert simple robotic movements into meaningful personifications of using utensils and drinking out of a chalice.

Outcomes

Mechanical Design and Assembly:

To sustain the illusion, we hid the robot's electrical components beneath a false tabletop. Once we determined which table we were using for the performance, we created a CAD assembly of the entire tabletop and Lazy Susan. This helped us visualize the scene and catch design mistakes before fabricating. Various geometric and trigonometric calculations were required to size the cutouts for the acrylic arm attachments.

Final CAD assembly of False Tabletop and Lazy Susan
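
As a rough illustration of the kind of cutout-sizing calculation mentioned above (all numbers below are placeholders, not the dimensions from our CAD model), the slot length for an arm pivoting below the tabletop can be estimated from the pivot depth and maximum swing angle:

# Hedged sketch: estimate the tabletop slot length for an acrylic arm that
# pivots about a point below the false tabletop.  Illustrative values only.
import math

def slot_length(pivot_depth_mm, max_tilt_deg, arm_thickness_mm, clearance_mm=1.0):
    """Slot length so an arm tilting up to max_tilt_deg either way about a
    pivot pivot_depth_mm below the surface never binds in the cutout."""
    sweep = 2.0 * pivot_depth_mm * math.tan(math.radians(max_tilt_deg))
    return sweep + arm_thickness_mm + 2.0 * clearance_mm

# Example with made-up values: pivot 60 mm below the top, +/-20 degree swing,
# 6 mm acrylic arm -> about 51.7 mm of slot.
print(round(slot_length(60, 20, 6), 1), "mm")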

We laser-cut the entire assembly, excluding the Lazy Susan, out of 6 mm plywood and assembled it with wood glue. This made the tabletop sturdy enough to support the food and decorations we placed on top for the performance.

The chucks that held the fork and knife were designed and 3D printed with a tight fit to the irregular geometry of the utensil ends. This worked well: we did not need to glue the utensils into the chucks because the tolerance was small enough to hold them by friction alone. The chalice holder was designed with looser tolerances, which allowed the chalice to actually tip at the top of its arc. This was an unintended consequence of the looser fit, but it looked more realistic and further animated the robot's behavior.

From left to right: Knife Chuck, Fork Chuck and Chalice Holder CAD images, all 3D printed

Electrical Design and Control System:

We used a Raspberry Pi Pico as the microcontroller. Two DC gear motors move the fork and knife, a strong worm-gear motor that Matt reused from a previous project lifts the chalice, and a stepper motor turns the Lazy Susan. The Pico drives the motors through motor driver boards, controlling each motor's speed, direction, and run time (or step count, in the case of the stepper). The code randomly chooses whether to move the fork, the knife, or both; it moves the chosen utensil(s) back and forth across the plate a few times, then pauses for a few seconds, in a loop. Two foot pedals act as switches that trigger pre-programmed movements: one raises the chalice as if someone is drinking from it, and the other turns the Lazy Susan half a rotation. 12 V and 5 V wall adapters power the motors and the Pico.

Performance:

To set the stage, we covered the false tabletop with a tablecloth. This hid everything and polished the final look of the project; many audience members did not even notice the foot pedals under the table. The table was set with plates, napkins, salt and pepper, candles, and real food that the performer ate and drank during the performance. This, along with performing without speaking to the robot, upheld the immersion in the scene.

During the performance there were no unexpected errors or problems: the robot ran exactly as programmed and held up for the entire show. We believe the level of polish of our assembly helped convey the messages we were trying to get across. While it was obvious to many audience members that the performer was lonely, the idea of not being able to sit alone with one's thoughts did not come across until it was explained. With more time, we could better convey this point by having the robot break mid-performance, with the performer frantically trying to fix it or growing angry because he cannot handle eating without his partner. While it was difficult to stay in character for the full performance, it was rewarding to see people's reactions, and we were happy that people enjoyed the show.

Performance Video

Video of Performance and each component of our robot
Another video shot by Professor Garth Zeglin of our Performance

Citations

Example code from the Creative Kinetic Systems website was used as a starting point for our code: https://courses.ideate.cmu.edu/16-223/f2022/text/code/index.html

The Lazy Susan and the associated CAD files were created by Professor Garth Zeglin

Source Code

# DRV8833 carrier board: https://www.pololu.com/product/2130

################################################################
# CircuitPython module documentation:
# time    https://circuitpython.readthedocs.io/en/latest/shared-bindings/time/index.html
# math    https://circuitpython.readthedocs.io/en/latest/shared-bindings/math/index.html
# board   https://circuitpython.readthedocs.io/en/latest/shared-bindings/board/index.html
# pwmio   https://circuitpython.readthedocs.io/en/latest/shared-bindings/pwmio/index.html

################################################################################
# print a banner as reminder of what code is loaded
print("Starting script.")

# load standard Python modules
import math, time

# load the CircuitPython hardware definition module for pin definitions
import board

# load the CircuitPython pulse-width-modulation module for driving hardware
import pwmio

from digitalio import DigitalInOut, Direction, Pull
import digitalio
import random as r

#--------------------------------------------------------------------------------
# Class to represent a single dual H-bridge driver.

class DRV8833():
    def __init__(self, AIN1=board.GP18, AIN2=board.GP19, BIN2=board.GP20, BIN1=board.GP21, pwm_rate=20000):
        # Create a pair of PWMOut objects for each motor channel.
        self.ain1 = pwmio.PWMOut(AIN1, duty_cycle=0, frequency=pwm_rate)
        self.ain2 = pwmio.PWMOut(AIN2, duty_cycle=0, frequency=pwm_rate)

        self.bin1 = pwmio.PWMOut(BIN1, duty_cycle=0, frequency=pwm_rate)
        self.bin2 = pwmio.PWMOut(BIN2, duty_cycle=0, frequency=pwm_rate)

    def write(self, channel, rate):
        """Set the speed and direction on a single motor channel.

        :param channel:  0 for motor A, 1 for motor B
        :param rate: modulation value between -1.0 and 1.0, full reverse to full forward."""

        # convert the rate into a 16-bit fixed point integer
        pwm = min(max(int(2**16 * abs(rate)), 0), 65535)

        if channel == 0:
            if rate < 0:
                self.ain1.duty_cycle = pwm
                self.ain2.duty_cycle = 0
            else:
                self.ain1.duty_cycle = 0
                self.ain2.duty_cycle = pwm
        else:
            if rate < 0:
                self.bin1.duty_cycle = pwm
                self.bin2.duty_cycle = 0
            else:
                self.bin1.duty_cycle = 0
                self.bin2.duty_cycle = pwm


#--------------------------------------------------------------------------------
# Create an object to represent a dual motor driver.
print("Creating driver object.")
driver = DRV8833()


class A4988:
    def __init__(self, DIR=board.GP16, STEP=board.GP17):
        """This class represents an A4988 stepper motor driver.  It uses two output pins
        for direction and step control signals."""

        self._dir  = digitalio.DigitalInOut(DIR)
        self._step = digitalio.DigitalInOut(STEP)

        self._dir.direction  = digitalio.Direction.OUTPUT
        self._step.direction = digitalio.Direction.OUTPUT

        self._dir.value = False
        self._step.value = False

    def step(self, forward=True):
        """Emit one step pulse, with an optional direction flag."""
        self._dir.value = forward

        # Create a short pulse on the step pin.  Note that CircuitPython is slow
        # enough that normal execution delay is sufficient without actually
        # sleeping.
        self._step.value = True
        # time.sleep(1e-6)
        self._step.value = False

    def move_sync(self, steps, speed=1000.0):
        """Move the stepper motor the signed number of steps forward or backward at the
        speed specified in steps per second.  N.B. this function will not return
        until the move is done, so it is not compatible with asynchronous event
        loops.
        """


        self._dir.value = (steps >= 0)
        #print(self._dir.value)
        time_per_step = 1.0 / speed
        #print(time_per_step)
        for count in range(abs(steps)):
            self._step.value = True
            # time.sleep(1e-6)
            self._step.value = False
            time.sleep(time_per_step)

    def deinit(self):
        """Manage resource release as part of object lifecycle."""
        self._dir.deinit()
        self._step.deinit()
        self._dir  = None
        self._step = None

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # Automatically deinitializes the hardware when exiting a context.
        self.deinit()

#--------------------------------------------------------------------------------
# Stepper motor motion primitives for turning the Lazy Susan.

def turn_half():
    speed = 500
    stepper.move_sync(1725, speed)

def turn_third():
    speed = 500
    stepper.move_sync(575, speed)



#--------------------------------------------------------------------------------
# Begin the main processing loop.  This is structured as a looping script, since
# each movement primitive 'blocks', i.e. doesn't return until the action is
# finished.

def move_knife():
    knifeUp = 0.8
    knifeDown = 0.8
    sleepUp = 0.15
    sleepDown = 0.15

    driver.write(0, knifeUp)
    time.sleep(sleepUp)

    driver.write(0, 0.55)
    time.sleep(0.2)

    driver.write(0, -1*knifeDown)
    time.sleep(sleepDown)

    driver.write(0, 0.55)
    time.sleep(0.2)

def move_fork():
    speedUp = 0.8
    speedDown = 0.75
    sleepUp = 0.13
    sleepDown = 0.13

    driver.write(1, speedUp)
    time.sleep(sleepUp)

    driver.write(1, 0.3)
    time.sleep(0.2)

    driver.write(1, -1*speedDown)
    time.sleep(sleepDown)

    driver.write(1, 0.4)
    time.sleep(0.2)

def move_fork_and_knife():
    knifeUp = 0.8
    knifeDown = 0.8
    forkUp = 0.8
    forkDown = 0.85
    sleepUp = 0.13
    sleepDown = 0.13

    driver.write(0, knifeUp)
    driver.write(1, forkUp)
    time.sleep(sleepUp)

    driver.write(0, 0.55)
    driver.write(1, 0.3)
    time.sleep(0.2)

    driver.write(0, -1*knifeDown)
    driver.write(1, -1*forkDown)
    time.sleep(sleepDown)

    driver.write(0, 0.55)
    driver.write(1, 0.55)
    time.sleep(0.2)


# Pin assignments for the worm-gear motor driver that lifts the chalice:
#   Pico pin 4, GPI34  -> PMW
#   Pico pin 5, GPIO3  -> IN_A
#   Pico pin 6, GPIO4  -> IN_B
IN_A = DigitalInOut(board.GP3)
IN_A.direction = Direction.OUTPUT
IN_B = DigitalInOut(board.GP4)
IN_B.direction = Direction.OUTPUT
PMW = pwmio.PWMOut(board.A2)

def motor_stop():
    IN_A.value = False
    IN_B.value = False
    PMW.duty_cycle = 0

arm_pmw = 20000

def motor_cw():
    IN_A.value = True
    IN_B.value = False
    PMW.duty_cycle = arm_pmw

def motor_ccw():
    IN_A.value = False
    IN_B.value = True
    PMW.duty_cycle = arm_pmw


led = DigitalInOut(board.LED)  # GP25
led.direction = Direction.OUTPUT


switch = DigitalInOut(board.GP15)
switch.direction = Direction.INPUT
switch2 = DigitalInOut(board.GP14)
switch2.direction = Direction.INPUT

time.sleep(3.0)
print("Starting main script.")
i = 0
j = 0
state_index = False
state_index2 = False
stepper = A4988()
fork_knife_move = r.randint(0,2)
print(fork_knife_move)
while True:
    if state_index is False:
        if switch.value is True:
            state_index = True
            print("susan On")
            turn_half()


    elif state_index is True:
        if switch.value is False:
            state_index = False
            print("susan Off")



    if state_index2 is False:
        if switch2.value is True:
            state_index2 = True
            print("arm On")
            motor_ccw()
            time.sleep(1)
            motor_stop()
            time.sleep(0.6)
            motor_cw()
            time.sleep(0.66)
            motor_stop()


    elif state_index2 is True:
        if switch2.value is False:
            state_index2 = False
            print("arm Off")

    # Fork/knife scheduling: run the randomly chosen utensil motion 10 times
    # (counter i), then idle for 50 * 0.1 s = about 5 s (counter j), then pick
    # a new random behavior and repeat.
    if i == 10 and j == 50:
        i = 0
        j=0
        fork_knife_move = r.randint(0,2)
    elif i == 10:
        time.sleep(0.1)
        j = j+1
    else:
        if fork_knife_move == 0:
            move_fork()
        elif fork_knife_move == 1:
            move_knife()
        else:
            move_fork_and_knife()
        i=i+1

Mechanical CAD Files

The file size of our CAD assembly was too big to upload to the Media Library. Please use the link below to find our FinalAssy.zip uploaded to Google Drive:

https://drive.google.com/file/d/16lPWVSlWRBp4n3A_TnQfvgPp6M4_eqgr/view?usp=share_link

Miscellaneous CAD files that were separate from our tabletop assembly, including the fork and knife chucks and the chalice components, are attached here:

Photographs

Tabletop set with food, plates and other props used during the performance
Fork and Knife that are actuated by automated movements
Chalice being lifted up by the worm gear motor
Close up of the Lazy Susan that can be actuated with a motor or by the performer's hand
Final Set up before the performance

Project Contributions

Matt: Mechanical design of the false tabletop, design of the fork, knife, and chalice holders, performance, tuning PID for the Webots simulation

Karen: Electrical design, project code, debugging, motor mount designs, design of Webots simulation.

Both: Staging the scene (food, tablecloth, etc.), developing the narrative, and assembly

Project Outcome: Kate and Doris
https://courses.ideate.cmu.edu/16-375/f2022/2022/12/10/project-outcome-kate-and-doris/
Sat, 10 Dec 2022

The Dissatisfied Artist

by Kate and Doris

Project Introduction:

The initial project description read as follows:

The overall idea is to create a drawing robot that makes drawings, then crumples each one up after being unsatisfied with the result. There is a human performer element to the piece: the performer walks up to the robot, un-crumples the drawings, displays them as "awesome robot art" (trying to be supportive of the robot), and replaces the paper for the robot to try again. The drawings follow a set order that creates a narrative of the robot trying to draw better and eventually producing a drawing it doesn't crumple up. The audience can come up to the piece at any point in the process and see the existing drawings and the one the robot is currently working on, getting a sense of progress without watching the whole performance. Of course, the final drawing the robot does not crush would only be visible to the people there at the end.

Reflection:

Drawing robots are nothing new or especially unique, but with a performer present and some added automated behaviors, the robot suddenly takes on more of a life of its own. The piece was entirely automated, with pauses coded in to let the human performer interact with it and change the paper so it could draw the next thing. With enough work the entire piece could have been automated to replace its own paper and discard the crushed sheets, but a significant part of the narrative would have been lost: the focus of the piece is creating a "relatable robot" in which observers could see a piece of themselves reflected.

How Things Changed:

For the most part the show remained what we initially intended, except that we themed it as a "drawing class for robots," with the drawing robot as the student we were teaching.

Observations:

Kate:

The most interesting observation during the show was how people almost universally gendered the robot as "he." A few people used "she," but nobody referred to the robot as "it" or "they." I found this incredibly interesting: it shows not only that the audience related to the robot enough to refer to it as a person, but also that everyone seemed to assume it was a "he." I am not entirely sure why this happened, but it was interesting to observe. Another observation was that the people who spent the most time with the piece were most interested in figuring out what the robot was drawing, almost like a game show where the drawing becomes clearer as time goes on. The other common reaction was that people felt bad for the robot and related to the struggle of trying to make art and not being satisfied with the work. I also really enjoyed interacting with people "in character" for the piece and giving away the robot's drawings to people who wanted them.

Doris:

During the show I somehow kept thinking of the robot as a very intelligent groundhog, maybe because of the brown hair ball and the very notable presence of teeth. I also feel there are other opportunities to make it a work that interacts directly with the audience: for example, whenever the audience compliments the robot on its drawing, the performer controlling it could decide that the robot is being difficult, feels the drawing isn't good enough, and crumples it up; and if the audience remarks that the drawing is very slow or very meticulous, the robot could feel glad that its effort is being noticed and keep the drawing without crumpling it up.

Photos:

The full installation
Close up of just the robot
Top view of the table
One of the “hands” of the robot that did the crushing (not really visible in the video)

Video of Project:

A brief view of what the piece looked like when running.

Presentation:

https://docs.google.com/presentation/d/1NRUFueCWiSbjRdvitCi8xOdrFIdTJTflioAGe-L_EnY/edit#slide=id.p

CAD and Code:

Contributions:

We worked together as much as possible to develop the artistic aspects of the piece.

Kate: Python code for interpreting G-code for the robot arm to get it to draw (using an existing library to turn images into reasonable G-code; a hedged sketch of such an interpreter appears below), Arduino code to run the motors at the correct times, and mechanical design of the roller and enclosure for the crushing mechanism.

Doris: foraged for the pool noodle; made crude sketches for the crushing mechanism; created DXF files in SolidWorks for the crush arm and gears with help from Kate; iterated designs for the crush arm and hand; and added the hand box to make the crushing mechanism viable (in an easier way).
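
A minimal, hedged sketch of the kind of G-code interpretation described above (illustration only, not the project's actual code; move_arm_to() and pen() are hypothetical stand-ins for the real arm interface):

# Minimal G-code interpreter sketch for a drawing arm.  Illustration only:
# move_arm_to() and pen() are hypothetical helpers, not the project's real API.
def run_gcode(lines, move_arm_to, pen):
    x, y = 0.0, 0.0
    for raw in lines:
        line = raw.split(";")[0].strip()      # drop comments and whitespace
        if not line:
            continue
        words = line.split()
        cmd = words[0].upper()
        args = {w[0].upper(): float(w[1:]) for w in words[1:]}
        if cmd in ("G0", "G00"):              # rapid move: lift the pen
            pen(up=True)
        elif cmd in ("G1", "G01"):            # drawing move: pen down
            pen(up=False)
        else:
            continue                          # ignore everything else
        x = args.get("X", x)
        y = args.get("Y", y)
        move_arm_to(x, y)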

Final Report — Richard + Janice
https://courses.ideate.cmu.edu/16-375/f2022/2022/12/09/final-report-richard-janice/
Fri, 09 Dec 2022

Chewing Robot
Janice Lee, Richard Zhou

https://drive.google.com/file/d/1zxKbpeQ8urv4j4JKt3BOQpGrqWthjGm9/view?usp=sharing

I. Project Objective

Our project focused on the ethical questions surrounding biomimicry and the moral ambiguity of convincingly giving animate qualities to objects that lack life. In our chewing robot, the viewer is confronted with a familiar sight, one person feeding another the way a mother feeds a child, except that the one being fed is not a human form but an assembly of anatomical parts and mechanical linkages. We wanted to heighten the sense of unease in the contradiction between the movements being expressed and their lack of any real purpose for the robot's own needs.

II. Reflection

Throughout the semester, the class covered themes such as the use of autonomous behavior as a narrative medium, the relationship between human and machine in hybrid performance, and the animate possibilities of simple machines.

Regarding autonomous behavior, our chewing robot was controlled with two joysticks connected to an Arduino. One joystick controlled the neck movement while the other opened and closed the jaw. Since a human operator drove the robot, its performance was not entirely autonomous. However, direct control let the operator vary the movement and react differently to situations as they arose.

The performance paired the robot with a human performer who fed it chips, pretzels, and crackers. The performer would offer the robot snacks and ask it questions, and the robot would respond with motion, nodding or shaking its head. Both the human performer and the robot are essential to the performance, and the relationship is well established: the human performer cares for the robot, and the robot depends on the performer.

Lastly, regarding animate possibilities, we believe our chewing robot is full of potential. Although our performance focused mainly on imitating human chewing, the robot is fully capable of general neck and jaw movements. By varying the combination of these movements, it could capture other bio-inspired motions for artistic performances.

Overall, the main goal of the project was to mimic the movement of a human chewing food, and the robot achieved that.

III. Outcomes

Much of our initial conflict came from deciding how to actuate the head while balancing expressiveness with ease of control. Our initial concepts maximized anatomical correctness with four symmetrical linear actuators giving full control of all rotational axes and some secondary translational movement. However, the complexity of fabrication and control ultimately led us to a simpler approach: two linear actuators in the front with a single, centered guide rod along the back. This way we retained both tilt degrees of freedom and kept the sleek, neck-like visual lines across the front while needing only two motors.

We found the chewing motion much easier to deal with than the neck. We mounted the motor inside the cranial cavity itself so that the movement of the mandible was entirely independent of the orientation of the skull. We also powered the closing of the mouth rather than the opening, so we could control how hard the jaw closed. While this was great in theory, during the performance the motor did not have enough torque to truly crush the food in the mouth, and we relied more on the stronger, higher-torque neck motors to tear food than on a simple biting motion. Future versions would benefit from upgrading the motor to support both options.

We found the performance aspect crucial to how the skull read to viewers. We phrased each interaction as if we were talking to a small child, asking yes-or-no questions that were easy for the operator to answer in a natural, conversational manner. The back and forth helped set the context and gave intuitive yet convincing ways of freestyling the performance.

IV. Citations

Initially, during the development and design stage of the project, we were motivated to imitate the biological chewing process of the human jaw. To study the anatomy of the jaw motion, we referred to the following YouTube video:

Jaw open & close muscles (Mandible Elevation & Depression Muscles)

To develop this robot, we also referred to existing robots with similar functionality. For the neck movement in particular, we referred to the mechanism of the Mesmer 2.0 robot.

Mesmer 2.0: Interaction & Quick Release Head

We also studied the mechanism of a pre-existing chewing robot developed by the Korean Food Institute.

New ‘chewing robot’ helps develop softer food for the elderly

V. Technical Documentation

For the technical component of this project, our team wrote Arduino code to run three servo motors: two for the neck movement and one for opening and closing the jaw. The code is described in Section 1, Program Source Code. To improve the stability and positioning of the skull, the robot has a sturdy base designed from a 3D model; the mechanical CAD files are attached in Section 2. Lastly, some of the linkages and connections between the different components are covered in Section 3.

1. Program Source Code

The code revolves around driving three motors with two joysticks. The primary joystick drives the neck while the secondary joystick drives the chewing motion of the jaw. We read both axes of the primary joystick: the vertical axis sets the average tilt of both actuators, tilting the head up and down, while the horizontal axis sets the difference between the two neck actuators, tilting the head left and right. Each motor output is clamped to the structure's range of motion so the operator cannot unintentionally damage the frame. In addition, pressing the joystick all the way down resets the neck to its default position, which we often used when the head ended up in an awkward pose that was hard to recover from. The jaw joystick is much simpler, closing or opening the jaw depending on its position, and it is similarly clamped to the jaw's range so it never applies unnecessary torque to the closing mechanism.
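
A minimal sketch of this mapping logic, written in Python for readability rather than as our actual Arduino sketch; the gains and angle ranges below are illustrative placeholders, not our real calibration:

# Hedged sketch of the joystick-to-servo mapping described above.  The real
# controller is an Arduino sketch; this only illustrates the mixing and
# clamping logic, with made-up servo angle ranges.
def neck_command(jx, jy, stick_pressed,
                 center=90, tilt_gain=30, turn_gain=20, lo=60, hi=120):
    """Map the primary joystick (jx, jy in -1..1) to the two front actuators:
    vertical axis = common tilt, horizontal axis = differential tilt."""
    if stick_pressed:                          # press the stick to reset the neck
        return center, center
    common = center + tilt_gain * jy           # head up/down
    diff = turn_gain * jx                      # head left/right
    left = min(max(common + diff, lo), hi)     # clamp to the frame's safe range
    right = min(max(common - diff, lo), hi)
    return left, right

def jaw_command(jy, opened=100, closed=40):
    """Map the secondary joystick's vertical axis to the jaw servo, clamped so
    it never over-drives the closing linkage."""
    angle = opened + (closed - opened) * (jy + 1) / 2
    return min(max(angle, closed), opened)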

2. Mechanical CAD Files

The base component that holds up the skull was made by laser-cutting quarter-inch clear acrylic. All 3D modeling was done in SolidWorks, and the base assembly is shown below:

The original CAD files are saved as 'skull_model_base.zip'; the same folder also includes .dxf files for the final laser-cut parts.

3. Photographs of Specific Elements

VI. Individual Contributions

Design
– Janice: 3D modeled draft_design_1; redesigned draft design 2 to improve sturdiness; 3D modeled draft_design_2
– Richard: designed and drafted the initial design structure and dimensions (draft_design_1); created a board for the overall visual direction of the robot; designed the linkage system for neck actuation and the chewing mechanism

Mechanical Structure
– Janice: laser-cut both draft designs; assembled and attached the laser-cut parts
– Richard: assembled the servo motors to the base and skull; assembled the linkages and mounts for neck movement

Electrical
– Janice: initial breadboarding for the servo motors with the power supply; adjusted the joystick wiring for longer wire
– Richard: finalized wiring for the joystick and the three motors for size and ease of control; debugged the servo motor power supply issue

Programming
– Janice: initial Arduino sketch for controlling servo motors using a joystick
– Richard: revised the Arduino code to handle the additional neck and jaw motors and two joystick inputs, and to improve ease of control of the neck movement
Final Report – Rebecca & Lauren
https://courses.ideate.cmu.edu/16-375/f2022/2022/12/08/final-report-rebecca-lauren/
Thu, 08 Dec 2022

Title of Work: Percussion Plant

Project Statement: The purpose of the percussion plant is to represent and simulate the relationship between humans and plants in a livelier, faster-paced manner. Normally the interaction between a plant and a person develops over months, as the person gives the plant water, light, and care and the plant slowly grows and moves. The percussion plant instead responds immediately, shaking and twisting at sounds created by a variety of instruments. Its leaves are modeled on the prayer plant's leaves, because a prayer plant responds to light by opening up toward it; the percussion plant is meant to act similarly in response to sound.

Relationship of Project to course themes: Our project used programmed autonomous behaviors that were played back according to choices the performer made. The plant robot used three stepper motors, each rotating a module with free-spinning leaves on the ends. This simple motion created surprisingly rich visual movement and made the object feel more animate. The piece shows how robotics and simple movements can give a piece an entirely different meaning. Our plant on its own was already visually interesting, but the motion lets viewers see a relationship between our actions and the plant's actions. Just as the prayer plant responds to light and all plants respond to care, our robot plant responds to percussion.

Front view of our setup

Implementation: To implement our design, we angled the branches so that the leaves rotate around different axes. We laser-cut the leaf frames from wood because 3D printing them was too time-consuming. Attaching fabric and adding color to the frames made it clear that our robot is a plant. We also thought about how to prompt the audience to make sound, and decided that placing percussion toys under the robot would invite them to play and see what happens. For the setup, we angled the booth walls to partially hide the performers in the back.

Robot & Performer
Behind the booth (electronics, stepper motor mount)

Outcomes: The outcome of our project was a visually striking piece that turned out to be surprisingly interactive. At the show, viewers' first instinct was to admire the plant and comment on its interesting textures, but once they experimented with one of the percussion toys, they found that it also moved. Viewers liked to try different rhythms and instruments to find the pattern in the movement. As a performer it was fun to subvert this expectation of pattern: following a consistent mapping for consistent sounds, but changing it up if things became repetitive. The intent was to make the plant appear more alive than expected. One thing that could have been improved was the variety of movements from the plant; on show day, only a small number of behaviors were programmed into the board for the performer to use, and this could be expanded to make the plant even more responsive to different sounds. The way we programmed the plant motions also caused some unexpected behavior: if buttons were pressed too many times before the blocking movements completed, the motions became very delayed, which made it difficult to time the right movement to the sound. Although the program could be refined, the final show still produced pleasant reactions and behaviors from our robot.
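
One hedged sketch of how the queued-button delay could be reduced (the gestures, pad numbers, read_button(), and set_targets() below are hypothetical placeholders, not the MIDI framework we actually used): treat each behavior as a step generator and let a new press replace the current gesture instead of queueing behind a blocking move.

# Hedged sketch: non-blocking gesture playback so a new button press replaces
# the current movement instead of queueing behind it.  read_button() and
# set_targets() are hypothetical stand-ins for the real MIDI input and
# stepper output used in the project.
import time

def sway(cycles=4, amplitude=200):
    """Yield successive stepper targets for a slow back-and-forth sway."""
    for _ in range(cycles):
        yield (amplitude, -amplitude, amplitude)
        yield (-amplitude, amplitude, -amplitude)

def shake(cycles=8, amplitude=80):
    """Yield targets for a quick, small shake."""
    for _ in range(cycles):
        yield (amplitude, amplitude, amplitude)
        yield (-amplitude, -amplitude, -amplitude)

GESTURES = {1: sway, 2: shake}               # hypothetical pad-number mapping

def perform(read_button, set_targets, gestures=GESTURES):
    """Poll for pad presses; a new press interrupts the current gesture."""
    current = None
    while True:
        pad = read_button()                  # pad number or None (hypothetical)
        if pad in gestures:
            current = gestures[pad]()        # restart with the new gesture
        if current is not None:
            try:
                set_targets(next(current))   # emit one short step per pass
            except StopIteration:
                current = None
        time.sleep(0.05)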

Having a main performer and a side performer was also interesting. The main performer, Lauren, settled on a pattern of which movement to play for different sounds shortly after the show started. When the side performer, Rebecca, and Lauren were together behind the booth, we usually agreed on which movement best suited the sound. In one case, however, Lauren heard a sound as melodic whereas Rebecca thought it was chaotic and loud. Once Lauren said she was thinking of playing the constant movement because she heard the sound as melodic, Rebecca could hear it too and agreed to make the robot move constantly.

The audience's behavior before and after finding out that the robot was controlled by a human performer was very different. After they discovered we were controlling the robot from behind the booth, some would play the instruments while looking at us. It was funny to watch people realize that there had been someone behind the booth listening to them the whole time.

Performance Video:

Citations:

Stepper motor drive system: https://courses.ideate.cmu.edu/16-375/f2022/ref/text/hardware/cnc-shield.html#id9

Arduino Firmware: https://courses.ideate.cmu.edu/16-375/f2022/ref/text/code/StepperWinch.html#stepperwinch

Base python code edited for project: https://courses.ideate.cmu.edu/16-375/f2022/ref/text/code/suitcase-scripting.html#suitcase-midi-demo-py

Supporting technical documentation:

Lauren’s contributions: Created sewn leaves. Set up electronics and programmed the plant robot.

Rebecca's contributions: Designed and created the wood and ABS leaf frames and attached the fabric. Designed and laser-cut the branches and stepper motor mount.

Prototype Progress | Karen and Matt
https://courses.ideate.cmu.edu/16-375/f2022/2022/11/29/prototype-progress-karen-and-matt/
Tue, 29 Nov 2022
Prototype Progress: Kate and Doris (as of 11/22)
https://courses.ideate.cmu.edu/16-375/f2022/2022/11/29/prototype-progress-kate-and-doris-as-of-11-22/
Tue, 29 Nov 2022

Forgot to post! Also, I (Doris) didn't go to the Saturday work session, so progress has slowed.

Prototype Progress – Lauren and Rebecca
https://courses.ideate.cmu.edu/16-375/f2022/2022/11/22/prototype-progress-lauren-and-rebecca/
Tue, 22 Nov 2022

We are still working through our programming, so we don't have any new movements yet, although the MIDI board is working now.

Prototype | Karen & Matt
https://courses.ideate.cmu.edu/16-375/f2022/2022/11/22/prototype-karen-matt/
Tue, 22 Nov 2022
Proof of Concept 2 | Karen and Matt
https://courses.ideate.cmu.edu/16-375/f2022/2022/11/15/proof-of-concept-2-karen-and-matt/
Tue, 15 Nov 2022
Proof of Concept 2 – Lauren and Rebecca
https://courses.ideate.cmu.edu/16-375/f2022/2022/11/15/proof-of-concept-2-lauren-and-rebecca/
Tue, 15 Nov 2022