RCP Work – Robotics for Creative Practice: Student Work
https://courses.ideate.cmu.edu/16-375/f2020/work

Final Documentation: Haptic Mobility
https://courses.ideate.cmu.edu/16-375/f2020/work/2020/12/14/final-documentation-haptic-mobility/
Tue, 15 Dec 2020
Joey Wood & Sebastian Carpenter

Statement of Narrative

Haptic Mobility is a shared tactile experience. Two participants take remote control of a small bug-like robot with limited mobility. Through their controllers, the participants can feel the forces acting on the robot’s legs. By virtue of this feedback mechanism, users co-inhabit the body of the machine as they struggle to crawl up and down obstacles in the environment.

Reflection

This project strayed a bit from the explicit framework of using autonomous behavior as a narrative medium; it instead drifted into the territory of using a definitively non-autonomous robot to create an experience of shared control and a telematic extension of the users’ bodies. Rather than being a piece of art that is merely observed, Haptic Mobility as an installation requires participation in order to be fully appreciated. Like all remote-controlled vessels, this robot extends the users’ range of mobility in a way completely divorced from their physical bodies. Where this experience becomes somewhat novel, though, is in the direct physical feedback returned to the control joysticks and in the piece’s encouragement of collaborative control. Not only can users feel what the robot feels, turning it into a true extension of their bodies, but because each user’s joystick controls only one leg, they must communicate and develop a shared rhythm in order to make the robot follow any sort of predictable path.

Outcomes

Despite being physically separated by mid-November, we managed to build a robot that functioned. It was able to drive and be controlled in the manner we intended, though not necessarily to the degree of fidelity we had hoped. A number of aesthetic sacrifices were made in the interest of time and resources (notably the lack of a housing for the controller, and the slightly ill-fitting shells that had to be fastened with tape rather than properly attached), and the robot likely would have benefitted from more tuning on the controls side to improve the experience. Another casualty of our logistical constraints was our ability to stage an actual physical installation that other people could experience. If time allows over the next couple of months, we may revisit this project to make improvements or set up an actual installation.

Video

Technical Documentation

CAD rendering of the mobile robot.
CAD rendering of the controller (concept; the controller was never completed).

Mechanical CAD & Software Files:

https://drive.google.com/drive/folders/1FTkAokVrruO1IYQU8XG63WgJ1ay5KRsM?usp=sharing

Contributions

Joey: Mechanical design of robot and robot drive mechanism, fabrication (machining, lasercutting, 3D printing) of mobile robot, electronics architecture & assembly, software development and testing of bilateral teleoperation control.

Sebastian: design and fabrication (making plaster form and vacuum forming plastic) of outer shells for robot, 3D-printing robot legs in nylon, designing controller housing (ultimately never fabricated due to time and logistical constraints), creating renderings of robot and controller, editing final video.

Citations

[1] Katz, Benjamin (2018). A Low Cost Modular Actuator for Dynamic Robots. Massachusetts Institute of Technology, Cambridge, Massachusetts.

[2] K. Goldberg, J. Santarromana, G. Bekey, S. Gentner, R. Morris, C. Sutter, and J. Wiegley (1995). The Telegarden. University of Southern California, Los Angeles, California.

Empathy Machine Final Report
https://courses.ideate.cmu.edu/16-375/f2020/work/2020/12/13/empathy-machine-final-report/
Sun, 13 Dec 2020
By Jordi Long and Zhecui Zhang (Doris)

Concept

The specific feeling that we are trying to evoke is inter-species compassion. We are attempting to create empathy in the viewer toward the robot in different scenarios.

Performance Objective

The robot needs the human’s help to fulfill its own purpose. The mannerisms and actions of the robot when presented with different challenges in the scene will hopefully make the viewer want to help and protect the robot.

Design

Since we are trying to create a helpless robot that needs human attention and help, we decided to give the robot a sad and cute facial expression. The self-balancing robot car kit we purchased online features clumsy movements, which is in line with our design concept: a robot that is not super capable of doing things. To make it easy for a human to help the robot stand up, a headphone is added as a handle for people to grab. The robot has three degrees of freedom for dynamic movement, made possible by the wiggle eyes, the rotating arm, and the self-balancing car base.

Fabrication

Coding

  • Arduino IDE: code from the robot’s developer
  • Python remote-control interface, which allows us to:
    • Alter the gains of the robot’s balance PID controller
    • Drive the robot with the keyboard, like a remote-control car
    • Act out different scenarios without writing hundreds of lines of code
  • The remote-control aspect of our robot gives it a sense of personality, as it makes mistakes and falls over often. (A minimal sketch of such an interface follows after this list.)
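
As a rough illustration of what that Python interface might look like, here is a minimal sketch assuming the Arduino firmware parses newline-terminated text commands over a serial link; the port name, baud rate, and command strings ("FWD", "PID kp ki kd", etc.) are hypothetical stand-ins, not the actual protocol:

```python
# Minimal keyboard teleop sketch for the self-balancing robot.
# Assumes the Arduino firmware parses newline-terminated text commands
# over serial; the port name and command strings are hypothetical.
import serial  # pyserial

PORT = "/dev/ttyUSB0"  # adjust for your machine
BAUD = 115200

KEY_COMMANDS = {
    "w": "FWD",    # drive forward
    "s": "BACK",   # drive backward
    "a": "LEFT",   # turn left
    "d": "RIGHT",  # turn right
    "x": "STOP",   # stop in place
}

def main():
    with serial.Serial(PORT, BAUD, timeout=0.1) as link:
        print("w/a/s/d to drive, x to stop, p to retune PID, q to quit")
        while True:
            key = input("> ").strip().lower()
            if key == "q":
                break
            if key == "p":
                # Retune the balance PID on the fly; the "PID kp ki kd"
                # message format is an assumed firmware convention.
                kp, ki, kd = input("kp ki kd: ").split()
                link.write(f"PID {kp} {ki} {kd}\n".encode())
            elif key in KEY_COMMANDS:
                link.write((KEY_COMMANDS[key] + "\n").encode())

if __name__ == "__main__":
    main()
```

A line-based loop like this is enough to puppeteer scenarios live; capturing raw single keystrokes would need something like the curses module.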

Scenarios

  1. Bump into foam blocks
  2. Bump into futon
  3. Walk around a plant
  4. Walk through a stool
  5. Transition between timber flooring and carpet

Evaluation

It was successful with the two people we tested it on; however, both are graduate students from the ETC program, which means they are familiar with human-robot interaction and interactive devices in general. One of them, the male participant, had worked on a much more advanced robot project with a Boston Dynamics robot, which is probably why he was so responsive to the robot we built (see the video documentation). We should therefore get people from different age groups and more diverse backgrounds to react to it.

From the human-robot interaction in the documentation, we can see that it would be more interesting if the robot were more responsive: for example, if it moved its head when the person says hi, or stared at the participant while moving around to attract more attention.

Citations

  • https://ieeexplore.ieee.org/abstract/document/677408
  • http://simonpenny.net/works/petitmal.html
  • https://www.youtube.com/watch?v=ntlI-pDUxPE
  • https://www.youtube.com/watch?v=_wc9AJ5FuWY
  • https://www.youtube.com/watch?v=UIipbi0cAVE

Contribution

Jordi: Coding + Driving + Video Editing

Doris: Design + Fabrication + Video Recording

Documentation

***Our video documentation is quite long because the male participant was very responsive to the robot, and the interaction was so interesting that we did not want to cut the interaction and interview short.

***The documentation of the male participant is very blurry because we forgot to check the camera’s focus before recording, and we felt it would not be authentic to re-record it.

Piano Tutor – Final
https://courses.ideate.cmu.edu/16-375/f2020/work/2020/12/11/piano-tutor-final/
Sat, 12 Dec 2020

Final Product – Simulation

Final Product – Real

PROJECT FILE:

PROJECT STATEMENT:

We set out to create a performance that told the story of the mentor-mentee dynamic. We wanted to use two robots for the performance and have them act out a piano lesson between a student and teacher. We focused on the play-and-repeat form of a piano lesson. We wanted to show emotions, such as the frustration of the tutor when the student plays the wrong key, or the hesitation of an unsure student. We also wanted to keep the robots simple, to put the emphasis on the performance rather than the technical abilities of the robots.

REFLECTION ON COURSE THEME:

The central focus of this project revolves around the questions “How do we perceive machines as characters?” and “What is the minimum requirement for a machine to be perceived as having personality?” In this project, both the piano tutor and the student are deprived of any sort of anthropomorphization; they are faceless machines with limited capabilities, so any form of character must be derived from their actions and circumstance. Specifically, it is the call-and-response dynamic between the two robots that determines their character: the Student’s responses can be categorized as obedient or disobedient, and the Tutor’s responses vary from angrily slamming on keys to calmly playing the next note.
The purpose of the project is not so much to tell a story as to showcase a relationship. As a passive viewer, the observer gets to watch a dynamic unfold, one that they themselves might be familiar with, and characterize each robot from their own personal experiences and recognition of common themes.

OUTCOMES:

The physical project itself has flaws in its performance: each robot does not always hit the key it is programmed to hit, and the calibration strays the longer the program runs. However, when it performs its actions correctly, the performance as a whole gives the audience some sense of the tutor-student dynamic. The choice to make the performance somewhat autonomous and randomized subtracts from the piece, as it makes the actions of both robots confusing when no set pattern is established and the viewers do not know what to expect from either performer. As a learning experience, while the idea and vision of the project were clear, actually creating the machine from scratch made many of the details hard to implement, and as a result some of the artistic vision was lost. Overall, simplifying certain ideas and concepts was useful in getting the machine to run, but should have been better compensated for to restore the artistry of the piece.

Piano Tutor Simulation Video
https://courses.ideate.cmu.edu/16-375/f2020/work/2020/11/12/piano-tutor-animation-compilation/
Thu, 12 Nov 2020

Final simulation video for the Piano Tutor.

Piano Tutor sim 4.1
https://courses.ideate.cmu.edu/16-375/f2020/work/2020/11/01/piano-tutor-sim-4-1/
Mon, 02 Nov 2020

Updated simulator for the Piano Tutor. This version of the simulation more accurately represents how the final product will be structured.

Shared Tactility Phase 3 Simulation
https://courses.ideate.cmu.edu/16-375/f2020/work/2020/10/29/shared-tactility-phase-3-simulation/
Thu, 29 Oct 2020

This week we put together some Webots simulations of the robot. Each leg is manually controlled by keyboard input.
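
For reference, a minimal sketch of what per-leg keyboard control can look like in a Webots Python controller; the device names and key bindings below are hypothetical placeholders, not our actual controller code:

```python
# Minimal Webots controller sketch: drive two leg motors from the keyboard.
# Device names and key bindings are hypothetical placeholders.
from controller import Robot

robot = Robot()
timestep = int(robot.getBasicTimeStep())

keyboard = robot.getKeyboard()
keyboard.enable(timestep)

# getDevice() works in recent Webots releases; older ones use getMotor().
left_leg = robot.getDevice("left_leg_motor")
right_leg = robot.getDevice("right_leg_motor")

# Run the motors in velocity mode: target position = infinity,
# then command angular velocity directly.
for motor in (left_leg, right_leg):
    motor.setPosition(float("inf"))
    motor.setVelocity(0.0)

CRANK_SPEED = 3.0  # rad/s while a key is held

while robot.step(timestep) != -1:
    left_leg.setVelocity(0.0)
    right_leg.setVelocity(0.0)
    key = keyboard.getKey()
    while key != -1:  # several keys may be held in the same step
        if key == ord("A"):
            left_leg.setVelocity(CRANK_SPEED)
        elif key == ord("Z"):
            left_leg.setVelocity(-CRANK_SPEED)
        elif key == ord("K"):
            right_leg.setVelocity(CRANK_SPEED)
        elif key == ord("M"):
            right_leg.setVelocity(-CRANK_SPEED)
        key = keyboard.getKey()
```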

Also, I spent a little time in the shop this week making a leg prototype. Not an extensive prototype, just something to get a feel for the aesthetic, scale, and stiffness of the robot leg.

Joey & Sebastian Shared Tactility Conceptual Statement
https://courses.ideate.cmu.edu/16-375/f2020/work/2020/10/20/joey-sebastian-shared-tactility-conceptual-statement/
Tue, 20 Oct 2020
1. Conceptual Statement

Our piece invites two viewers to work together to teleoperate a helpless two-legged crawling robot traversing a children’s sandbox. Together they inhabit its body: with each operator controlling a single one of the robot’s legs, the robot can only march forward if the two operators synchronize their motions. Each operator’s controller communicates the forces the robot is feeling back to the operator. If the leg is dragging through the viscous sand, the operator will feel the resistance. If the leg is swinging freely through the air, the operator will feel that too. If the two operators are successful in interpreting these motions and forces, they will create a single, unified nervous system for this little machine.

2. Additional Notes

Sand environment:

We believe the choice of a sandbox for the robot’s environment will create a rich sense of tactile feedback for each operator. As the robot goes through its gait cycle, the operator will feel a range of forces, from free, low-resistance swinging through the air, to hard “slapping” contacts against the ground, to slow, viscous, high-resistance dragging through the sand.

3. Concept Sketches

Several robot forms rendered in SOLIDWORKS.

Different controller designs based on common interactions. From left to right: a controller based on an analog control knob, a controller based on the winding mechanism of a fishing rod, and a ground-mounted controller based on the cranks found on machining equipment.

Two people driving the robot up a hill in a sandbox. Note that the handle on one of the controllers should be red, not blue.

4. Technical Resources

For this project there is a wealth of resources on hobbyist sites for taking cheap, widely available brushless DC (BLDC) motors and adding high-resolution encoding and current sensing in order to do the force control we want.

Motors:

Due to time constraints we are limited to what is quickly available on Amazon/eBay. Electric skateboard motors meet the low-speed, high-torque requirements we have and are widely available. We would like to start with a motor that has an abundance of torque, so that we never run into stalling issues; for prototypes like this, it is always easier to lower the torque in software than to buy all new motors with extra torque. One candidate is shown below. (Note: the motor as-is is sensorless, so we would need to couple it to an encoder to make it work.)

Motor Control:

https://odriverobotics.com/ has a suitable motor controller that ships from the U.S. and provides force-control features. Joey is also familiar with using it, and the manufacturer has good documentation and online support.

Tying it all together:

The ODrive board needs very little supporting hardware, so the only other component we would need is a laptop (communicating with each board over USB) or a microcontroller like a Teensy or Arduino (communicating over UART).
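
As a sketch of the laptop-over-USB option, one crank/leg pair could be coupled as a bilateral (force-reflecting) loop roughly like this, using the ODrive Python package. The axis assignments, stiffness gain, and coupling law are illustrative assumptions (API names follow the ~0.5.x-era firmware), not our final control code:

```python
# Sketch: couple an operator crank (axis0) to a robot leg (axis1) with a
# virtual spring, so resistance on the leg is felt back at the crank.
# Gains and axis assignments are illustrative assumptions.
import time
import odrive
from odrive.enums import (
    AXIS_STATE_CLOSED_LOOP_CONTROL,
    CONTROL_MODE_TORQUE_CONTROL,
)

K_SPRING = 0.5  # N*m per turn of crank/leg position error

odrv = odrive.find_any()  # first ODrive found on USB
crank, leg = odrv.axis0, odrv.axis1

# Put both axes in closed-loop torque control.
for axis in (crank, leg):
    axis.controller.config.control_mode = CONTROL_MODE_TORQUE_CONTROL
    axis.requested_state = AXIS_STATE_CLOSED_LOOP_CONTROL

while True:
    # Position error between the operator's crank and the robot's leg (turns).
    error = crank.encoder.pos_estimate - leg.encoder.pos_estimate
    torque = K_SPRING * error
    leg.controller.input_torque = torque     # pull the leg toward the crank
    crank.controller.input_torque = -torque  # reflect the load back to the hand
    time.sleep(0.001)
```

When the leg drags through sand and lags the crank, the spring torque grows on both sides, which is exactly the resistance the operator should feel.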



5. Pandemic Contingency Plan

If the CMU machine shops close, we can focus on building the joysticks in hardware, using largely 3D-printed components, and on creating a simulation environment in which to control the robot.

Doris and Jordi Project Conceptual Statement
https://courses.ideate.cmu.edu/16-375/f2020/work/2020/10/20/doris-and-jordi-project-conceptual-statement/
Tue, 20 Oct 2020

The specific feeling that we are trying to evoke is inter-species compassion. We are attempting to create empathy in the viewer toward the robot in the scene. The mannerisms and actions of the robot when presented with different challenges in the scene will hopefully make the viewer want to help the robot. Our robot will be a four-legged, animal-like robot with motors that spin the legs around to create forward motion. We will also add “cute” animal features in an attempt to make the robot more adorable.

For the body of this robot, we will use wood, acrylic, or aluminum so that it can be shaped fairly easily by basic machining processes. This way, we can also add the aforementioned animal-like features easily.

The obstacle this robot will need to overcome is a ramp/staircase combination that it must climb. The robot will easily be able to overcome the ramp, but eventually it will become impatient and attempt to climb the staircase instead. This will cause it to flip onto its back, which will hopefully entice the human to step in, flip the robot back over, and guide it back up the ramp. The robot will then continue up the ramp and summit the course.

For the hardware and software, the legs will be attached to the machined body discussed above. They will consist of four motors attached to curved legs, as seen in the video. We are not sure of the exact software that would be needed to program these motors, but we imagine it would not be too complicated.

If we are unable to meet in class in person due to the pandemic, both of us are currently located in Pittsburgh, so we would still be able to meet up and construct/program the robot together. We would not be able to machine the body, so alternate, less intricate fabrication methods would need to be explored.

Project Direction: Piano
https://courses.ideate.cmu.edu/16-375/f2020/work/2020/10/19/project-direction-piano/
Mon, 19 Oct 2020

This piece focuses on representing the dynamics of a mentor/mentee relationship using simple robots performing a simple task: playing the piano. The performance is a series of call-and-response piano playing: the mentor plays a note and expects the student to play the same note back. However, not every note is perfect, and tensions start to rise when the student plays wrong notes (accidentally, and also intentionally). Here we explore a sort of rivalry that appears between a strict teacher and a rebellious student.

The focus of this piece is on robot behavior. The robots themselves have a simple design so as to bring more attention to their actions in the context of the piece. Making the robots strike keys slowly, quickly, and in varying patterns gives each machine a sense of character, and in framing an entire performance we hope to convey a multitude of emotional responses from both robots. A toy sketch of the call-and-response logic we have in mind follows below.
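
Here is that toy sketch; the note set, slip probabilities, and tutor reactions are illustrative assumptions, not our actual show script:

```python
# Toy sketch of the mentor/mentee call-and-response behavior.
# Notes, probabilities, and reactions are illustrative assumptions.
import random

NOTES = ["C", "D", "E", "F", "G"]
P_ACCIDENT = 0.2  # chance the student slips to a neighboring key
P_REBEL = 0.1     # chance the student disobeys on purpose

def student_reply(note):
    if random.random() < P_REBEL:
        return random.choice([n for n in NOTES if n != note])  # defiance
    if random.random() < P_ACCIDENT:
        i = NOTES.index(note) + random.choice([-1, 1])  # slip to a neighbor
        return NOTES[max(0, min(len(NOTES) - 1, i))]
    return note

for _ in range(8):
    call = random.choice(NOTES)
    print(f"Tutor plays {call}")
    response = student_reply(call)
    print(f"Student plays {response}")
    if response != call:
        print("Tutor slams the keys angrily!")  # escalation
    else:
        print("Tutor calmly plays the next note.")
```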

Our current direction will take the form below:

The piston allows for more dramatic changes in key-pushing speed than a rotating joint and is less complicated than a slider joint.

The robot will either slide along a rail to each key or drive freely on wheels.

In terms of what we will use to build this, the only special mechanisms we need are the pistons and a rail system or wheels. The other pieces will be simple beams, either wood or steel.

As a backup plan, we will revert to the simulation and make it as realistic and lifelike as possible, as well as work on its audio effects.

Project Direction: Tactile Worlds
https://courses.ideate.cmu.edu/16-375/f2020/work/2020/10/14/project-direction-tactile-worlds/
Thu, 15 Oct 2020

General Concept and Direction

We settled on pursuing the Tactile Worlds project, but significantly changed the form while keeping the basic interactions the same.

The project still consists of two joysticks (now technically cranks rather than joysticks), but now each user controls one leg of a two-legged robot that navigates a sandbox. The users are given free rein to play in the sandbox with the robot, but they need to coordinate and communicate to achieve anything with it.

Each crank provides physical feedback by way of a motor in the controller, so the user is able to feel the forces being exerted on their respective leg. This gives each user the ability to feel the world the robot is interacting with, and a tactile experience in addition to the experience of collaborating to make the robot move.

A potential design for the form of the robot. Each leg is controlled by a different person with a different controller.

A potential design for the form of the controller. The crank in the controller directly corresponds to the movement of the motor driving the leg of the same color.

Two people driving the robot up a hill in a sandbox. Note that the handle on one of the controllers should be red, not blue.

Robot and Joystick Forms

We experimented a bit with different potential forms for the robot bodies and legs, and for the controllers.

We came up with a couple of different body forms that might interact differently with the sand as the legs drag the body through it. We also experimented with different leg shapes that might interact with the sand in different ways with regard to grip and motion.

A couple of body prototypes with different leg attachments.

For controllers, we looked at a couple of commonplace interactions people have with rotational controllers. We ended up with three reference points: the analog knobs found in things like sound systems, the winding mechanism of a fishing rod, and the large, floor-mounted cranks found on machining equipment.

Different controller designs based on common interactions. From left to right: a controller based on an analog control knob, a controller based on the winding mechanism of a fishing rod, and a ground-mounted controller based on the cranks found on machining equipment.
