Human-Machine Virtuosity: an exploration of skilled human gesture and design, Spring 2018
https://courses.ideate.cmu.edu/16-455/s2018

Robot’s Cradle
https://courses.ideate.cmu.edu/16-455/s2018/794/robots-cradle/
Sun, 13 May 2018

Stephanie Smid & Sarika Bajaj

Human Machine Virtuosity, Spring 2018

— — —

ABSTRACT

Robot’s Cradle was a project focused on creating a hybrid analog and digital design workflow for experimental weaving. Inspired by peg loom weaving and string art, our workflow transformed human drawing (tracked using motion capture software) into a weaving pattern using Grasshopper and Python, which a robotic arm fitted with our custom string tool then wove onto a peg board. By the end of the project, we had achieved each of the discrete stages of this workflow and begun investigating its edges, specifically what constitutes skilled use of the workflow and what counts as a “finished design” from it.

OBJECTIVES

In order to actualize this workflow, we aimed to 1) reliably capture human drawing using the motion capture system, 2) process the motion capture data into a set of tangent line approximations, 3) convert the tangent lines into a string path the robot could actually follow, and 4) have the robot use a custom tool to wind the given pattern onto a peg board. The ultimate objective was a workflow from which designers could create finished weaving pieces.

IMPLEMENTATION

In order to create this workflow, we began by setting up the motion capture system, using a graphite pencil outfitted with motion capture markers so that we could track its motion. After establishing some initial bounding conditions, we were able to track the drawn line reliably, divide it into a set of points, and derive tangent lines from each of those points in Grasshopper.
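The point sampling and tangent construction was done in Grasshopper; as an illustration of the same idea, here is a minimal Python sketch (the function name and the numpy-based arc-length resampling are our own assumptions, not the project's actual code):

```python
import numpy as np

def sample_tangents(points, n_samples=20):
    """Resample a drawn polyline and estimate a unit tangent at each sample.

    `points` is an (N, 2) array of pen positions projected onto the board plane.
    (Illustrative sketch; the project did this step in Grasshopper.)
    """
    points = np.asarray(points, dtype=float)
    # Cumulative arc length along the polyline.
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    # Evenly spaced sample positions along the curve.
    targets = np.linspace(0.0, s[-1], n_samples)
    xs = np.interp(targets, s, points[:, 0])
    ys = np.interp(targets, s, points[:, 1])
    samples = np.column_stack([xs, ys])
    # Central-difference tangents, normalized to unit length.
    grads = np.gradient(samples, axis=0)
    tangents = grads / np.linalg.norm(grads, axis=1, keepdims=True)
    return samples, tangents
```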

We then transitioned to the physical parts of the project, namely procuring string, creating a base peg board, and building a robot tool that could hold a thread spool and dispense string. We settled on wax-coated string, which made the winding process a bit simpler: the thread was sturdier than normal and needed less tensioning for winding. The peg board took several iterations, in which we experimented with which peg shape best encouraged string to stay wound, and with a peg spacing that kept the integrity of the design without being so close as to prevent winding. The robot tool was the most complex physical part of our system: we had to iterate on how best to keep tension throughout the system (a problem we solved with a felt washer that tightened the turning of the spool) and to combat the tension points around the edges of the string-dispensing tube.

After settling on final designs for the peg board and robot tool, we went back and wrote the Python code needed to turn our tangent lines into an actual robot string path, which we then converted into HAL code for winding. Once the robot wound successfully, we iterated again on making the entire system into a connected workflow. The main change in this pass was replacing the projector-based user feedback of our initial design with a separate screen.
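A core piece of the Python string-path step is generating the waypoints the robot follows to wind around a single peg. The following is a hedged sketch of that idea; the function name, the fixed angular step, and the clearance-circle approach are illustrative assumptions rather than the project's actual implementation:

```python
import numpy as np

def peg_waypoints(center, radius, start_angle, end_angle,
                  clockwise=True, step_deg=30.0):
    """Generate wind-around waypoints on a clearance circle around one peg.

    `center` is the peg's (x, y); `radius` is the peg radius plus string
    clearance. Angles are in radians, measured from the +x axis.
    (Illustrative sketch, not the project's actual point generation.)
    """
    # Sweep from the approach angle to the departure angle in fixed steps.
    if clockwise:
        if end_angle >= start_angle:
            end_angle -= 2 * np.pi
        angles = np.arange(start_angle, end_angle, -np.radians(step_deg))
    else:
        if end_angle <= start_angle:
            end_angle += 2 * np.pi
        angles = np.arange(start_angle, end_angle, np.radians(step_deg))
    angles = np.append(angles, end_angle)
    cx, cy = center
    return [(cx + radius * np.cos(a), cy + radius * np.sin(a)) for a in angles]
```

In practice each peg-to-peg string segment would be bracketed by a short arc of such waypoints, so the tool wraps the string around the peg instead of cutting the corner.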

OUTCOMES

We were successful in creating a complete workflow for experimental weaving, which let us explore what a “finished” design from our system might look like. We had several users come in and test the system to see how they interacted with it and what the better results looked like. We found that “finished” designs often had areas of clear string density, such that the initial drawing was clearly visible in the string; users would often redo their drawing when the resulting pattern was too abstract and the drawing was not legible in the string density. The best example of a finished piece from the workflow is shown below, where several layers, each with distinct, clearly identifiable patterns, are stacked upon each other to create a final piece.

CONTRIBUTION

Stephanie primarily worked on the Rhino and Grasshopper portion of the code, as well as generating the final HAL code for winding. Sarika focused on the Python string winding algorithm and the point generation for maneuvering around each peg. Both Stephanie and Sarika created iterations of the peg board as well as the robot tool that held the thread spool and dispensed thread.

The two of us would like to thank Professor Josh Bard and Professor Garth Zeglin for their mentoring and advice throughout this project. We would also like to thank Manuel Rodriguez for their assistance in debugging some of our Python and Grasshopper integration issues and Jett Vaultz for allowing us to borrow their graphite pencil.

Robot’s Cradle: Shortest Path Prototype
https://courses.ideate.cmu.edu/16-455/s2018/645/robots-cradle-shortest-path-prototype/
Mon, 02 Apr 2018
Stephanie Smid and Sarika Bajaj

current progress | example artifacts | next steps

Our project involves creating a hybrid workflow that combines human drawing skill and robotic assembly for peg loom weaving. Through this workflow, artists should be able to draw patterns with specified string densities that a robot will solve for and then wind around a loom.

The specific processes involved in our workflow are detailed below: motion capture data records the drawing, which is then processed in Grasshopper and finally converted to RAPID code that controls the robot.

For our shortest path prototype, our goals were to 1) properly track human drawing using the motion capture system, 2) process the motion capture data in Grasshopper to create a viable string pattern, and 3) use that string pattern to hand-wind the pattern onto our custom-made peg board.

Since Stephanie’s first project had involved tracking human drawing using the motion capture system, we were able to reuse the same rig for this project and get the pen tracked rather quickly. Moreover, after some initial tests experimenting with different x, y, and z bounding-box cutoffs and with time sampling, we obtained a reasonably smooth curve from the motion capture system.

Using Grasshopper, we were able to split up the motion capture curve into distinct points. We then drew tangent lines from each of those points that were then pulled to the closest approximate peg (in our case we had 13 per side). Our final step involved simply printing out the Grasshopper visualization and using the printout to hand wind the string around the pegs.
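The “pulled to the closest approximate peg” step can be sketched as a nearest-neighbor snap. This is a simplified Python illustration (assuming the tangent lines have already been extended to the frame edge; the function name and array layout are our own):

```python
import numpy as np

def snap_to_pegs(line_start, line_end, pegs):
    """Snap a tangent line's endpoints to the nearest pegs on the board edge.

    `pegs` is an (M, 2) array of peg positions; returns the indices of the
    pegs closest to each endpoint, giving one peg-to-peg string segment.
    (Illustrative sketch; the project did this step in Grasshopper.)
    """
    pegs = np.asarray(pegs, dtype=float)
    i = np.argmin(np.linalg.norm(pegs - np.asarray(line_start), axis=1))
    j = np.argmin(np.linalg.norm(pegs - np.asarray(line_end), axis=1))
    return i, j
```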

The artifacts from our shortest path prototype are illustrated below:

In terms of our next steps, the major problem will be integrating the robot into our current workflow; instead of hand winding as we do now, we will have to create a system that enables the robot to wind the string around the pegs.

For robot winding, we will also have to do another iteration of our peg board, as we noticed our acrylic frame was not able to withstand the force of the string tension (without a foam core base, the frame would often warp); we will probably build a sturdier wooden version that can handle the force.

Moreover, we need further Grasshopper processing to identify the most viable path for the string to follow, as that is currently an approximation we make ourselves while observing the Grasshopper output. Finally, we need a better way to track the pencil in relation to the frame itself, to ensure that the frame of reference for any drawing is correct; while the frame of reference seemed reasonable in our tests, it is possible there was some offset between the physical drawing and the virtual model.

Project Proposal: Robot’s Cradle
https://courses.ideate.cmu.edu/16-455/s2018/577/project-proposal-robotic-weaving/
Wed, 07 Mar 2018
Stephanie Smid and Sarika Bajaj

project context | motivation | scope | implementation

Our project involves creating a hybrid workflow that combines human drawing skill and robotic assembly for peg loom weaving. Through this workflow, artists should be able to draw patterns with specified string densities that a robot will solve for and then wind around a loom.

The inspiration for this project originally came from the fabrication processes of peg loom weaving and string art; the former is the basis of most weaving techniques and involves weavers densely binding and patterning string across a row of pegs, while the latter is a more experimental offshoot of peg loom weaving in which artists focus on creating patterns via the density of the string rather than the string itself. The closest case study for this project is the Silk Pavilion Project from the MIT Media Lab, where researchers used a CNC machine to attach string to a metal frame lined with pegs and then placed silkworms on top to fill in the structure.

Our spin on these experimental weaving techniques is to create a workflow where a person’s drawing gets processed and translated into a string density pattern that a robot weaves on a set of pegs. The person will first draw on the empty peg board, detailing gaps where no string is to be woven and then areas of higher density and patterning. The MOCAP setup in the design lab will track the stylus the person uses to draw and will drive a real-time projection of the string patterning created by the drawing. Once the person is satisfied with the projected string pattern, the robot will use string coated in UV-curing resin to create the pattern on the peg board. The final step will simply involve using UV light to cure the final string shape and then removing the pattern from the peg board.

The initial peg boards will be flat, two-dimensional sheets; however, if we are able to successfully create this two-dimensional workflow, we will start transitioning to three dimensions, using peg boards with slight curves and angles to further distort the string patterning. Below are early diagrams exploring how the system can work.

The following is a visual of the weaving system:

 

The following is an explicit workflow diagram detailing the process:

 

Diabolo Skill Analysis
https://courses.ideate.cmu.edu/16-455/s2018/501/diabolo-skill-analysis/
Manuel Rodriguez Ladron de Guevara, Ariana Daly, and Sarika Bajaj
February 19, 2018

As an introduction to the motion capture system and skill analysis workflow, our team chose to investigate the skill of juggling a diabolo, an hourglass-shaped circus prop that is balanced on a string connected to two sticks. After some initial investigation and testing, we found that the main skill behind the diabolo is balance, which is controlled in two ways: 1) maintaining the speed of the diabolo via a windup technique, and 2) changing the offset between the two hands, which changes the tilt of the diabolo. In order to examine these movements further, we created a set of jigs that allowed us to track the diabolo’s position and spin as well as the position and angle of the two sticks. We then called in an expert diabolist, Benjamin Geyer, whom we recorded using our diabolo setup. After analyzing his movements, we found that the diabolo windup consists of several sharp tugs, each of which creates a brief burst in velocity and together gradually accelerate the diabolo. The offset hand positioning maintains balance in a linear fashion: as the offset in one direction changes, the diabolo tilts accordingly. Through this experimentation, we gained insight into the primary skills required in juggling a diabolo.

Project Objectives

The goal of this project was to explore the skill of diabolo juggling, using motion capture as a quantitative analysis tool. Specifically, we focused on identifying the mechanics of balancing a diabolo, in terms of creating enough rotational inertia as well as coordinating the offset between the hands for balance.

Creating the Tracking Jigs for the Diabolo

 

In order to track the diabolo using the motion capture system, we created a set of jigs that attached the motion trackers to the relevant components. For the sticks, we created a set of acrylic holders with three markers each, arranged in a triangle such that the centroid was located on the axis of the stick. For the diabolo, we created two cardboard circles with three markers each, which we pushed into the hollows on either side of the diabolo. These circular holders took some experimentation, as our initial attempts either prevented the diabolo from spinning freely or ruined its internal balance. The final press-fit holders, however, provided a suitable form factor for the trials.

Motion Capture Process

To act as our diabolo expert, we recruited Mr. Benjamin Geyer from the CMU club Masters of Flying Objects, who graciously volunteered his time to be recorded by our motion capture system. He performed four main tricks with the diabolo: 1) balancing the diabolo in place, 2) double wrapping the string around the diabolo, 3) changing the spin direction of the diabolo, and 4) throwing and catching the diabolo using the string. Additionally, for further testing and exploration, we captured motion capture data of our own practice with the diabolo, specifically of the windup and balancing.

Motion Capture Analysis

The motion capture data provided us with a great deal of insight into the mechanics of spinning and balancing the diabolo. To explore the spinning mechanics, we focused on the normal velocity, rotational velocity, and rotational acceleration of the diabolo, as represented by the following relations:

  1. normal velocity = Δ(normal displacement) / Δt
  2. rotational velocity = Δ(rotational displacement) / Δt
  3. rotational acceleration = Δ(rotational velocity) / Δt
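With mocap frames sampled at a fixed rate, these quantities can be estimated by finite differences. A minimal Python sketch (assuming the spin angle has already been unwrapped from the marker data; the function name is our own):

```python
import numpy as np

def spin_metrics(angles, dt):
    """Estimate rotational velocity and acceleration from sampled spin angles.

    `angles` is a 1-D array of unwrapped diabolo rotation angles (radians)
    sampled every `dt` seconds. (Illustrative sketch of the finite-difference
    relations above, not the project's actual analysis code.)
    """
    angles = np.asarray(angles, dtype=float)
    omega = np.diff(angles) / dt   # rotational velocity, one sample shorter
    alpha = np.diff(omega) / dt    # rotational acceleration, two shorter
    return omega, alpha
```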

By analyzing these three metrics, we were able to identify clearly when the juggler jerked the diabolo string. Each jerk caused a sudden sharp increase in the velocity of the diabolo, which then settled into a gradual increase in rotational velocity, i.e., a sustained rotational acceleration.

When analyzing the balance of the diabolo against the hand-position offset, we found a linear correlation: the greater the offset between the hands, the greater the diabolo’s angle of inclination.
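A linear relationship like this can be checked directly from the capture data with an ordinary least-squares fit. A minimal sketch (the function name and the offsets/tilts arrays are illustrative assumptions, not the project's actual analysis):

```python
import numpy as np

def fit_offset_tilt(offsets, tilts):
    """Fit tilt = a * offset + b and report the correlation coefficient.

    `offsets` and `tilts` are equal-length 1-D arrays of hand offset and
    diabolo inclination samples. (Illustrative sketch.)
    """
    a, b = np.polyfit(offsets, tilts, 1)          # least-squares line
    r = np.corrcoef(offsets, tilts)[0, 1]         # Pearson correlation
    return a, b, r
```

An `r` close to 1 (or -1) would support the linear-correlation claim; the slope `a` gives how many degrees of tilt each unit of offset produces.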

Reflections

The main challenges we faced on this project could be traced directly back to the complexity of the diabolo’s motion. Creating the marker sets proved challenging because of the diabolo’s hollow shape, which meant the motion tracking software often could not see all six markers on the diabolo at once. Additionally, using a diabolo requires a good deal of space, so during the capture session the diabolo or the sticks would often leave the motion capture volume. This required us to be inventive in our solutions, one such solution being shortening the diabolo’s string. Together, these issues meant our motion tracking data contained a fair number of dropped markers, which created an additional challenge in analyzing the data.

Team Contributions

The team that executed this project was Manuel Rodriguez Ladron de Guevara, Ariana Daly, and Sarika Bajaj. Manuel focused on the recording, processing, and analyzing of the motion capture data using Grasshopper and Python code. Sarika and Ariana both worked on creating marker jigs for the diabolo and sticks, recruiting Mr. Geyer to be our diabolo expert, the breakdown of the diabolo skill analysis, and the creation of the final video and documentation.

Acknowledgements

We would like to thank Mr. Benjamin Geyer for acting as our diabolo expert and taking time out of his day to talk with us and perform for our motion capturing session. We would also like to thank Prof. Zeglin and Prof. Bard for their insight and help throughout this project.
