Human-Machine Virtuosity (16-455, Spring 2019) – An exploration of skilled human gesture and design
https://courses.ideate.cmu.edu/16-455/s2019

Paul’s Group (3/25 Update) – March 25, 2019

Team Management

Kevin: Generating Rhino/Grasshopper toolpathing

Elton: Developing mounting systems for foam, Robotic arm attachments

Testing paint and medium printing on foam

Andy: Writing Python code for pattern generation from provided image

Bi-Weekly Goals

Kevin is working on the tool paths in Rhino, and once he is done we will apply the same algorithms to the final image Paul wants us to carve. Then we will run simulations to check for collisions and, hopefully, carve on a full sheet.

Final Show Vision

For the live presentation we want to display the full-size carvings/prints and also have a small-scale demo, maybe with a 1'x1' print block. We’d like the smaller print to be a multi-block color print, or something beyond plain outlines. Since we won’t have the actual robot at the show, we will play videos of it carving the foam and display the hot knife setup that Paul and the robot used.

Shortest Path Prototype – Paul Foam Knife Group – March 19, 2019

Our group’s plan moving forward was to capture a mocap of Paul working with the hot knife, edit and reproduce his stroke pattern in Rhino, and have the robot replicate his dexterous carving skill on a completely different drawing. We’ve made significant progress toward that end.

Paul appeared to have an easy time transferring his woodcarving skills to foam. He drew a flower on a block of foam and went to work.

We weren’t able to MOCAP his motion since we hadn’t created a mount for the reflective spheres yet, so Paul drew up a quick picture of a tree and some cubes for us to experiment on before he left. We took the scanned image and converted it into vector graphics before dropping it into Rhino to see how the robot arm pathing would look.

It was incredibly messy. The scan captured way too much detail in the drawing, and Rhino interpreted every little bump in the lines as a new “path” for the robotic arm to follow. We ended up with a few hundred strokes just for minor detailing in the image, so we decided to simplify the image significantly. We also used a smoothing function in Grasshopper to remedy the tiny bumps in the final drawing.
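For illustration, here is a minimal sketch of the kind of cleanup involved: a Ramer-Douglas-Peucker pass that drops vertices deviating less than a tolerance from the overall stroke. This is not the Grasshopper smoothing we actually use, and the tolerance value is arbitrary; it would need tuning against the scan resolution.

```python
import math

def _point_line_distance(p, a, b):
    """Perpendicular distance from point p to the chord a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    chord = math.hypot(dx, dy)
    if chord == 0.0:
        return math.hypot(px - ax, py - ay)
    # parallelogram area / base length = height
    return abs(dx * (ay - py) - dy * (ax - px)) / chord

def simplify(points, tolerance):
    """Ramer-Douglas-Peucker: drop vertices closer than `tolerance`
    to the chord between the endpoints of each sub-curve."""
    if len(points) < 3:
        return list(points)
    # find the vertex farthest from the chord
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = _point_line_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= tolerance:
        return [points[0], points[-1]]
    left = simplify(points[:index + 1], tolerance)
    right = simplify(points[index:], tolerance)
    return left[:-1] + right  # avoid duplicating the split vertex

# Example: a stroke with scanner-scale jitter collapses to its endpoints
noisy = [(x * 0.1, 0.02 * ((-1) ** x)) for x in range(100)]
print(len(simplify(noisy, tolerance=0.05)))  # -> 2
```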

Since we have to carve out the inverse of the drawing to make a foam printing block, the generated pathing avoids the outer edge of each line and enclosed body.

Since we’re trying to extend and expand what Paul can do artistically, we created a Python program that acts as a pattern paint bucket. Much of the detailing in Paul’s work comes from filling in the white space inside an outline, which enhances the drawing dramatically. The program reads in a set of patterns and a main image, and the user decides which regions get which patterns. Clicking on a region in the display fills it in, and the edited picture can be exported to vector format. Paul can use this to decide which parts of the final carving he would rather the robot do.

Striped shading on one of the boxes in the image

We’ll ask Paul what kinds of patterns he wants, as well as any other edits he might want to see done to the image. Currently, we just have a striped pattern as a proof-of-concept for the pattern fill.
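As a rough sketch of how the pattern paint bucket works conceptually (not the actual program, which runs on Paul’s scanned drawings), a flood fill finds the enclosed region under a click and a pattern mask is stamped into it. The stripe parameters and the tiny demo image below are placeholders.

```python
from collections import deque
import numpy as np

def flood_region(outline, seed):
    """Boolean mask of the enclosed region containing `seed`.
    `outline` is a 2-D bool array where True marks drawn (black) pixels."""
    h, w = outline.shape
    mask = np.zeros_like(outline, dtype=bool)
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        if not (0 <= r < h and 0 <= c < w):
            continue
        if outline[r, c] or mask[r, c]:
            continue                      # hit a line or already visited
        mask[r, c] = True
        queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return mask

def fill_with_stripes(outline, seed, period=6, width=2):
    """Flood-fill the clicked region and stamp horizontal stripes into it."""
    region = flood_region(outline, seed)
    rows = np.arange(outline.shape[0])[:, None]
    stripes = (rows % period) < width     # every `period` rows, `width` thick
    return outline | (region & stripes)

# Demo: a 20x20 square outline, "clicked" in the middle
img = np.zeros((20, 20), dtype=bool)
img[2, 2:18] = img[17, 2:18] = img[2:18, 2] = img[2:18, 17] = True
filled = fill_with_stripes(img, seed=(10, 10))
print(filled.sum(), "drawn pixels after patterning")
```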

This program is not integrated with the RAPID code generation yet; currently the picture has to be manually converted to an SVG after the program changes the image, and then processed into Rhino. We’re working on automating this so that everything after the scan and the artist-selected patterning happens without manual steps.
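One way we might automate the raster-to-SVG step is to shell out to a tracing tool such as potrace, assuming it is installed and the scan has already been thresholded to a bitmap; the file names below are hypothetical.

```python
import subprocess
from pathlib import Path

def raster_to_svg(scan_path, svg_path):
    """Trace a black-and-white scan into an SVG using the potrace CLI.
    potrace reads PBM/PGM/PPM/BMP, so the scan must be thresholded and
    saved as a bitmap first."""
    subprocess.run(
        ["potrace", str(scan_path), "-s", "-o", str(svg_path)],
        check=True,
    )
    return Path(svg_path)

# Hypothetical usage once Paul's edited drawing is saved as scan.bmp:
# raster_to_svg("scan.bmp", "scan.svg")  # then import scan.svg into Rhino
```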

On the physical side of our project, we’ve created a mount for the tool and the foam block. Using one of the orange robot tool plates, we mounted two laser-cut pieces of wood to hold the hot knife in place with a system of pegs. The hot knife can be easily removed with a wrench, allowing Paul and the robot to collaboratively share the knife without having to recalibrate the tool’s location with respect to the arm every time it is dismounted.

Hot knife mount in CAD
Hot knife mount with knife

We still had a manufacturing constraint: the hot knife can’t safely stay powered for more than ~30 seconds at a time, or the transformer could blow out or the blade could bend out of shape. While a pneumatic trigger would work, re-prototyping another hot knife mount would have taken too much time. Garth told us about PulseDO, a RAPID instruction that pulses a digital output, which we can use to switch the knife’s power supply on and off. We plan on pulsing power to the tool so it doesn’t get too hot, but for now we can toggle the power manually from the control box.
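Since our RAPID is generated programmatically anyway, one option is to interleave PulseDO calls into the emitted program, roughly as sketched below. The signal name do_knife, the pulse length, the module name, and the move targets are placeholders, not our actual configuration.

```python
def rapid_with_pulses(move_lines, pulse_every=5, signal="do_knife", plength=2.0):
    """Interleave PulseDO calls into a list of already-generated RAPID MoveL
    lines so the knife's power supply is only energized in short bursts."""
    body = []
    for i, move in enumerate(move_lines):
        if i % pulse_every == 0:
            body.append(f"    PulseDO \\PLength:={plength}, {signal};")
        body.append(f"    {move}")
    return ("MODULE FoamCut\n  PROC main()\n"
            + "\n".join(body)
            + "\n  ENDPROC\nENDMODULE\n")

# Example with two placeholder targets declared elsewhere in the module:
print(rapid_with_pulses(["MoveL p10, v100, z5, tKnife;",
                         "MoveL p20, v100, z5, tKnife;"]))
```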

First test of robot arm cutting foam with the hot knife

We created a quick script in RAPID to test how well the robot arm could wield the hot knife. After a period of pre-heating, the robot had no problems cutting through the foam, even during very deep cuts. However, we realized that some of the pieces of foam would cool and re-stick in their slots after cutting. We settled on mounting the final foam block standing up, so that any cut-out pieces of foam would just fall out of their slots. We designed a CAD model of the mounts and assembled the laser-cut final pieces.

Reflection on next steps

Currently our prototyping runs pretty smoothly. The only steps that are still manual but could be automated are scanning the work Paul wants rendered in the foam, channeling the program’s output into Rhino, and exporting the RAPID code to the robot. After Paul makes his design choices, the pipeline should automatically spit out RAPID code, minimizing the manual work needed for the intermediate steps. But that’s a later improvement.

Currently we have no pattern templates from Paul, which are essential if we want to apply his chosen textures to a picture. These would be best supplied as image files that can be tiled repeatedly, and the Python program would have to be modified to enforce constraints on where the patterning goes, e.g. density, separation, and border size. With the base program done, this wouldn’t take too long.
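A possible shape for those constraints, sketched with NumPy/SciPy rather than our actual program: erode the selected region by the border size so the tiled pattern never touches the outline. The tile, border width, and region below are stand-ins.

```python
import numpy as np
from scipy.ndimage import binary_erosion

def tile_pattern(region_mask, tile, border=3):
    """Tile a small pattern image across a region, leaving `border` pixels
    of clear margin inside the region's edge so the fill stays off the
    outline. `tile` is a 2-D bool array (e.g. loaded from a template)."""
    # Shrink the region so the pattern keeps clear of the outline.
    inner = binary_erosion(region_mask, iterations=border)
    # Repeat the tile until it covers the whole canvas, then crop.
    reps = (-(-region_mask.shape[0] // tile.shape[0]),
            -(-region_mask.shape[1] // tile.shape[1]))
    canvas = np.tile(tile, reps)[:region_mask.shape[0], :region_mask.shape[1]]
    return inner & canvas

# Demo: dots every 4 px inside a solid 30x30 region, 3 px clear margin
tile = np.zeros((4, 4), dtype=bool); tile[0, 0] = True
region = np.zeros((30, 30), dtype=bool); region[5:25, 5:25] = True
print(tile_pattern(region, tile).sum(), "pattern pixels placed")
```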

The hot knife tool we attached to the robot arm can cut at any radius up to ~2 inches, but the tool path generation in Grasshopper does not yet exploit this. By making the cut depth aware of the local width, cutting deeper where the perpendicular gap between lines is wider, we could cut down on carving time and get a more accurate trench between converging lines. The same idea could be adapted to clear large swathes of blank foam with a different edge of the blade, since those areas need less accuracy and more sweeping cuts.
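One way to get that depth-to-width awareness, assuming we rasterize the carve regions: a distance transform gives each carved pixel its clearance to the nearest kept (printing) surface, and the cut depth can scale with that clearance up to the knife’s ~2-inch reach. All the constants below are placeholders to tune on scrap foam.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def depth_map(carve_mask, px_per_inch=50, max_depth_in=2.0, depth_per_inch=1.0):
    """For every pixel that will be carved away, estimate how deep the knife
    can plunge based on the clearance to the nearest kept surface."""
    # Distance (in pixels) from each carved pixel to the nearest kept pixel.
    clearance_px = distance_transform_edt(carve_mask)
    clearance_in = clearance_px / px_per_inch
    depth_in = np.minimum(clearance_in * depth_per_inch, max_depth_in)
    return np.where(carve_mask, depth_in, 0.0)

# Demo: a 10-pixel-wide channel gets deeper toward its centerline
mask = np.zeros((40, 40), dtype=bool); mask[:, 15:25] = True
print(depth_map(mask).max(), "inches at the channel center")
```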

The tool’s mount is pretty stable for now, as is the foam mount. Since we want to carve on a standing piece of foam so that the scrap pieces fall out, the foam mount is essential. We still need to experiment with the gap sizing, but with four mounts the setup is stable enough to carve on.

Finally, we’d like to give Paul more options for modifying both the art he’d like to carve himself and the art he’d like the robot to carve. Right now we have pattern fill and straight-line carving down, but we need to consult with him about tasks in his process that he finds difficult and that could be automated. Alternatively, we could look at hybrid art pieces in which some strokes are intended to be carved in a different style, letting Paul and the robot work together on one canvas.

P2.1 Paul Final – February 13, 2019

Hybrid Flow Diagram

In supporting Paul’s move to faster and larger print setups, we wanted to enable his prints to maintain the same level of detail even at a blown-up scale. He suggested doing large prints on the sides of abandoned buildings or in empty parking lots, which is doable with adjustments to his process. Large carvings would take much more material and time, especially in wood, and large, uneven surfaces make printing difficult to set up. We think that foam sheets carved with a hot knife would make large-scale carving of printing surfaces practical.

Our project aims to capture a mocap of Paul using a hot knife to carve into foam, then program a robotic arm to apply his technique to a drawing. We also want the solid outlines of the drawing to be filled programmatically with repeating patterns, at a finer level of detail than simply scaling up an image would allow. Ideally, we’d have multiple patterns supplied by Paul so we can add whatever he thinks works best.

In the end, we should have a set of programs that can take a mocap of a technique and replicate it at whatever scale we’d like to render a drawing. The resulting foam sheets could then be used repeatedly by Paul on any surface he desires.
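In the simplest case, “replicate at whatever scale” is just a uniform rescale of the recorded stroke geometry before toolpathing, something like the sketch below; the array layout of the mocap data is an assumption.

```python
import numpy as np

def scale_stroke(points_mm, target_width_mm):
    """Uniformly rescale a recorded stroke (N x 3 array of mocap positions,
    in mm) so its bounding-box width matches the target print width.
    Purely a sketch of the 'replicate at any scale' idea, not our pipeline."""
    pts = np.asarray(points_mm, dtype=float)
    origin = pts.min(axis=0)
    width = pts[:, 0].max() - pts[:, 0].min()
    return (pts - origin) * (target_width_mm / width) + origin

stroke = np.array([[0.0, 0.0, 5.0], [100.0, 20.0, 5.0], [200.0, 0.0, 5.0]])
print(scale_stroke(stroke, target_width_mm=1000.0))
```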

After talking to Paul on Monday and considering our options, we went ahead and ordered a hot knife. While this isn’t Paul’s usual domain, his adoption of a new tool and technique should carry over from his existing woodcutting practice. We will mount markers on the cutter and proceed with the tool modelling. The robot arm can then follow a technique similar to the one recorded from Paul.

P1: Scissors Motion – January 28, 2019

Andy Kong & Gerardo Lira

The tool that we used was a pair of scissors. In order to capture the motion of this object, we had to place markers at locations where they would not change the plane orientation. We therefore placed one marker beside the hinge and two markers behind the finger holes. To position two markers behind the scissors, we attached a makeshift acrylic extension to the bottom handle; the extension has holes that let us screw the markers in place.

We noticed some interesting things in the movement of our scissors:

  1. The scissors continuously changed velocity as they tilted during the driving cut. Although the rigid CAD model doesn’t do it justice, the brief pauses in speed can be seen in the video; these pauses mark where the scissors slowed down to make a cut.
  2. It is interesting to see how the hand orients the scissors as it makes the initial cut and drives through the sheet of paper. During the initial cut, the hand is parallel to the table axis. Once we begin driving the cut through the sheet, the handle tilts up and down to gather momentum and continue through.
  3. The scissors were used to cut a regular sheet of paper. We chose paper because we felt it could best capture the motion of the scissors within a short period of time. To avoid blocking the markers during the cut, we had to grip the scissors in an unnatural way.
  4. During the entire cutting motion there were multiple inflection points: although the scissors were translating along the table axis, the handle repeatedly tilted up and down between cuts, producing inflection points in the motion.

In the video we recreated from the Motive capture, the non-smoothness of our cut is evident from the stopping and stalling motion of the scissors. Some cuts were ordinary scissor cuts that we hoped would turn into a gliding cut; others were just normal scissor operation. The dexterity of the scissor cuts was not well captured, since scissors act as two rigid pieces and we only had enough markers to define one of them. Despite this, the quick forward slides of the scissors are representative of the smoother gliding cuts.
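To make the stalling more concrete, one could export the take from Motive and flag frames where the rigid body’s speed drops below a threshold, roughly like the sketch below; the export layout and the threshold value are assumptions.

```python
import numpy as np

def find_pauses(times_s, positions_mm, speed_thresh_mm_s=20.0):
    """Flag frames where the scissors' rigid body nearly stops, which is
    where our driving cuts stalled. Adapt the column layout to however
    the Motive take is actually exported."""
    pos = np.asarray(positions_mm, dtype=float)
    t = np.asarray(times_s, dtype=float)
    vel = np.gradient(pos, t, axis=0)       # per-axis velocity, mm/s
    speed = np.linalg.norm(vel, axis=1)
    return speed < speed_thresh_mm_s        # True on stalled frames

# Fake 120 Hz data: move, stall for 0.2 s, then move again
t = np.arange(0, 1, 1 / 120)
x = np.piecewise(t, [t < 0.4, (t >= 0.4) & (t < 0.6), t >= 0.6],
                 [lambda t: 300 * t, 120.0, lambda t: 120 + 300 * (t - 0.6)])
pos = np.stack([x, np.zeros_like(x), np.zeros_like(x)], axis=1)
print(find_pauses(t, pos).sum(), "stalled frames out of", len(t))
```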

The stuttering of the scissors in the capture was caused by the awkward grip we had to use: we didn’t want to block any markers with our hands, yet we still had to operate the scissors normally.

GIF of the entire scissors capture. One piece of paper is cut along its length in this animation.
Video of the entire scissors capture.
