Progress – Human-Machine Virtuosity
https://courses.ideate.cmu.edu/16-455/s2018
An exploration of skilled human gesture and design, Spring 2018.

Tactile Abstraction (Shortest Path Prototype)
https://courses.ideate.cmu.edu/16-455/s2018/664/tactile-abstraction-shortest-path-prototype/
Mon, 02 Apr 2018 15:59:05 +0000

This project uses MoCap technology to leverage human tactile scanning toward generating an abstracted version of the scanned object's form. It takes advantage of the innate human skill of highlighting artifacts of particular tactile importance, then recreates an abstracted version of the scan using a hot air rework station and polystyrene plastic.

At this point in the project, we have a solid understanding of the technical capabilities and limitations of the workflow. The primary areas of work moving forward are selecting the input objects, creating a finger mount for the mocap marker, refining the workflow, defining the parameters of translation from input information to output trajectory, creating the output mechanisms, and evaluating the output.

The choice of input objects will be given additional consideration, as we would like to have a cohesive set that maintains meaning in the process. Additionally, having multiple users scan a single object could provide very interesting insight into how the workflow adapts based on user input. We may 3D print a ring mount for the mocap marker, or use an existing ring and adhesive. The translation will rely on point density (the points being a sampling of those generated by the motion capture process), and may also take into account direction and speed of scan trajectory. Additionally, this data will be converted to robot motion that will likely need to take an “inside-out” pattern – traveling outward from a central point rather than inward from a border. The output mechanism to be created will be (i) a mount for the robot arm to hold the polystyrene sheet and move it within a given bound and (ii) a mount for the hot air rework station to keep the nozzle and box secure and stationary.
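The density-to-trajectory translation above can be sketched roughly in Python. This is a minimal 2-D illustration under assumed names (`local_density` and `inside_out_order` are hypothetical, not the project's actual code): point density measures how much tactile attention a region received, and the ordering travels outward from a central point as described.

```python
import math

def local_density(points, query, radius=1.0):
    # Count scan samples within `radius` of a query point: a crude
    # proxy for how much tactile attention the scan gave this region.
    return sum(1 for p in points
               if math.hypot(p[0] - query[0], p[1] - query[1]) <= radius)

def inside_out_order(points, center=None):
    # Order points from a central point outward ("inside-out"),
    # rather than inward from a border.
    if center is None:
        center = (sum(p[0] for p in points) / len(points),
                  sum(p[1] for p in points) / len(points))
    return sorted(points,
                  key=lambda p: math.hypot(p[0] - center[0],
                                           p[1] - center[1]))
```

A fuller version would operate on the 3-D mocap samples and could weight the robot's dwell time at each location by the local density, as the paragraph above suggests.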

Prototype: Agent Conductor
https://courses.ideate.cmu.edu/16-455/s2018/666/prototype-agent-conductor/
Mon, 02 Apr 2018 15:18:13 +0000
Manuel Rodriguez & Jett Vaultz

This project is a hybrid fabrication method between a human and virtual autonomous agents to develop organic meshes using MoCap technology and 3D spatial printing techniques. The user makes conducting gestures to influence the movements of the agents as they move from a starting point A to point B, using real-time visual feedback provided by a projection of the virtual workspace.

Screencap from our role-playing test, with an example of the agents’ movements and generated mesh.


Workflow diagram

For our shortest path prototype, we discussed what the agents' default behavior might look like without any interaction or influence from the user, given starting and end points A and B. We then took some videos of what the conducting might look like over a set of agents progressing in this default behavior from start to end, and sketched the effects the gestures would have on the agents while watching the videos. In Grasshopper, we began developing a Python script that dictates the movement of a set of agents progressing from one point to another in the default behavior we had established.
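As a rough sketch of that default behavior (in plain Python rather than our Grasshopper script, with hypothetical names), each agent can simply step a fixed distance toward B every tick:

```python
def step_agent(pos, target, step=0.1):
    # One tick of the default behavior: move straight toward the target.
    dx, dy, dz = (t - p for t, p in zip(target, pos))
    dist = (dx * dx + dy * dy + dz * dz) ** 0.5
    if dist <= step:
        return target  # snap onto B once within one step
    s = step / dist
    return (pos[0] + dx * s, pos[1] + dy * s, pos[2] + dz * s)

def run_agents(starts, target, step=0.1, max_ticks=10000):
    # Advance every agent until all reach the target; each agent's
    # trail is a polyline a mesh could later be built over.
    trails = [[p] for p in starts]
    for _ in range(max_ticks):
        done = True
        for trail in trails:
            if trail[-1] != target:
                trail.append(step_agent(trail[-1], target, step))
                done = False
        if done:
            break
    return trails
```

The recorded trails are the raw material for the generated mesh; user influence would later perturb each tick rather than replace it.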

Sketch of the progression of agents from the top plane of the role-playing test down to the second plane, without any influence from the user.

A sketch of a possible structure based on the video, which shows a silhouette of the movements of the agents.

Our next steps are to flesh out our Python script so that it can generate a plausible 3D mesh as the agents progress from start to finish, first without any interaction. Once that works, we will build a more robust algorithm for incorporating user interaction to create more unique and complex structures. The resulting mesh would then be printed from one end to the other using the spatial printing tools in development by Manuel.
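One hypothetical way to layer user interaction onto the default movement rule is to blend each agent's heading toward B with a pull toward the conductor's tracked hand position (the `pull` weight and the function name are assumptions for illustration, not our implemented algorithm):

```python
def influenced_step(pos, target, gesture, pull=0.5, step=0.1):
    # Blend the default heading toward the target with a pull toward
    # the conductor's hand; pull=0 reproduces the default behavior.
    def unit(v):
        n = sum(c * c for c in v) ** 0.5
        return tuple(c / n for c in v) if n else (0.0, 0.0, 0.0)
    to_target = unit(tuple(t - p for t, p in zip(target, pos)))
    to_hand = unit(tuple(g - p for g, p in zip(gesture, pos)))
    heading = unit(tuple((1 - pull) * a + pull * b
                         for a, b in zip(to_target, to_hand)))
    return tuple(p + step * h for p, h in zip(pos, heading))
```

Feeding the live MoCap hand position in as `gesture` each tick would bend the agents' trails, and with them the final mesh, in response to the conducting.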

