This project uses motion capture (MoCap) technology to record human tactile scanning and generate an abstracted version of the scanned object's form. It takes advantage of the innate human skill of emphasizing features of particular tactile importance. The scan is then recreated in abstracted form using a hot air rework station and polystyrene plastic.
At this point in the project, we have a solid understanding of the technical capabilities and limitations of the workflow. The primary areas of work moving forward are selecting the input objects, creating a finger mount for the MoCap marker, refining the workflow, defining the parameters for translating input information into an output trajectory, creating the output mechanisms, and evaluating the output.
The choice of input objects will be given additional consideration, as we would like a cohesive set that retains meaning throughout the process. Having multiple users scan a single object could also provide interesting insight into how the workflow adapts to different user input. For the MoCap marker's finger mount, we may 3D print a ring or attach the marker to an existing ring with adhesive.

The translation will rely on point density (the points being a sampling of those generated by the motion capture process) and may also take into account the direction and speed of the scan trajectory; a rough sketch of this weighting appears below. This data will then be converted to robot motion, which will likely need to follow an "inside-out" pattern, traveling outward from a central point rather than inward from a border (see the spiral-trajectory sketch below).

The output mechanisms to be created are (i) a mount for the robot arm to hold the polystyrene sheet and move it within a given bound, and (ii) a mount for the hot air rework station to keep the nozzle and box secure and stationary.
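As a rough illustration of how the translation parameters might work, the following Python sketch estimates local point density over the sampled MoCap points and derives a secondary weight from scan speed (slower, more deliberate strokes plausibly indicating regions of higher tactile interest). The function names, the neighborhood radius, and the speed heuristic are all placeholder assumptions, not decisions the project has made.

```python
import numpy as np
from scipy.spatial import cKDTree

def local_density(points: np.ndarray, radius: float = 5.0) -> np.ndarray:
    """For each 3D point, count neighbors within `radius` (capture-volume units)."""
    tree = cKDTree(points)
    # query_ball_point returns, for each query point, the indices of all
    # points within `radius`; the count (minus the point itself) is a
    # simple local density estimate.
    neighbors = tree.query_ball_point(points, r=radius)
    return np.array([len(n) - 1 for n in neighbors])

def inverse_speed_weight(points: np.ndarray, timestamps: np.ndarray) -> np.ndarray:
    """Weight each sample by the inverse of the finger's speed at that sample."""
    segments = np.diff(points, axis=0)
    dt = np.maximum(np.diff(timestamps), 1e-9)
    speed = np.linalg.norm(segments, axis=1) / dt
    speed = np.concatenate([speed[:1], speed])  # pad to match point count
    return 1.0 / (speed + 1e-9)
```

Points could then be subsampled with probability proportional to the product of these weights, concentrating the output trajectory where the scanner's attention was concentrated.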
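For the inside-out robot motion, one candidate is an Archimedean spiral that begins at a central point and travels outward, visiting each waypoint exactly once. This is a minimal sketch under assumed workspace units; the spacing and step values are placeholders for the real robot and polystyrene-sheet bounds.

```python
import numpy as np

def inside_out_spiral(center, max_radius, spacing=3.0, step=1.0):
    """Return an (N, 2) array of XY waypoints spiraling outward from `center`.

    `spacing` is the radial gap between successive turns and `step` the
    approximate arc length between waypoints, both in workspace units.
    """
    cx, cy = center
    pts, theta, r = [], 0.0, 0.0
    while r <= max_radius:
        r = spacing * theta / (2 * np.pi)  # Archimedean: radius grows linearly per turn
        pts.append((cx + r * np.cos(theta), cy + r * np.sin(theta)))
        # Advance theta so waypoints are roughly `step` apart in arc length
        # (ds ~= r * dtheta); the max() guards against r == 0 at the center.
        theta += step / max(r, spacing / (2 * np.pi))
    return np.array(pts)
```

The density and speed weights above could modulate dwell time or radial spacing along such a path, so that regions of tactile importance receive more deformation.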