Tutorial: Leap Motion to PureData
https://courses.ideate.cmu.edu/16-223/f2014/tutorial-leap-motion-to-puredata/
Tue, 16 Dec 2014

Introduction

The Leap Motion is a sensor that reads hand gestures: it provides spatial coordinates for each joint of each finger. If you're interested in using the Leap Motion with Pure Data to control sound through gesture, this tutorial provides a guide for doing so.

Leap Motion Setup: 

Follow the instructions here: https://www.leapmotion.com/setup. You don't need the developer version if you're only using the sensor with Pure Data, but it could be useful for other projects.

Mac users: 

There is a Pd external available here: http://puredatajapan.info/?page_id=1514 that reads the Leap Motion data and provides some useful values beyond spatial coordinates (velocities, palm normals, etc.). It can be used in any of your own patches. Once the external is compiled on your computer, copy leapmotion-help.pd, along with gesture.pd, hand.pd, and point.pd, into the same directory as your patch. Create a leapmotion-help object in your patch and open it so it is running at the same time. You'll notice that the leapmotion-help patch uses Pure Data send objects to send data about hands, gestures, tools, and general info. In your own patch, you can receive these messages with a receive object named for the data you want. Print the messages to see how they are formatted, then use route and unpack objects to pull out the exact values you need. Examples of getting a hand's velocity and the "first" finger's velocity are shown below.

[Screenshot: leapmotion external patch reading hand and finger velocity]
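
If it helps to start from text rather than the screenshot, here is a minimal sketch of the same idea. A Pd patch is just a plain-text file, so the lines below can be saved as something like receive-hand.pd (the file name is arbitrary) and opened next to leapmotion-help.pd. The receive name hand matches the send used by the help patch; the velocity selector and the three-float layout are assumptions about the message format, so check the print output first and adjust the route/unpack chain to whatever you actually see.

#N canvas 0 22 520 330 10;
#X obj 30 30 r hand;
#X obj 30 60 print hand-raw;
#X obj 220 30 r hand;
#X obj 220 60 route velocity;
#X obj 220 90 unpack f f f;
#X obj 220 120 print vel-x;
#X obj 300 120 print vel-y;
#X obj 380 120 print vel-z;
#X connect 0 0 1 0;
#X connect 2 0 3 0;
#X connect 3 0 4 0;
#X connect 4 0 5 0;
#X connect 4 1 6 0;
#X connect 4 2 7 0;
#X text 30 170 left chain prints the raw hand messages so you can see their format;
#X text 30 190 right chain assumes a velocity selector followed by three floats;

The left chain is the important part when you start: once you know the message format, the route argument and the number of unpack floats can be adjusted to match.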

Alternatives: 

I used the ManosOSC app, available on the Leap Motion App Store (https://apps.leapmotion.com/apps/manososc). This app streams OpenSoundControl data that Pd can receive over a UDP connection, using the mrpeach library just like the tutorials on sending OSC data from smartphone apps. Just make sure to set the port to the one ManosOSC is sending on; the default is 7110. The example below gets the xyz coordinates of the tips of the first two fingers the Leap Motion recognizes.

[Screenshot: ManosOSC patch receiving fingertip coordinates over OSC]
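
For reference, a stripped-down text version of the receiving side could look like the sketch below (save it as a .pd file; it needs the mrpeach objects udpreceive, unpackOSC, and routeOSC). The /finger1 address is only a placeholder, not ManosOSC's actual address scheme; watch the print output to see the real addresses and route on those instead.

#N canvas 0 22 520 340 10;
#X obj 30 30 udpreceive 7110;
#X obj 30 60 unpackOSC;
#X obj 30 90 print osc-raw;
#X obj 220 90 routeOSC /finger1;
#X obj 220 120 unpack f f f;
#X obj 220 150 print tip-x;
#X obj 300 150 print tip-y;
#X obj 380 150 print tip-z;
#X connect 0 0 1 0;
#X connect 1 0 2 0;
#X connect 1 0 3 0;
#X connect 3 0 4 0;
#X connect 4 0 5 0;
#X connect 4 1 6 0;
#X connect 4 2 7 0;
#X text 30 200 7110 is the default ManosOSC port;
#X text 30 220 /finger1 is a placeholder address - check print osc-raw for the real ones;

To track a second finger, duplicate the routeOSC branch with the second finger's address.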

 

The Pd file with the patches from the screenshots can be found here: https://github.com/aditisar/oobleck/blob/master/leappdtutorial.pd

My final project used ManosOSC to track a finger's direction and control the frequency and amplitude of a speaker. The code for that can be found here: https://github.com/aditisar/oobleck/blob/master/finalproject.pd

Final Project – Non-Newtonian Composition
https://courses.ideate.cmu.edu/16-223/f2014/final-project-non-newtonian-composition/
Wed, 10 Dec 2014

Group Members/Roles: Aditi Sarkar and Becca Epstein as Tutors, Integrators, Designers, and Scribes

Introduction

Our project explores the properties of non-Newtonian fluids, specifically oobleck. We built a small theater and manipulated its stage to see the different forms and motions this produced in the oobleck. Hand gestures vary the vibrations and act as the conductor of the performance. Ideally, a mechanism for adding color based on gesture would have been included as well; instead, color was added manually to trace the movement of the oobleck.

Video

Fabrication Notes

After experimenting with the oobleck at different frequencies, we found that it “danced” the most at very low frequencies (20-40Hz). We used a subwoofer speaker so that we could get the most movement output at these frequencies. We fastened a tin box lid to the mouth of the speaker with a single screw through the center of both. The outer structure was made with acrylic and wooden dowels, with a stocking stretched across the top to allow for colored powder to drift down.

Technical Notes

Our project used a subwoofer, an amplifier, and a Leap Motion sensor. We used gesture data from the Leap Motion, routed through PureData, to control the frequency and amplitude of the speaker, and a Leap Motion app called ManosOSC (https://apps.leapmotion.com/apps/manososc) to get xyz coordinates for each finger joint. Only a single finger controlled frequency and amplitude in this iteration, but ideally we would have richer control with more natural, conductor-like gestures. We also found a Leap Motion Pd external that calculates useful gesture data beyond spatial coordinates (http://puredatajapan.info/?page_id=1514), but didn't have enough time to explore it. Our Pure Data file can be found here: https://github.com/aditisar/oobleck.
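
As a rough illustration of that mapping (not the exact patch from the repo above), the sketch below scales an already-extracted x coordinate into the 20-40 Hz range that made the oobleck dance and uses a y coordinate as a 0-1 gain. The fingerx and fingery send names and the 0-400 / 0-300 input ranges are made up for this example; the real Leap coordinates are in millimeters and should be rescaled to whatever range your tracking actually produces.

#N canvas 0 22 540 380 10;
#X obj 30 30 r fingerx;
#X obj 30 60 clip 0 400;
#X obj 30 90 / 400;
#X obj 30 120 * 20;
#X obj 30 150 + 20;
#X obj 30 180 osc~ 30;
#X obj 30 270 dac~;
#X obj 260 30 r fingery;
#X obj 260 60 clip 0 300;
#X obj 260 90 / 300;
#X obj 30 230 *~ 0;
#X connect 0 0 1 0;
#X connect 1 0 2 0;
#X connect 2 0 3 0;
#X connect 3 0 4 0;
#X connect 4 0 5 0;
#X connect 5 0 10 0;
#X connect 7 0 8 0;
#X connect 8 0 9 0;
#X connect 9 0 10 1;
#X connect 10 0 6 0;
#X connect 10 0 6 1;
#X text 30 310 x maps to 20-40 Hz for the oscillator and y maps to 0-1 gain;
#X text 30 330 turn on DSP and keep the gain low when testing on a subwoofer;

A line~ ramp on the gain would avoid clicks when the hand moves quickly, but the bare multiply keeps the sketch short.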


Future Iterations

A future iteration of this project could include several speakers, with oobleck dancing from stage to stage and more complex gestures from the conductor. We would like to explore the full capabilities of the Leap Motion: in this version, only movement in the xy plane dictated frequency and amplitude. Although we read pinching and slamming gestures, we didn't get to map them to other parts of the performance. We would also like to add a controlled mechanism for releasing color into the oobleck.

Photos

[Images: gloop2, withCeiling, gloop, colors, aditi]

 

Final Project Sketch – Composition with Non-Newtonian Fluid
https://courses.ideate.cmu.edu/16-223/f2014/final-project-part-1-composition-with-non-newtonian-fluid/
Mon, 01 Dec 2014

Team Members: Aditi Sarkar and Becca Epstein

We are creating an aesthetically dynamic piece by exploring the material properties of oobleck, a non-Newtonian fluid made of cornstarch and water. We will use Pure Data, reacting to proximity sensors, to generate sound waves that move the oobleck on the speaker. When the pitch gets very high (meaning people have gathered around the piece and triggered the proximity sensors), a servo will squeeze drops of color into the rumbling oobleck. We are experimenting with translating the environment around our object into the behavior of this abstract piece.

Sketches:

[Images: sketch 1, sketch 2]

Autonomous Robot Part 3 – Drawing Bot
https://courses.ideate.cmu.edu/16-223/f2014/autonomous-robot-2c-drawing-bot/
Wed, 19 Nov 2014

Group Members/Roles: Aditi Sarkar, Claire Hentschker, and Alice Borie as Integrators, Designers, Tutors, and Scribes

Introduction

We wanted to create an autonomous drawing robot that altered its drawing in response to lines it had already created. To do so, we built a box on wheels with an arm that made marks on the table with a dry-erase marker. A sensor at the tip of the marker detected when a line had already been drawn in that area, and the robot temporarily reversed its direction to change the pattern being drawn.

Video

Technical Notes

Technical details: We used one gearhead motor, driven by a DRV8833 motor driver on the Arduino's PWM pins to control direction. We also used a QTR-1RC reflectance sensor to check whether the bot passes over a spot it has already drawn on. The Arduino code can be found here: https://github.com/aditisar/drawingbot/blob/master/drawingbot/drawingbot.ino

Fabrication details: We used Rhino to model the box and the arm. We used colorful acrylic, which gave the robot a toy-like appearance, and zip ties to hold the motor to the box. We used shaft collars on the hinge of the arm and screwed the arm onto the motor.


First iteration: We didn't have the sensor working or connected to anything. For our first demo, we just ran the motor back and forth to see how the robot would handle the moving arm. We ended up liking the Spirograph-like patterns it created and decided to keep them for the second iteration. One of the big problems with the first iteration was our inability to test it without taking it apart, so we made some changes to make testing easier in the second iteration.

Second iteration: We attached the sensor, added an on/off switch, and cut a hole for a mini-USB cable to plug into the Arduino. During the actual demo, we had battery issues that made the robot pause and restart at unpredictable intervals. We liked this child-like behavior and will work it into the final iteration.

Third iteration: We fixed our battery issue and refined the code for cleaner responses.

Schematic and Code

[Schematic image]

Pictures

[Images: db3, db2, db5, db4]

1A – Basic Circuits Project – Pay-Per-Speech, Freemium Speech
https://courses.ideate.cmu.edu/16-223/f2014/1a-basic-circuits-project-pay-per-speech-freemium-speech/
Wed, 10 Sep 2014

Group Members: Miles Peyton, Horace Hou, Rachel Ciaverella, Aditi Sarkar

Roles: Miles Peyton as Tutor, Horace Hou as Integrator, Rachel Ciaverella as Designer, Aditi Sarkar as Scribe

Introduction

We think speech is free in the same way that Facebook is free: speech is freemium. Along these lines, we created pay-per-speech, a freemium megaphone for activists. Its output volume is governed by the number of coins it receives, i.e. it amplifies the most when it is fed a high number of coins in a short period of time.

Pay-per-speech uses currency as input, and sound as output. There are two main components to the device: the coin acceptor and the megaphone. These devices are unified by a bulb and photocell. More precisely, the coin acceptor generates a voltage that powers a lamp, whose brightness is measured by a photocell that in turn controls the volume of the megaphone.

Technical Notes

The form of our project was partially predetermined by the shapes of the megaphone and the coin acceptor. We decided to make the coin acceptor into a heavy necklace meant to evoke a feeling of constraint, like shackles. The transparent wire tubing we used isn't very comfortable and cuts slightly into the user's neck when worn. We spray-painted the megaphone white and used a clear bag as the receptacle for coins to give the object a sterile, corporate feel.

We used two separate circuits: one handling the output from the coin acceptor, and one setting the volume level based on a photoresistor's reading. The coin acceptor outputs a voltage each time a coin is inserted. These voltages control the brightness of the bulb: every time a coin is added, the bulb dims. The bulb is read by a photoresistor, which we used to replace the existing 4-pin potentiometer that controlled the megaphone's volume.

Here is the circuit diagram for the coin acceptor and light bulb:

Coin acceptor circuit

And one for the photoresistor and megaphone:

 

Megaphone circuit schematic

Photos

 

[Image: free speech megaphone]

Megaphone

Megaphone and moneybag

The moneybag