For my first project, I used a Microsoft Kinect to track a performer's movement within a space and map the tracking data to the parameters of a granular synthesizer. At the end of the pipeline, the resulting sound was rendered as a Lissajous pattern and projected onto a wall behind the performance space.
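A Lissajous figure can be produced from a stereo signal by plotting one channel on the X axis against the other on Y (the classic oscilloscope X-Y mode). The sketch below is a minimal illustration of the underlying math, not the patch used in the piece; the frequency ratio and phase offset are arbitrary example values.

```java
// Illustrative sketch: a Lissajous curve traced by two sinusoids,
// x = sin(2*pi*fx*t) and y = sin(2*pi*fy*t + phase).
public class Lissajous {
    // Return one {x, y} point of the curve at time t (seconds).
    static double[] point(double freqX, double freqY, double phase, double t) {
        double x = Math.sin(2 * Math.PI * freqX * t);
        double y = Math.sin(2 * Math.PI * freqY * t + phase);
        return new double[] { x, y };
    }

    public static void main(String[] args) {
        // A 2:3 frequency ratio with a 90-degree phase offset
        // traces a closed figure; sample it at t = 0.
        double[] p = point(2.0, 3.0, Math.PI / 2, 0.0);
        System.out.printf("x=%.3f y=%.3f%n", p[0], p[1]);
    }
}
```

Sweeping t over one common period and drawing the resulting points yields the projected figure.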
To harvest the data from the Kinect, I used Processing, an open-source programming environment, which I programmed to collect the skeleton data from the Kinect and parse out only the X-Y coordinates of the left hand, right hand, and head. Then, using OSC (Open Sound Control), I sent this bundle of coordinates to Max/MSP.
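The filtering step can be sketched in plain Java (Processing is Java-based). This is not the original sketch: the joint names and the map-based data structure are assumptions for illustration, standing in for whatever the Kinect library reports.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch: given all tracked joints and their X-Y
// coordinates, keep only the three joints the piece uses before
// bundling them into an OSC message. Joint names are assumptions.
public class JointFilter {
    static final String[] WANTED = { "head", "leftHand", "rightHand" };

    static Map<String, float[]> filter(Map<String, float[]> joints) {
        Map<String, float[]> out = new LinkedHashMap<>();
        for (String name : WANTED) {
            if (joints.containsKey(name)) {
                out.put(name, joints.get(name));
            }
        }
        return out;
    }
}
```

Each retained X-Y pair would then be appended to an OSC message and sent to Max/MSP.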
Next, I scaled these values into ranges usable by the granular synthesis parameters.
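This kind of rescaling is a linear remapping, as in Processing's built-in map() function. The example below is a sketch of the idea; the pixel range and the grain-size range are illustrative assumptions, not the values used in the piece.

```java
// Illustrative sketch: linearly remap a raw Kinect coordinate into a
// range useful to a synthesis parameter (like Processing's map()).
public class Scale {
    static float map(float v, float inMin, float inMax,
                     float outMin, float outMax) {
        return outMin + (v - inMin) * (outMax - outMin) / (inMax - inMin);
    }

    public static void main(String[] args) {
        // e.g. a hand X position of 0..640 px driving a grain
        // size of 10..500 ms (example ranges only)
        System.out.println(Scale.map(320, 0, 640, 10, 500)); // 255.0
    }
}
```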
Originally, I started with a rather bland audio file to be processed; however, I soon realized that I would need a more complex sound file to really accentuate the differences in the granular synthesis. To that end, I used a recording of wind chimes, which proved to be much more textural and rewarding.
Below is a video of the piece in action, as well as the Max patch and Processing files.