I took Project 1 as an opportunity to explore controlling systems using mere hand gestures. I used a Leap Motion device to detect my hand gestures and movements and used them to control different aspects of granular synthesis of an audio signal. While one hand controlled the pitch rate, grain size, and speed of the synthesis, the motion of the other hand was used to choose the audio file on which the synthesis was performed.
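To make the mapping concrete, here is a minimal sketch of how one hand's position and grip could be scaled onto granular-synthesis parameters while the other hand's horizontal position picks a file. The actual project does this inside the Max patches described below; the field names, axis assignments, and parameter ranges here are illustrative assumptions, not the values used in the patch.

```python
# Hypothetical gesture-to-parameter mapping (assumed ranges, not the patch's actual values).

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = max(0.0, min(1.0, (value - in_lo) / (in_hi - in_lo)))
    return out_lo + t * (out_hi - out_lo)

def map_synth_hand(palm_height_mm, palm_depth_mm, grab_strength):
    """One hand drives the granular engine (assumed axis assignments)."""
    return {
        "pitch_rate": scale(palm_height_mm, 100, 400, 0.5, 2.0),    # height -> playback pitch rate
        "grain_size_ms": scale(grab_strength, 0.0, 1.0, 20, 400),   # open/closed hand -> grain length
        "speed": scale(palm_depth_mm, -150, 150, 0.25, 4.0),        # forward/back -> playback speed
    }

def map_file_hand(palm_x_mm, files):
    """The other hand's left/right position selects one of the loaded audio files."""
    index = int(scale(palm_x_mm, -200, 200, 0, len(files) - 0.001))
    return files[index]

if __name__ == "__main__":
    print(map_synth_hand(palm_height_mm=250, palm_depth_mm=40, grab_strength=0.3))
    print(map_file_hand(palm_x_mm=-120, files=["drums.wav", "voice.wav", "pad.wav"]))
```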
Here is a short video:
I created the main patch entirely from scratch, using a modified sugarSynth patch as a sub-patch along with the Leap Motion patches for collecting and routing the sensor data.
This is the gist of my main patch:
This is the modified sugarSynth patch:
I modified the visual and fingers sub-patches in the Leap Motion patch:
- Visual sub-patch
- Fingers sub-patch