For my final project I decided to explore more ways of using the Leap Motion sensor to control different elements of drawing. I made a game in which the tracked coordinates of a hand are translated into both the rotation and the resizing of a square, with the goal of matching a target square. When the squares match closely enough, the game moves on to a new target. I have attached a demo of me playing this.
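The actual game is a Max patch, but the core mapping can be sketched in Python. Everything here is illustrative: the coordinate ranges, tolerances, and function names are assumptions, standing in for the values read from the Leap Motion.

```python
# Hypothetical sketch of the matching logic, assuming the hand position has
# already been read from the Leap Motion: the hand's x coordinate maps to
# the square's rotation and its y coordinate maps to its size.

def hand_to_square(x, y, x_range=(-200.0, 200.0), y_range=(50.0, 400.0)):
    """Map raw hand coordinates to (rotation_degrees, side_length)."""
    # Normalize each coordinate to 0..1 within its expected range.
    nx = (x - x_range[0]) / (x_range[1] - x_range[0])
    ny = (y - y_range[0]) / (y_range[1] - y_range[0])
    rotation = nx * 90.0          # 0-90 degrees of rotation
    size = 20.0 + ny * 180.0      # side length between 20 and 200 px
    return rotation, size

def matches(square, target, rot_tol=5.0, size_tol=10.0):
    """True when the player's square is close enough to the target."""
    rot, size = square
    t_rot, t_size = target
    return abs(rot - t_rot) <= rot_tol and abs(size - t_size) <= size_tol
```

When `matches` returns true, the game would pick a new random target square and repeat.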
I was also very interested in learning more about different uses for the machine learning patch. I trained the Leap Motion to detect three different hand gestures: a palm facing down, a fist, and a "C" for camera. As shown in the demo below, when I make a "C" with my hand, I am able to take pictures with the camera. I can then use my left hand to distort the image taken. This distortion was influenced by this tutorial.
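The gesture training itself happened inside the Max machine-learning patch, but the underlying idea can be sketched with a simple nearest-centroid classifier. The feature vector here (palm pitch, palm roll, grab strength) is an assumption, not the actual features the patch used.

```python
import math

# Illustrative sketch (not the actual Max patch): classify hand poses by
# nearest centroid. Each gesture example is a small feature vector derived
# from Leap Motion data, e.g. (palm pitch, palm roll, grab strength).

def centroid(examples):
    """Average a list of equal-length feature vectors."""
    n = len(examples)
    return tuple(sum(v[i] for v in examples) / n for i in range(len(examples[0])))

def train(labeled_examples):
    """labeled_examples: {label: [feature_vector, ...]} -> {label: centroid}."""
    return {label: centroid(vs) for label, vs in labeled_examples.items()}

def classify(model, features):
    """Return the label whose centroid is closest to the input features."""
    return min(model, key=lambda label: math.dist(model[label], features))
```

In the real patch, the recognized label would then trigger the camera (for "C") or switch into distortion mode.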
Here is a link to all the final files I used for this project including the machine learning data and model for convenience. I also have included all the gists below.
For project 1, I used a Kinect to control the motion of a particle system using my hand. I am very interested in different applications of motion tracking and I think this was a good introduction to help me learn how the Kinect works. Here is a download link to a video showing my project in real time: IMG_1479
I used this tutorial to help me create particles from an image that I would then control using input from the Kinect. To read data from the Kinect I used the output of the dp.kinect2 object, which took me a while to set up initially. I originally wanted the system to use the Kinect's real-time camera feed as the source image, but that did not work quite like I wanted, so I stuck with one preset image.
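The particle behavior can be sketched outside of Max. In this hedged sketch, a plain `(x, y)` tuple stands in for the hand position coming out of dp.kinect2, and the force and spring constants are made-up illustrative values.

```python
# Rough sketch of the particle idea, not the actual Jitter patch: each
# particle starts at a pixel of the source image and is pushed away from a
# tracked hand position, then springs back toward its home pixel.

class Particle:
    def __init__(self, home):
        self.home = home              # original pixel position (x, y)
        self.pos = list(home)

    def update(self, hand, repel_radius=50.0, repel_force=5.0, spring=0.05):
        dx = self.pos[0] - hand[0]
        dy = self.pos[1] - hand[1]
        d2 = dx * dx + dy * dy
        if 0 < d2 < repel_radius ** 2:
            # Push directly away from the hand.
            d = d2 ** 0.5
            self.pos[0] += repel_force * dx / d
            self.pos[1] += repel_force * dy / d
        # Spring back toward the particle's home pixel.
        self.pos[0] += spring * (self.home[0] - self.pos[0])
        self.pos[1] += spring * (self.home[1] - self.pos[1])
```

One particle would be created per sampled pixel of the image, each keeping that pixel's color.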
Here is the gist for my code:
I used the pfft~ object to add spectral effects to an audio file based on a note clicked on the kslider. I then used this tutorial as a template to create visuals for the audio.
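Conceptually, the subpatch inside pfft~ transforms one FFT frame at a time. A hedged Python sketch of one such per-frame effect (the bandwidth and the Gaussian gain curve are my own illustrative choices, not what the patch necessarily did) might look like:

```python
import numpy as np

# Conceptual sketch of a per-frame spectral effect: attenuate bins far from
# the frequency of the clicked note, leaving a resonant band around it.

def spectral_band_effect(frame, note_hz, sample_rate=44100, width_hz=200.0):
    """Keep energy near note_hz and damp everything else in one frame."""
    spectrum = np.fft.rfft(frame)
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    # Gaussian gain centered on the note's frequency.
    gain = np.exp(-((freqs - note_hz) ** 2) / (2.0 * width_hz ** 2))
    return np.fft.irfft(spectrum * gain, n=len(frame))
```

A tone at the chosen note passes nearly unchanged, while content far from the note is strongly damped.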
Here is a link to view my video describing the project on Box.
Here is my gist:
I have two ideas for projects that I am interested in.
The first involves analyzing the frequencies and rhythms of an audio file to drive the ceiling lights in the Media Room with effects that correspond to the tempo and other elements of the audio.
After seeing the video in class last week of a saxophone sound being constructed, I also became interested in learning how to use Fourier transforms to reconstruct the sounds of different musical instruments. I am not sure exactly how this would work, so maybe the first idea would be more doable for Project 1.
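One simple way the reconstruction idea could work, sketched here as an assumption rather than a plan: take the FFT of a recorded note, keep only its strongest partials, and rebuild the sound as a sum of sinusoids.

```python
import numpy as np

# Hedged sketch of additive resynthesis: rebuild a signal from the n
# strongest peaks of its spectrum. All parameter values are illustrative.

def resynthesize(signal, sample_rate, n_partials=8):
    """Rebuild a signal from its n strongest spectral peaks."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    # Pick the bins with the largest magnitude.
    strongest = np.argsort(np.abs(spectrum))[-n_partials:]
    t = np.arange(len(signal)) / sample_rate
    out = np.zeros(len(signal))
    for k in strongest:
        amp = 2.0 * np.abs(spectrum[k]) / len(signal)
        phase = np.angle(spectrum[k])
        out += amp * np.cos(2 * np.pi * freqs[k] * t + phase)
    return out
```

For a real instrument, the interesting part would be how few partials you can keep while still recognizing the timbre.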
I used the concept of time shifting by delaying a sine wave to demonstrate noise cancellation properties. By changing the number of samples by which the wave is delayed, we can shift it by exactly half a period, so that the shifted wave added to the original sums to zero. My code and a demonstration of the patch are shown below.
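The idea is easy to verify numerically. This is a minimal sketch, assuming a pure sine sampled so that half a period is a whole number of samples; it is not the patch itself.

```python
import math

# Delay a sine wave by half a period and add it to the original:
# sin(x) + sin(x - pi) = 0, so the sum cancels to (near) silence.

def sine(n_samples, period):
    return [math.sin(2 * math.pi * i / period) for i in range(n_samples)]

def delayed_sum(wave, delay):
    """Add the wave to a copy of itself delayed by `delay` samples."""
    return [wave[i] + wave[i - delay] for i in range(delay, len(wave))]

period = 100                                  # samples per cycle
wave = sine(1000, period)
cancelled = delayed_sum(wave, period // 2)    # half-period delay: silence
doubled = delayed_sum(wave, period)           # full-period delay: 2x amplitude
```

The contrast between the two delays shows why the exact sample count matters: a full-period delay reinforces the wave instead of cancelling it.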
I used two phones to distort an image I took a couple of years ago. Using one phone, I took a picture of the image displayed on the second phone. Then I used AirDrop to quickly transfer the file and display the new image on the second phone, and repeated the process; after 50 iterations the image had degraded to mostly one color.