This project was an exploration of how the Kinect might be used to map pre-rendered and generative audio-responsive projections onto the faces of instruments.
The patch uses adjustable maximum and minimum thresholds on the Kinect depth map to isolate the desired object/projection surface, then uses the resulting depth map as a mask, forcing the video content to show through only in the shape of the isolated surface.
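The thresholding-and-masking step can be sketched in Python. This is a hypothetical analogue (the actual patch does this with Jitter matrices), and the names and depth values are illustrative:

```python
# Keep only depth pixels between the two adjustable thresholds,
# then use the result as a mask on the video frame.

def depth_mask(depth, near, far):
    """Return a binary mask: 1 where the depth value falls inside the window."""
    return [[1 if near <= d <= far else 0 for d in row] for row in depth]

def apply_mask(video, mask):
    """Let video pixels show through only where the mask is 1."""
    return [[v * m for v, m in zip(vrow, mrow)]
            for vrow, mrow in zip(video, mask)]

depth = [[500, 900, 1400],
         [850, 880, 2000]]            # per-pixel depth in mm (made up)
mask = depth_mask(depth, 800, 1000)   # isolate the instrument's depth window
out = apply_mask([[9, 9, 9], [9, 9, 9]], mask)
```

Adjusting `near` and `far` in real time is what lets the patch lock onto one surface while everything nearer or farther drops out of the mask.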
The patch is less precise on instruments that expose a large part of the performer's body, since, for example, a guitarist's legs tend to occupy the same depth range as the face of the guitar. It is better suited to instruments with larger faces that obscure most of the body, such as cellos and upright basses.
While attempting to map to the surface of a guitar, I also toyed with other uses for the patch, including this animated depth shadow, which projects a video-mapped shadow of the performer onto the wall and creates the potential for visual duets between any number of performers and mediated versions of their bodies.
I plan to continue exploring how to make this patch more precise on a variety of instruments, possibly by pairing this existing process with computer vision, motion tracking, and/or IR sensor elements.
For my Project 1, I decided to try to make an instrument out of my computer. I separated the keys into distinct regions and assigned them MIDI values based on where they sit on the keyboard. I then used these note values in different modes to produce different sounds. I also included a boomerang effect that lets the user record a short piece of audio, which the patch then loops and plays repeatedly. I created ten drum sound effects by filtering noise in different ways. The main instrument mode is a square wave filtered in a similar way to make the notes sound less harsh. The last mode is a sawtooth tremolo that repeatedly plays the same note as long as the key is held. The launchpad is polyphonic and can export the sound in the loop buffer.
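The region-to-note mapping could be sketched like this. The row layout, base note, and octave spacing here are assumptions for illustration, not the patch's actual values:

```python
# Split the computer keyboard into row regions and assign MIDI note
# numbers by region and position within the row.

ROWS = ["zxcvbnm", "asdfghjkl", "qwertyuiop"]  # bottom row = lowest region
BASE_NOTE = 48                                  # MIDI C3 for the bottom row

def key_to_midi(key):
    """Map a character key to a MIDI note by its region and position."""
    for row_index, row in enumerate(ROWS):
        if key in row:
            return BASE_NOTE + 12 * row_index + row.index(key)
    return None  # key falls outside the mapped regions

print(key_to_midi("z"))  # 48
print(key_to_midi("a"))  # 60
```

Each mode (drums, square wave, tremolo) would then interpret the same note number differently, which matches the multi-mode design described above.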
A short example piece that has been layered three times
For project 1, I used a Kinect to control the motion of a particle system using my hand. I am very interested in different applications of motion tracking and I think this was a good introduction to help me learn how the Kinect works. Here is a download link to a video showing my project in real time: IMG_1479
I used this tutorial to help me create particles from an image that I could then control using input from the Kinect. To read data from the Kinect, I used the output of the dp.kinect2 object, which took me a while to set up initially. I wanted the system to use the Kinect's real-time camera feed as the source image, but that did not end up working quite like I wanted, so I stuck with one preset image.
This patch analyzes audio in three ways and represents the information through an LED light pattern.
The 30 LEDs are grouped into three layers: an inner layer of two lights, a middle ring of 10 lights, and an outer ring of 18 lights. Changes in color or brightness move from the inner layer to the outer layer, so the light appears to propagate outward.
The audio amplitude controls the brightness (saturation) of the colors. The ratio of low-frequency to high-frequency energy controls a color picker, which determines the RGB values. The values are then sent to the three layers with different amounts of delay.
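The three mappings (amplitude to brightness, frequency ratio to color, per-layer delay for outward propagation) can be sketched as follows. The layer sizes come from the description above; the delay times and the ratio-to-hue mapping are illustrative assumptions:

```python
import colorsys

LAYER_SIZES = [2, 10, 18]       # inner, middle, outer (30 LEDs total)
LAYER_DELAYS = [0.0, 0.1, 0.2]  # seconds before each layer updates (assumed)

def led_frame(amplitude, low_energy, high_energy):
    """Return (delay, [RGB triples]) for each layer of the ring."""
    # Low/high energy ratio picks the hue; amplitude sets the brightness.
    ratio = low_energy / (low_energy + high_energy + 1e-9)
    r, g, b = colorsys.hsv_to_rgb(ratio, 1.0, min(amplitude, 1.0))
    rgb = (int(r * 255), int(g * 255), int(b * 255))
    # Same color to every layer, staggered so it propagates outward.
    return [(delay, [rgb] * size)
            for delay, size in zip(LAYER_DELAYS, LAYER_SIZES)]
```

The staggered delays are what produce the inner-to-outer ripple: every layer eventually shows the same frame, just later.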
As I make a lot of videos and short films in my free time, anything related to processing video excites me, so I really wanted to learn how to use the computer vision objects available in Max. For this project I used the cv.jit.faces object to alter a face in a movie by either blurring it or placing a virtual spotlight on it. First, I downscale the image to 1/5th of its original size, convert it to greyscale, and run it through the cv.jit.faces object. I use the output matrix to determine the position of each face and accordingly place a blurred image with an alpha layer that I made on top of the face, or add a spotlight. I hope you like my project!
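The blur step can be sketched in Python as a box blur applied only inside a detected face's bounding box. This is a hypothetical analogue (the actual patch composites a pre-blurred image via Jitter), and the frame and rectangle values are made up:

```python
# Box-blur just the rectangle (x, y, w, h) of a 2D greyscale frame,
# leaving the rest of the image untouched.

def blur_region(frame, x, y, w, h, radius=1):
    """Return a copy of frame with the given rectangle box-blurred."""
    out = [row[:] for row in frame]
    for j in range(y, y + h):
        for i in range(x, x + w):
            # Average the pixel with its neighbours, clamped at the edges.
            neighbours = [frame[jj][ii]
                          for jj in range(max(0, j - radius),
                                          min(len(frame), j + radius + 1))
                          for ii in range(max(0, i - radius),
                                          min(len(frame[0]), i + radius + 1))]
            out[j][i] = sum(neighbours) // len(neighbours)
    return out
```

Since detection runs on a 1/5-scale image, the reported coordinates would need to be multiplied back up by 5 before blurring the full-resolution frame.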
This project lets a person use their hand, via the Kinect, to move balls around in a virtual ball pit. Much of the patch is built on the dp.kinect2 reference patches and on a physics example from https://cycling74.com/tutorials/00-physics-patch-a-day, integrating the two into a Kinect system that uses the closest player's right hand to move the main movable physics force. Most of the work involved figuring out good bounding boxes in the physical world and the virtual world, and how the user would actually interact with the Kinect (I wanted the output animation to be very obviously user controlled, almost painfully so). I also had some fun changing the aesthetics of the ball system.
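The bounding-box work amounts to a linear remap from the hand's position in a physical box to the virtual window the physics force lives in. A minimal sketch, with all ranges illustrative rather than the patch's actual values:

```python
def remap(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly remap value from one range to another, clamped."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))  # clamp so the force stays on screen
    return out_lo + t * (out_hi - out_lo)

# A right-hand x of -0.3 m inside an assumed [-0.6, 0.6] m physical box
# lands a quarter of the way across a [0, 1] virtual window.
x = remap(-0.3, -0.6, 0.6, 0.0, 1.0)
```

Clamping is one way to keep the interaction "obviously user controlled": the force never leaves the window even when the hand leaves the tracked box.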
Video of the system working:
Gist of code: https://gist.github.com/anonymous/9ddab8deb04b40090d8efeb8cd0b5f06
For assignment 4, I decided to use the fft~ object to create an audio effect patch that would mimic the sound of a recent favorite genre of mine, vaporwave.
I created a noise reduction subpatch. A degrading FFT subpatch is also linked to the output, playing along with the other signal. After these two subpatches, the audio signal runs through the original patch, where it is stretched out (slowed down) in real time using a delay effect.
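The slowdown at the heart of the vaporwave sound can be sketched as reading through the audio buffer at a rate below 1.0 with linear interpolation, the tape-style drop in both pitch and tempo. This is a hypothetical Python analogue of the effect, not the patch's delay-based implementation, and the rate is illustrative:

```python
def slow_down(samples, rate=0.8):
    """Resample at the given read rate; rate < 1.0 slows and pitches down."""
    out = []
    pos = 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        # Linear interpolation between neighbouring samples.
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        pos += rate
    return out

stretched = slow_down([0.0, 1.0, 0.0, -1.0], rate=0.5)  # twice as long
```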
I also added a simple visual presentation, very similar to the one we made in class. A Japanese city pop song was used in the demonstration to achieve that 'vaporwave aesthetic'.
Another demo: https://soundcloud.com/thewx/assignment4-demo/s-K8iMc
This project is based on what we did in class: visualizing audio using pfft~. I parsed the pfft~ output into four bands by bin index. Each band covers a range of frequencies (low, mid-low, mid-high, and high, respectively), represented by blue, green, pink, and white. This video shows the visualization of Morton Gould's Interplay: IV. Very Fast, With Verve and Gusto, a piece with very beautiful orchestration. The piano, which dominates most of the piece, lies in the green and blue bands, while woodwinds, brass, and percussion occasionally pop up in pink and white. I also tried running the patch with pop music, which seems to have a larger pink/white presence.
The video quality seems to be embarrassingly bad… I will work on it next time…
The patch is similar to the one from class, with more matrices added in the main patch and band filters in the pfft~ subpatch. Currently the bin filters are hard-coded; I'll see if I can improve the model to adjust them on the fly.
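The bin-index band split works because each FFT bin corresponds to a fixed frequency: bin k covers roughly k · sample_rate / FFT_size Hz. A sketch of the bucketing, where the FFT size, sample rate, and band edges are illustrative stand-ins for the patch's hard-coded values:

```python
FFT_SIZE = 1024
SAMPLE_RATE = 44100
BAND_EDGES = [250, 1000, 4000]  # Hz boundaries: low / mid-low / mid-high / high

def bin_to_band(bin_index):
    """Return 0..3 (low .. high) for a pfft~ bin index."""
    freq = bin_index * SAMPLE_RATE / FFT_SIZE  # center frequency of the bin
    for band, edge in enumerate(BAND_EDGES):
        if freq < edge:
            return band
    return 3

print(bin_to_band(2))    # ~86 Hz  -> band 0 (blue)
print(bin_to_band(100))  # ~4307 Hz -> band 3 (white)
```

Making `BAND_EDGES` a runtime parameter rather than a constant is one route to adjusting the bin filters on the fly.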
This assignment had us look at signal processing utilizing the pfft~ object.
Personally, I wanted to do something interesting with the amplitude and phase data that the pfft~ object provides that we had not yet tried. The end result was a fairly straightforward use of signal information to create a reactive video, similar to what we achieved in class, in which the amplitude and phase act as an external effect on a noise matrix.
I chose noise because I found it easier to produce a particle effect using the points draw mode of the jit.gl.mesh object.
The patch takes the amplitude and phase data from the incoming audio (in this case, from a microphone) and captures each as a number value, using snapshot~ rather than poltocar~ to convert the signal information after some minor processing. The two numbers then serve as parameters defining the location of an attracting force on the particles, causing them to move around the screen.
I also added a more extreme set of attraction forces, which use the amplitude and phase information to govern how strongly the particles are attracted to or repelled from the center of the video window. When turned on, the particles become more erratic because of the constantly changing values, which limits its applications, but I like it as a way to instantly intensify the drama of the visual effect.
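The core mechanism (two audio-derived numbers placing an attractor that particles chase) can be sketched as follows. The window size, scaling, and attraction strength are illustrative assumptions, not values from the patch:

```python
import math

def attractor_from_audio(amplitude, phase, width=1.0, height=1.0):
    """Place the attracting force from the two captured signal values."""
    x = min(amplitude, 1.0) * width               # amplitude -> horizontal
    y = (phase % (2 * math.pi)) / (2 * math.pi) * height  # phase -> vertical
    return (x, y)

def step(particles, target, strength=0.05):
    """Move each (x, y) particle a fraction of the way toward the target."""
    tx, ty = target
    return [(x + strength * (tx - x), y + strength * (ty - y))
            for x, y in particles]
```

Because the attractor jumps every frame as the audio changes, the particles never settle, which is what produces the erratic, drama-intensifying motion described above.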
I am interested in developing this patch further with a set of filters and gates to create a combination audio/visual instrument. I would also like to refine the way in which the particles are acted on by different forces to create a more fine-tuned reactive effect.