Project 2: Sound Artifact
https://courses.ideate.cmu.edu/18-090/f2018/2018/12/12/project-2-sound-artifact/

For my final project I wanted to create an object that makes different noises depending on its position and handling, while taking it away from the realm of functional instruments.

To do this, I used capacitive touch sensing: I wired up an Arduino and sent data over serial to a Max patch that turned the input into sound.

Video of use:

Other sound:

Construction:


I constructed the form out of polymer clay, which I sculpted to fit my hand, with a magnet that holds it together while still allowing it to be opened. I wanted to stay away from rapid prototyping techniques such as 3D printing and laser cutting because I felt they would take away from the organic result I was trying to achieve and place the piece in a context I wasn't interested in.


I created the circuit by soldering resistors and wires onto pins that fit easily into the breadboard on the Arduino. The Arduino is removable, but the wires and copper tape are built into the shape.


Max patch:


download

Arduino Code:

#include <CapacitiveSensor.h>

CapacitiveSensor cs_4_2 = CapacitiveSensor(4,2);
CapacitiveSensor cs_4_3 = CapacitiveSensor(4,3);
CapacitiveSensor cs_4_5 = CapacitiveSensor(4,5);
CapacitiveSensor cs_4_6 = CapacitiveSensor(4,6);
CapacitiveSensor cs_4_7 = CapacitiveSensor(4,7);
CapacitiveSensor cs_4_8 = CapacitiveSensor(4,8);
CapacitiveSensor cs_4_9 = CapacitiveSensor(4,9);
CapacitiveSensor cs_4_10 = CapacitiveSensor(4,10);
CapacitiveSensor cs_4_11 = CapacitiveSensor(4,11);
CapacitiveSensor cs_4_12 = CapacitiveSensor(4,12);

void setup(){

cs_4_2.set_CS_AutocaL_Millis(0xFFFFFFFF); // turn off autocalibrate on channel 1 - just as an example
cs_4_3.set_CS_AutocaL_Millis(0xFFFFFFFF);
cs_4_5.set_CS_AutocaL_Millis(0xFFFFFFFF);
cs_4_6.set_CS_AutocaL_Millis(0xFFFFFFFF);
cs_4_7.set_CS_AutocaL_Millis(0xFFFFFFFF);
cs_4_8.set_CS_AutocaL_Millis(0xFFFFFFFF);
cs_4_9.set_CS_AutocaL_Millis(0xFFFFFFFF);
cs_4_10.set_CS_AutocaL_Millis(0xFFFFFFFF);
cs_4_11.set_CS_AutocaL_Millis(0xFFFFFFFF);
cs_4_12.set_CS_AutocaL_Millis(0xFFFFFFFF);

Serial.begin(9600);
}

void loop(){

long start = millis();

long total2 = cs_4_2.capacitiveSensor(1);
long total3 = cs_4_3.capacitiveSensor(1);
long total5 = cs_4_5.capacitiveSensor(1);
long total6 = cs_4_6.capacitiveSensor(1);
long total7 = cs_4_7.capacitiveSensor(1);
long total8 = cs_4_8.capacitiveSensor(1);
long total9 = cs_4_9.capacitiveSensor(1);
long total10 = cs_4_10.capacitiveSensor(1);
long total11 = cs_4_11.capacitiveSensor(1);
long total12 = cs_4_12.capacitiveSensor(1);

//long total9 = cs_8_9.capacitiveSensor(1);

//Serial.print(millis() - start); // check on performance in milliseconds

Serial.write(99); // two-byte header (99, 98) marks the start of each frame for the receiving Max patch
Serial.write(98);
printLimit(total2, 100, 2);
printLimit(total3, 100, 3);
printLimit(total5, 100, 5);
printLimit(total6, 100, 6);
printLimit(total7, 100, 7);
printLimit(total8, 100, 8);
printLimit(total9, 100, 9);
printLimit(total10, 100, 10);
printLimit(total11, 100, 11);
printLimit(total12, 100, 12);

//delay(5); // arbitrary delay to limit data to serial port

}

// write the sensor value as a single byte if it clears the threshold, otherwise write 0
// (the 'yes' pin-number argument is currently unused)
void printLimit(long val, int thresh, int yes){
if (val > thresh){
Serial.write(val);
} else {
Serial.write(0);
}
}
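
In the finished piece, the Max patch reads this stream through its serial object, but as an illustration of the framing, here is a sketch of how a receiver could resynchronize on the 99/98 header in Python (assuming the pyserial package; the port name is hypothetical). Note that Arduino's Serial.write sends only the low byte of each long, so every sensor reading arrives as a single byte:

import serial  # pyserial, an assumption; the project itself uses Max's serial object

PORT = "/dev/tty.usbmodem1411"  # hypothetical port name
SENSOR_PINS = [2, 3, 5, 6, 7, 8, 9, 10, 11, 12]

with serial.Serial(PORT, 9600) as ser:
    while True:
        # resynchronize on the two-byte frame header (99, 98)
        if ser.read()[0] != 99 or ser.read()[0] != 98:
            continue
        # one byte per sensor, in pin order; 0 means "below threshold"
        values = list(ser.read(len(SENSOR_PINS)))
        print(dict(zip(SENSOR_PINS, values)))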

download


[Will refine documentation on Friday.]

Project 2 – Help Wanted
https://courses.ideate.cmu.edu/18-090/f2018/2018/12/10/project-2-help-wanted/

The idea behind my project was to make a system that could visualize or represent posts on the subreddit /r/relationships. These posts are often emotionally charged and range from general and universal to highly specific. Keeping in mind that most people post there hoping to solve issues in their personal lives, my conception was to use Python to generate a series of sentiment analysis scores from the posts. Those scores would then be used to modulate different parameters on instruments to produce a sound that dynamically evolves with the emotional content of each post. Overall I achieved my goal, and while the sound is not quite as melodious as I might have hoped, I like to think that reflects just how emotionally charged and ambiguous most relationships can be.

The project was split into two components: a Python pipeline that received new posts from /r/relationships and updated two big CSV files, and a system of Max patches that generated audio based on input received from those files. I'll go into my process for both below:

Python Pipeline:

Running posts through the pipeline generates two large space-delimited files, one of scores and one of title features. Each post has six scores: the five highest-magnitude sentiment scores, corresponding to what was interpreted as the five words with the most emotion behind them, and the overall average sentiment of the entire text as the sixth. (Format: 1st highest score, 2nd highest score, and so forth, followed by the overall score.) Title extraction was based on a rule of /r/relationships, which specifies that every post must mention the gender and age of both the poster and any people referred to within the post. Even with this in mind, this part was messy. Not all posts involve just two people, but after analysis around 83% of them did, so to avoid input inconsistency later in Max, I removed posts that didn't fit this two-person pattern. Even then, there was inconsistent formatting (not everyone follows subreddit rules) that my pipeline couldn't parse properly, so when values seemed malformed I substituted a default value, the tuple (30,-1,30,-1). (Format: Age,Gender,Age,Gender, where Male is represented by -1 and Female by -2.)
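
The scoring code itself isn't reproduced in this post; as a minimal sketch of the idea, assuming nltk with the vader_lexicon downloaded (the function and variable names here are mine, not the pipeline's):

from nltk.sentiment.vader import SentimentIntensityAnalyzer

sia = SentimentIntensityAnalyzer()

def score_post(text, n=5):
    # per-word compound scores; keep the n largest by magnitude
    word_scores = [sia.polarity_scores(word)["compound"] for word in text.split()]
    top = sorted(word_scores, key=abs, reverse=True)[:n]
    # a stand-in for the overall sentiment of the whole text
    overall = sia.polarity_scores(text)["compound"]
    return top + [overall]  # format: n highest-magnitude scores, then the overall score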

A side note: the choice to keep only posts between two people led to a bit of a change in focus; while not all of the initial posts were about relationships with significant others, a good number of the two-person posts were. I decided to incorporate a piano rendition of the main melody from Fatima Yamaha's "What's a Girl to Do", as the track feels like it's about the tension and ambiguity inherent in relationships, something powerful that I wanted to try to capture.

There are four files in the Python zip. Before running any of them, you must set up your own Reddit account and OAuth permissions and install some Python modules: PRAW (Reddit's API wrapper), numpy, and nltk's VADER module (glob ships with the standard library). You may have to install the whole nltk package, but it's a lot of stuff, so I would try installing only VADER first and install all of it if there are issues. To set up an OAuth token for Reddit, follow the tutorial here: https://praw.readthedocs.io/en/latest/getting_started/authentication.html
1) testq.py: go in and edit the four values in snag() to set up your own bot. Run it in a terminal with the destinations for three output files (titles, text, url) to output a stream of the most recent posts, broken into titles, text, and URLs, into those three files. For reference, my command looks something like this:

python3 /Users/lawrencehan/PycharmProjects/reddit/testq.py /Users/lawrencehan/Desktop/project_one/titles.txt /Users/lawrencehan/Desktop/project_one/text.txt /Users/lawrencehan/Desktop/project_one/url.txt

2) senti.py: next, take the files you've just output and run them through senti.py. It should output three files: one for the titles, and two for the 5 scores and the overall score, respectively.

I ran these first two files several times, before running the latter two, while deciding how exactly I would get all the data into Max, which meant a lot of output split across different files.

3) cleanup.py: this outputs two CSVs containing the title representations and scores from all the files written within the directory. The second zip attached to this post contains all the non-Python files, with directories already generated for titles and scores; I recommend doing all of this processing within that folder. When you run the previous two Python files you'll get three outputs, which this step consolidates into the files you will always write to when updating with more scores.

4) final.py: Max has a problem recognizing commas in text input. That, along with formatting issues, meant this file was needed in the pipeline as well. It converts the CSV files into regular space-delimited files and ensures that the last number of each line is not rendered as a string, both issues that strongly affect how Max reads these files.
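
A minimal sketch of that conversion, with hypothetical file names (the real final.py may differ):

import csv

with open("allscores.csv") as src, open("allscores.txt", "w") as dst:
    for row in csv.reader(src):
        # space-delimit the fields and strip stray quoting so Max reads
        # the last number on each line as a number, not a symbol
        dst.write(" ".join(field.strip().strip('"') for field in row) + "\n")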

After running each step of the pipeline, you should have two files that look something like this:

Text editor view:

Excel/Numbers view:

Max/MSP:

Before beginning this process, I knew I wanted to incorporate the Yamaha track's melody in some way. Because most sounds generated in Max sound pretty synthetic, I decided to render the melody on piano to provide a counterpoint of sorts. I used a kick drum as my main percussive element, along with a low-end bass drone and a high-pitched, reedy drone for higher frequency content. I settled on these elements because they are relatively static, which makes the effects of modulation easier to hear. After building these instruments, I connected them all within a main patch. Each instrument takes in some combination of scores and age/gender information, which is used to modulate interior parameters like filter resonance and distortion level. Much of this score/gender information is ramped as it updates from one value to the next, which allows smoother transitions and avoids "clicks" from rapid changes.
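
In Max this kind of ramp is typically built with the line objects; as a language-agnostic sketch of the idea in Python:

def ramp(old, new, steps):
    # yield intermediate values so a parameter glides instead of jumping,
    # which is what avoids the audible clicks
    for i in range(1, steps + 1):
        yield old + (new - old) * i / steps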

When opening the Max patch, load in alltitles, allscores, and the Fatima piano loop, in that order.

The output is very noisy and chaotic; here’s a sample of it below:


And here are the files you need to make this work:

reddit

project_two 2

Project 2 – You're a Twinkle Star!
https://courses.ideate.cmu.edu/18-090/f2018/2018/12/10/project-2-youre-a-twinkle-star/

For this project, Alec and I decided to merge our efforts on audiovisualization and pitch correction. We realized that the pitch correction system Alec created could pinpoint a user's pitch, and that value could be used to manipulate variables within the visualization, so we decided to make a Rock Band-style game where the player sings along to none other than Twinkle Twinkle Little Star.

My contribution to the project was primarily on the visual end. I took in the variables that Alec’s patcher gave me and represented them using jit.gl.sketch and jit.gl.text within a js object. In addition to the point cloud that expands whenever the player sings, I modified the particle system to change the hue of the particles in correspondence with the note sung by the player. At the bottom of the screen, I added a player cursor – which has its y-position determined by the note sung by the player and a fixed-length tail that shows the past sung notes – and a scrolling bar of upcoming notes in the song. I then added a score counter and a method of state-switching between gameplay and game over screens.

This Drive folder has my contributions, including all of the JavaScript class files,

and this Drive folder holds all the files for our project as a whole.

Here’s a gist for the visualization patcher, although it won’t be of much use without the js files:


Project 2 – Twinkle Twinkle Little Rockstar
https://courses.ideate.cmu.edu/18-090/f2018/2018/12/10/project-2-twinkle-twinkle-little-rockstar/

For this project, Bo and I decided to merge aspects of our Project 1 assignments, mine being an autotuner patch and his an audiovisualizer, to create a Rock Band-esque game that tracks what note you're singing and scores you based on whether the autotuner is correcting you to the right pitch in Twinkle Twinkle Little Star.

My personal contribution to this project consisted of supplying Bo's audiovisualizer with the necessary info about the game logic (markers, current/upcoming notes, color values to display, etc.) that can vary based on how the player is doing within the game. I also created an easy and a hard mode: easy mode corrects you only to notes in the major scale of the key you're playing in, while hard mode corrects you to a chromatic scale no matter what key you're in, making your margin of error slightly larger. As mentioned, the game can be played in every major key, so users with different vocal ranges can play.
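
As a sketch of the pitch-snapping idea behind the two modes (this is my illustration in Python, not the patch's exact logic):

MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of a major scale
CHROMATIC = list(range(12))     # hard mode: every semitone is a legal target

def snap(midi_note, key_root=0, scale=MAJOR):
    # choose the nearest MIDI note whose pitch class belongs to the scale
    candidates = [midi_note + d for d in range(-6, 7)
                  if (midi_note + d - key_root) % 12 in scale]
    return min(candidates, key=lambda n: abs(n - midi_note))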

In addition, all of the audio heard during the game comes from my patch, be it the piano playing the MIDI notes along with you or the autotuned signal. The autotuned signal lives in the "inputs" encapsulation, which also includes the game controller bpatcher I created; within "inputs" is the "twinkle" encapsulation, which processes and outputs all of the variables specific to Twinkle Twinkle Little Star in a form that Bo's patch can work with and draw from.

When it comes to game logic, we score based on how many beats you sing correctly (including however many the autotuner helps you get), and the score is displayed on the game controller. You can then start a new game in whichever key you select. Regardless of the key, there is a four-beat lead-in playing the first note at tempo so you can get ready before the round begins.
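
The scoring reduces to a per-beat comparison; a sketch with hypothetical note lists:

def score_round(sung_notes, target_notes):
    # one entry per beat; a beat counts if the (possibly autotuned) note
    # matches the target for that beat
    return sum(1 for sung, target in zip(sung_notes, target_notes) if sung == target)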

To demonstrate, I put myself through the wringer on easy mode:

This folder includes my contributions:

https://drive.google.com/drive/folders/1Fx1sD51GNGYjVY1dpQXl9M3f8qIQ4ZKS?usp=sharing

This folder contains the overall project:

https://drive.google.com/drive/folders/1LLSwpto8pGiUxBWq3klx1izvc2katNzs?usp=sharing

And here is my code:

Project 2: Sound Spatialization
https://courses.ideate.cmu.edu/18-090/f2018/2018/12/10/project-2-sound-spatialization/

For Project 2 I wanted to take advantage of the Media Lab's 8-channel sound system to create an immersive experience for a listener. Using the HOA Library, I generate Lissajous patterns in sonic space and also allow a user to control the exact placement of one of the three sounds.

To emphasize the movement of the sounds, I also added uplighting for the loudspeakers: each sound corresponds to red, green, or blue, and the moving average amplitude of the signal coming out of a speaker dictates the color values for the lights.
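
As a sketch of that amplitude-to-color mapping (the smoothing constant and gain are assumptions, not the patch's values):

def smooth(level, sample, alpha=0.01):
    # running (exponential moving) average of the rectified signal
    return (1 - alpha) * level + alpha * abs(sample)

def to_dmx(level, gain=4.0):
    # map the smoothed amplitude onto a 0-255 DMX channel value
    return min(255, int(level * gain * 255))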

The sounds played include material made with granular synthesis, driven by parameters from an accelerometer sent through a Raspberry Pi, as well as other effects applied to audio files controlled by a Seaboard (done by Ramin Akhavijou and Rob Keller).

This Google Drive folder includes all of the Max Patches for our project.
https://drive.google.com/open?id=1WZH1nr-ARBmZOF9gPrks3_Oh1Q5mJTS8
The top-level patch that incorporates everything, with a (somewhat organized) presentation view, is main.maxpat. Most of my work (aside from putting together main.maxpat) is in ambisonics.maxpat, which in turn has several layers of subpatches.
ambisonics.maxpat receives sounds and sound position coordinates, and outputs data to the speakers and lights. poly voicecontrol is where the positioning of an individual sound is handled. placement.maxpat calculates the position for a sound (using coord.maxpat), and spatialize.maxpat contains the calls to the HOA Library that calculate the signals that should come out of each speaker channel. These are sent to poly light to calculate the light channel value and write it into the appropriate cell of a global matrix, which is read back in ambisonics.maxpat to drive the lights.


Here's a rough video of our project in action:

Ramin Akhavijou – Project 2 – Project sPiral
https://courses.ideate.cmu.edu/18-090/f2018/2018/12/10/ramin-akhavijou-project-2-project-spiral/

In this project, I used an accelerometer to get triple-axis data for controlling different parameters of the music and lights. To receive the data from the accelerometer, I connected it to a Raspberry Pi (the Python code for the Pi and accelerometer is given below). After getting data from the sensor, I sent it to my computer over wifi; to do that, I added some Python code that connects the Pi and the computer to the same network and port. Next, I converted the received data to a format readable by Max using itoa, then used the fromsymbol object to convert the symbol to numeric data. Using unpack, I was able to get the x, y, z data from the accelerometer into my computer. I also helped with parts of the music patch to get a sound that interacts well with the light.


Here is the Python code for the accelerometer:

import time
import board
import busio
import adafruit_mma8451

# Initialize I2C bus.
i2c = busio.I2C(board.SCL, board.SDA)

# Initialize MMA8451 module.
sensor = adafruit_mma8451.MMA8451(i2c)

# Optionally change the address if it's not the default:
#sensor = adafruit_mma8451.MMA8451(i2c, address=0x1C)

# Optionally change the range from its default of +/-4G:
#sensor.range = adafruit_mma8451.RANGE_2G  # +/- 2G
#sensor.range = adafruit_mma8451.RANGE_4G  # +/- 4G (default)
#sensor.range = adafruit_mma8451.RANGE_8G  # +/- 8G

# Optionally change the data rate from its default of 800hz:
#sensor.data_rate = adafruit_mma8451.DATARATE_800HZ   # 800Hz (default)
#sensor.data_rate = adafruit_mma8451.DATARATE_400HZ   # 400Hz
#sensor.data_rate = adafruit_mma8451.DATARATE_200HZ   # 200Hz
#sensor.data_rate = adafruit_mma8451.DATARATE_100HZ   # 100Hz
#sensor.data_rate = adafruit_mma8451.DATARATE_50HZ    # 50Hz
#sensor.data_rate = adafruit_mma8451.DATARATE_12_5HZ  # 12.5Hz
#sensor.data_rate = adafruit_mma8451.DATARATE_6_25HZ  # 6.25Hz
#sensor.data_rate = adafruit_mma8451.DATARATE_1_56HZ  # 1.56Hz

# Main loop to print the acceleration and orientation every second.
while True:
    x, y, z = sensor.acceleration
    print('Acceleration: x={0:0.3f}m/s^2 y={1:0.3f}m/s^2 z={2:0.3f}m/s^2'.format(x, y, z))
    orientation = sensor.orientation
    # Orientation is one of these values:
    # - PL_PUF: Portrait, up, front
    # - PL_PUB: Portrait, up, back
    # - PL_PDF: Portrait, down, front
    # - PL_PDB: Portrait, down, back
    # - PL_LRF: Landscape, right, front
    # - PL_LRB: Landscape, right, back
    # - PL_LLF: Landscape, left, front
    # - PL_LLB: Landscape, left, back
    print('Orientation: ', end='')
    if orientation == adafruit_mma8451.PL_PUF:
        print('Portrait, up, front')
    elif orientation == adafruit_mma8451.PL_PUB:
        print('Portrait, up, back')
    elif orientation == adafruit_mma8451.PL_PDF:
        print('Portrait, down, front')
    elif orientation == adafruit_mma8451.PL_PDB:
        print('Portrait, down, back')
    elif orientation == adafruit_mma8451.PL_LRF:
        print('Landscape, right, front')
    elif orientation == adafruit_mma8451.PL_LRB:
        print('Landscape, right, back')
    elif orientation == adafruit_mma8451.PL_LLF:
        print('Landscape, left, front')
    elif orientation == adafruit_mma8451.PL_LLB:
        print('Landscape, left, back')
    time.sleep(1.0)


And here is the code for sending data from the Pi to Max over wifi:

import socket
from time import sleep
from time import time

host = '....'  # the receiving computer's address (left out of the original post)
port = 5560

def setupSocket():
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((host, port))
    return s

Project 2: Body Controlled Sounds and Visuals
https://courses.ideate.cmu.edu/18-090/f2018/2018/12/10/project-2-body-controlled-sounds-and-visuals/

For Project 2 I wanted to use the Kinect in some way and expand on my previous project of controlling sound and visuals. My original goal was to have motion generate sound, with a basic human outline on the screen and lighting from the overhead lights in the classroom. However, I ended up changing my idea and dropped the lighting, since I wanted to learn more about creating cool visuals in Max, inspired by some really aesthetic visualizers on YouTube.

For the project, I used the dp.kinect2 object, along with the starter patch shared in class, to get the Kinect data. I wanted to learn more about particle systems, since the visuals created with them always look super cool, so I added a hand-controlled particle system as a visual, with some help from a YouTube tutorial. At this point everything on screen was very fluid, so I wanted something stationary in the image as well; I used jit.gl.mesh and jit.gl.gridshape to create a stationary shape that makes it seem like you're inside a spiral.

For the sounds, I wanted them to be simple, to contrast with the visuals, but also controllable. I ended up having each hand control the frequency of a different sound, each going to its own channel. I mapped the coordinates of the hands to reasonable frequencies and fiddled around until controlling the pitch with each hand felt natural. I played around with using the head to create a tremolo effect, but I didn't like the resulting sound, so I scrapped it.
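
The mapping from hand position to frequency might look like this sketch (the ranges are assumptions, not the patch's values):

def hand_to_freq(y, f_low=110.0, f_high=880.0):
    # map a normalized hand height (0..1) to a frequency; an exponential
    # curve makes equal hand movements feel like equal pitch changes
    return f_low * (f_high / f_low) ** y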

Having done this, I wanted to add more to the visuals, so I had the colors of the particle system and the shape change with the sound. Different components of the sound control the RGB values of the particle system, and the same components, plus the position of the head, control the color of the shape.

Here’s a video of how it works:


And here’s the patch:

Robert Keller Project 2 – Project sPIral
https://courses.ideate.cmu.edu/18-090/f2018/2018/12/10/robert-keller-project-2-project-spiral/

Hello! For our final project, we created an installation using the Media Lab's 8 speakers, a Raspberry Pi, an accelerometer, DMX floor lights, and a ROLI Seaboard. The idea was to play audio around the room on the speakers and use the lights to cue the listener on where the audio is actually coming from. We used 3 distinct "voices," which could be any audio file; these voices rotate around the room in a Lissajous pattern. The position of the voices and additional audio synthesis can be controlled with the ROLI Seaboard and an accelerometer hooked up to a Raspberry Pi. As a group member, I assisted with the lighting and helped get the Raspberry Pi operational. My main role was creating a Max patch that incorporated the accelerometer data into our project's audio synthesis and provided control signals to the lights and speakers. The patch implements granular synthesis, altered playback speed, downsampling, spectral delay, and band filtering to create a unique slew of sounds that are ultimately played on our 8 speakers. This project was an excellent exercise in debugging, control flow, and sound synthesis. Below is a demonstration of our system in action; you should be able to hear 3 unique "voices," one of which is controlled by the Raspberry Pi:

Below is a video of me walking through my Max patch in detail and showing how to use it (the subpatches can be viewed in the attached zip file at the bottom; audio is included in the video):

Below is a gist of the main patch:

Finally, I’ve attached a zip file with all the files you’ll need to use the patch:

seaboardfinal

Project 2 – Particle System Tracking
https://courses.ideate.cmu.edu/18-090/f2018/2018/12/10/project-2-particle-system-tracking/

For this project, I wanted to work with particle systems and attractors within them.

First, I used a few objects from the computer vision package to track a face in the incoming camera feed and get points from it with edge detection. I used cv.jit.faces to get the bounding box of a face to track, then cv.jit.canny to extract binary edges, then cv.jit.features2track to collect the points in the window that are easiest to track. This works best on binary edges because of the contrast between black and white pixels.
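
For readers without the cv.jit package, here is an analogous pipeline sketched with OpenCV (an assumption for illustration; the patch itself uses the cv.jit objects named above):

import cv2

cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("frame.png")  # hypothetical camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray)  # cv.jit.faces analogue: face bounding boxes
if len(faces) > 0:
    x, y, w, h = faces[0]
    edges = cv2.Canny(gray[y:y+h, x:x+w], 100, 200)  # cv.jit.canny analogue: binary edges
    # cv.jit.features2track analogue: strong, easy-to-track corner points
    points = cv2.goodFeaturesToTrack(edges, maxCorners=50, qualityLevel=0.01, minDistance=5)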

I then separated the points into two lists: x-coordinates and y-coordinates, normalized them, and sent them to the particle system subpatcher. There are three modes the user can choose from:

0: the system as a whole attracts to several different points. For example, if there are 5 attractor points, 1/5 of the particles will attract to each point.

1: all particles jump between attractors at banged intervals.

2: particles follow the center of mass of the tracked face.

Once the lists of x and y coordinates are sent, I fill a 250-cell matrix (one cell per particle) with the coordinates, repeating them so that an equal number of cells holds each coordinate. These are then used as another input to the jit.gen objects that calculate the force. A jit.gen object transforms every cell of its incoming matrices simultaneously, so a cell in the particle system (inlet 1) can take the corresponding x, y coordinate of its attractor from the second matrix (inlet 2).

I used the Amazing Max Stuff particle system method of calculating new positions through f = m*a. I definitely spent a lot of time trying to understand how jit.gen works, since it is much more efficient at manipulating matrices. I had a lot of trouble figuring out how to use the x, y coordinate matrix of point positions in jit.gen. After understanding that jit.gen operates on every cell of each input matrix, it became much simpler to just give the position matrix as another input. Before figuring that out, I tried a lot of different things with swiz and vec that involved pulling the matrices apart inside the jit.gen object and trying to put them back together afterwards; this, of course, was tedious and didn't work well. I had to find a way to use jit.gen, because before that the patch used far too much processing power and the particles did not move smoothly. I also attempted some JavaScript, but found it difficult to integrate with Max objects.
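
As a numpy sketch of the per-cell computation jit.gen is doing here (the constants are assumptions):

import numpy as np

N = 250
pos = np.random.rand(N, 2)         # inlet 1: one position per particle
vel = np.zeros((N, 2))
attractors = np.random.rand(N, 2)  # inlet 2: one attractor coordinate per particle
mass, strength, dt = 1.0, 0.5, 0.05

def step(pos, vel):
    force = strength * (attractors - pos)  # each cell pulled toward its own attractor
    acc = force / mass                     # a = f/m, from f = m*a
    vel = vel + acc * dt
    pos = pos + vel * dt
    return pos, vel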

You can also change a lot of parameters, such as speed, level of attraction, rate of position change, etc., throughout the patch.

I originally wanted the particles to attract to facial points so that you could actually recognize a face, but I think that would take outside hardware or more sophisticated algorithms for facial point detection. Even so, I definitely enjoyed the finished product.

Below I've uploaded three videos, one for each mode. When you first open the patch, the video and facial detection start automatically; to start the particle system, you toggle it on manually. For all modes, you can change the rate at which the particles attract/retract in the particle system subpatcher. It runs on a metronome by default.

Mode 0:

In this mode, you can see the particle system as it attracts towards the facial points. Moving your face also moves the system as a whole.

Mode 1:

In this mode, the system jumps between facial points on a metronome.

Mode 2:

In this mode, the system follows the center of mass of the facial points. You can see how they follow my face as I move.

Here’s the code for the patch:


----------begin_max5_patcher----------
11451.3oc68ktjiiqbt+dlmB3JlaD1yoFcvNHOw0N5q+ieFbLiiJTIwtZMiJ
o5Jop2b32cCfDjhThK.bSKM6yYppH0BQ9gDIRf7KS7e+y+zCOu8qI6e.8OP+
N5m9o+6e9m9I6sL23mbW+SO757utX878121CK195qIaN7vivqcH4qGf6ud0h
+Bc3Sq1iNr8kWVmn+EZ+g46NnuYB5M8erZg9t6+19CIul9wWuZSxhsuuw9cP
c27s4GV7oUad4ocIKN.sMINdF9QDCa9IQRrWDMCi9ubelMu+5pMqSNXajD2M
Wsz1z197e9aQOb7Mt88CouSr6t6O7s0I12b566ia2bX+puauIgOCat6+yO+y
le7nmP0ljunezmgTKSVO+aHAF+P0xKQvlQEOhnXlQTEfbGWk.SOWfokKvjJE
X3Mc3auk.MgGdd9lWd.8e0BAuJcDcS44jcnseD8w4KR1iVlbPKwIKqAHnR4L
goKOxh.DaOOQEPOux6d9dTR2krO4.54ugrCKzhTc80Dks6kF65k41qvAHizn
toeSak9ckcya285705uazhsa2sb0l4Gzc1y2rDsOQ+CscgSrFfrnRxtPsJvh
EFnRYzGLZE1ewXAAbxK.v8Zx98yeI4LfqNKBBsBgRXr5gOZQfJCwhPDqy1Dd
H6oU.Fj8o8wuuFsKotALJpwjfJhk1cGpowHR4.AM.f3QKXLhRMU0UoFe8I0+
4pCyVt50Wm+F5Cq174Ds2BDTcCCnLv3nd5f33XkTCB3Flbrrg8JUmwBcS+oW
meX2pu16nxqIG1ssAODnBvinnXiZgfDt9fL9JzEgEuuam99nW2tLoNqgQQfz
a66UwgOio7ZZFS5+.8wsqWu8KnE5W9nKRnWMexpgANEz8E7YhzgCwpvACw0D
XP9Gn+78WeC8bxgujjrA811U5mkGn.EbCfQaA.Huh..7+.M+vgcyWbv3qzqu
u9vp2z9JUDG70GIG3DEmCaByEIA+BfMUXY7MzhOsc69jmZv9.Su3PNQJERiz
ZEZkH7oIDUrJJt+lHWY5fO8WGciJ0yW.IbPgFAWsNQOa39Ua2jqs8SOL+s2x
c6eJ2Gwfe+4V6Wj5wrasZCbKV1s1k74UoedQ1cmuSiflEi89NPd9ZliwluFM
Xuay6qrME3l5dxeN8KLynLnigsqeQxs5c7nbiC0c5urd6h+JYYN8GMl8VxlU
adyrloMGlev03xd4kIebtdLvSmpUU1qaLZV4Gdy7WAg6+2tUyWmJdO7xtUK2
twzHJ.0lam93LKHGLzlWXruiMyeqjOrV6QCKU7h60B466ed9NSOwyfBDM8EO
rc65huT1macxGO3d42VsYyIn3gsuU8Kta0KeplO6ya0u3q08caek8O89F3Ue
ROP8vS6m+4hn8g4qW6F4V7q+qy2rR6rTxgUPW.Em8hIalqEzOsewN8LfEjW3
U9bIuxRsR7hjurZ4gOYeP4UFzu8UukpD8PVu7xUujr+Pw6cX9K6KdmyFPqu0
6O6Fj9jdopusVKEEeCE1Bs7iHyaiqv8OwVGXyHa3VEVzHQoSxIR2HLV9wVUY
PqpMFpZK5UfDl6lNk0o2e0lkIVQkmdOq8dmkhgDWHQvlARxsAgsBWHCJtv5Q
bo31GTbBxegz.d41JQraKmLJSTZcvEsB0HlnZ.izHfU1NKLRPDtADRqPY1nE
RTmgHJ4NEhn5IB6GHhDemBQDdbeAQpaTHhznoaqmh8ADItWgHhruzhXWwPDD
wP+TWzS7X+EO8mg6AD4DXemdgAGz92ANbd5L9cFqLK56B.WfpSO.W33e.fKW
.U5AzRcaiVcwbky4xtasBeMaspvdSUDf1mrFgQDD0OfhQrPDr1k5QJdUHUM9
XxCGor6n+im+69EAshkWPDG66x5vU.P7d2CyPV2GYrWOL.ace4vTAdPWP7nC
Lt4C6CjYX2p.ZIHywMczru+UraSVg175kCY6299tEopwYCMPEAfkI6OXHrga
+y98bK127N8pSK31Q52dysCi0tSdma2sD1+bxktwc1660UKcAsIc+wM9xqcq
vPsKFrSdvkkKQ3gQhnDekHBoIYJl.7TyPADSzUXyRu5JUjvwAJRT6upQjFH8
NRb+IRTJ3VqKJFohjXj6kBPjTdKRVYAFZcA5jT82PIBD+rzNozgRidmjp+5j
JJRocRhwtSRzicRGsvIJXuSLTSNZ13l9xNPNiYhBl1FtVuuFlICoCFXuM7HF
11g2isLdeOfsCV+MF+noWQd6vClFE16o5oDO8HSxrQ7wMwOb4nZy0JSTurP0
7X7X5rHFiQiNJYtIKuTRFwKIS4i0KhT+O9QIyMCy0celHr9LNHYvbKvki57k
AzmwBqOyIYv7NWJIyO6dLesc.xT9q7Wjb1HSIS0CFB0r7If7POY3.3pme+.r
b+7rCKHRt7x5sOOesiBKYTfpNFubjVL+7w1aOkhCjl4woVqxt2LwmtG7MSoY
dzPkgC7dMQOn0QsYbrcDCUdDJ3s.JX2FPA1SVb1AnfEecAEUxs0WRN7jMkop
isyLJLymThyH8srgzAgeNpPTrNysUafAdLKsPtGozJvcXCYNLTHVALIlQm3z
5DmVm3zZMgjoXtMbZBPq0dVpspg1twlb3lAiOZr9gRVoudGZ8p8G1aRBFvgK
z1cHSfHQZKPnU5O5mWs784q0e7j84eBExIBV8gChCYPjyUTWvFUsHbPh3fiF
zXG5Kla+2frmtCg9ZfiIHaziIHj2bcGX3widHAGTfQx6IbI5lIHx9vIAm9BU
D0QVIvk8JoDb4g4kZrlOPmiRUXQWQN1kF4niCUhJjP3MXYmJoGIKSqXbEuFK
6zVvRlGGQRWoQp0a29Wu+VSfEyEuC6nWXLbq.KV7UNXUGG9T3Y5kPJn7J8bh
1.Cihr5bbQ9BQQqfQ0sII+7UeS.wllAYjcWz2DW6CNsE1GOgCmu17lgiJbpf
waOCaaIbkqPGL75W1Ah5UA0TJlvgj5lA6LFTTrpGPEU.nzdkCooI4r6+5cFJ
6k1lCbnRkuVopRaibeqsYrlkrwSzDl5jE2ZSYznqbSYds7HtvUV1DckKopal
rws17BPuPQULsncjP7ofxkm4SASzF8Kw8sOETBte7ofxtG7oHEND7NZkmRm7
oHOfBghsC9TTWh8dq5Sga2Kbq1tCZajnIeJPjXR+3SAQbSu2EMuKORQNWY6v
l7PH2x.0LulfLpm1iGxM7FhALIvDnrEECPV0aJFwvlfGQQsW4ZHSytSnTPQX
oRpETE8BN4iWFMCpfpAUS2flobPUzNn.0CNk9A.gCnfkRWI0RJNYkFkSAAOn
gfOTQnQ5H3IkDpgVB9QMgFnmPiTTnQZJz.UEZltBMRYAOnsfOTWHD5KTCEFZ
jFC0Skg5oyP8TZnVZMTE0FJmdCUaToYZNTNUGN07voFjO60aHZXU5MIElcRT
ylHTaXwNK57Ob9Grbqv0.Z9Ekr519gJhVVgo8JN0WWQ7yIaRw4CMLKwMangT
Il4J9FhfO7obEn7S+vUs6EM3MK3wgiOI4qylMt9fyhHW88l3p6MGAD+LmOJB
3lx5792VsdsGCCfh2aLr+Orxb+HHXiUOrQa+ffSbga.Q2y1CxpWPJrRTfES9
ZBoFzi1Zkt1XafLBXouViA0PNsmrESFLaw4p63syhL4xae3yIe8scneYEA8q
HhGcNjHfeCJ9wckJOamC2JAav5eNuWIum0wyvWd3+u6EnmVU0sKLDJlOMi4z
pwbY7fg4v1Ad4g8S2Dw5gVhzVAHfHg2I0Yo77GZs6oX+OyX4tHN98AmGlpSB
UEhhXHNRfjHEJBEqcDDQH5k7hHLDgq8cS2wfHJsQGcmhV+GQ0eFJhxPTt1ac
DUhnJDMROlPOGAhQPL8WICw3Hl.wjHlBwhPrXDGi3D8bJZ+Zz8xHt.wkZiXH
dDhGiDXjfnm6FIXHgtAIPBodE+ZWHQhXjDqsygjTjjoW3udQ+HoDIUHYDRFi
TXjhfTTjhgTbjRfTZwQgTQZcKTDFEQPQTTDCEwQQBTjDEok1HTTLJFihInXJ
Jlgh4nXg14KTrBEqACCZngCrFOvZ.AyLkya8+owDrFTvZTAqgEr98YgM86y.
bDVstOyZdDAG1DXl0XivcJezEqMB5OxV3+9ZzK61VXyRqbtU3zSxsGSBOmas
NjmLztfesi9u+8Ud.7RWprAkkJd2wcb83Nqiq+OW0f6ZY91FGDTLJTMLcLrS
3fqlcpmn.qNttFATb6TOqXhcdORY4XcI8Kmkq0xyJTA0jt0hyx37JRg+fTlF
mlNtz2bVSmdk2zqA0IiaSWFPSOUNG0VnfFPKjwtDsPRHc+Tn6ebag3SqZFMY
VXzwvPgvwtAxXAzBuHMvPZepKPCjpBwTi7BzBCYbbyFZx8h4izbSkajSCydq
BXW8kej5hd2Ikgjhx0vRfhxheQMzh1QdhxidgWDBSc6R1DewJhTzOX00Nau7
JaHk.Mk5b1PdoSFR90GilDz9gPSb9Dil5cFMovxIFMMwnoIFMMwnoIFMMwno
IFMMwnoIFMMwnoIFMMwnoIFMMwnoIFMMwnoaAFMQBkRSDJwcDnLQooIJMMQo
oIJMMQooIJMMQooIJMMQooIJMMQooIJMMQooIJMMvzz48MuMewegXdQfhXaj
KTj1WR050B.VtMI9w5uZ7YyCIpWHyCoW4aRE6pd6Jp9Cx4OtuGJmr3A7rvTE
RifT2ABGONB1Q4X64yDKhe7xxOR3nWVIR3oDQwP0UTxytZLOoE89LSMZrOa3
89bEl0DRSTVr0nyXO6wEQGubLwZekHONRKU77iFbAcotQCW19HB1y9nTIhvt
Piu4ReGfSarSBHIICCUVPX3t8pwToyaAxiC+1TARBmCwwYWMl5bFli5uRWsB
jjjWmKVjWkaXZ69dzPyarsyfCGcWbPn7rqFUkKekmlO62cxivphAFCrWLppV
XOEGQycOPOBKlmq+wd0PoZ4YSuQOsnJXbf1uayrkbXzNb4npa46PEtuRDECG
A45Ebkc4XpdwhCYzOYfTTX9Nos4DjodfUPEP5l.7rI2Uiohh2BDswCA8TABl
XfKoYWMp5I7PVjwPYPoDGbpnUvGxVAID+w3drfKF3rrfiytZfZ6krloJZ6jA
DAod6yQiN.5b4iCqUMMYyLWMpGz79JPzFWCdp.4N3qijWDCXTZHK0cvzSNCr
p1AhZ0RTQyhX5+IMKiiXmxkwgYfsWNp5JdJTpfDJvOhzsjhgGYOi7UcQ345e
XJvuH2RvsWdMJOkrxuZEn7WMlJcDe8yiRGvwyjnPV3Oudyj7bybRAbcHm4j3
8FKEMjHnu9dvGzVANj0XOTqZvWnnZM5iQ4vFikJByQVD+9SWH.w3bQtwvS4D
sZYIAo4gWee8gU4MRUlXm80CAy6ee86I4iLTZhoVNnLewB8Cew10PZ996H7L
FQI4JsHOSJ3wXp4uhM2hn6JJtV+ASH9O1kjroSRA1zrElo2Xl+hHijT03I.+
mIqWu8K9KA4Yu1uaS9XQLFGUP6rDoLhJH1tJUDIhX9C8mSvE8lf59NR+BZJb
4NIpxvjWVHxaH73kD9xSCKNLhz9Snqw1LxEuVnYUSnuOF16WSNraKRfSIRZo
D0FCGy4QVOsOmtvkRRxBAtklpXTdXaOWlqLgTGWImwhlwIRoP1EwWcIkds95
gsapSFgCsK2QADO8mkKcjyktqw910amuDHtbkhMQvfPIBogZT88pkH2wWwpz
XzGlu3vpOmjk1WUUjUxPf33vGViuVg.huP.mCaQtFyZIHHuVw.puXfxEfi1h
.hqODXO5aOYSz+80Y3KJxFzvHnvKI3AaBfDWwH.74xdeJbesYgK07VDPPs1H
bQwW.gCxS3+4+4e4iDzeG8Kej9uf9Uj18v+Ezuo8wC8g8Klud9NSQ0oVc5r9
VlZFWOPfxeLMkhCyDWVdt0ds6dUy1QQQLpN2VbDRRZ1c23X8hYzJ.hvU.vpx
kcp+xt6XI83wQZfnPwb6rP88vt7glG.vfJIFKb6Zwpqqt9dZnQlkgNNzHRdg
fGCSce6Kq1rb6WpaL.CNLP4oozdZQZweG7H0q9WehZ8P9RvUt1kao5m0nZid
R9jOuTLHF5VY4N8mCxDvwo.pI6xau0gSN2c66U1j4bWjpkqsIKihuD91Tisu
MlRxVsS+GIbhNTsXBdzMqWFc6doRxUw9wjHT1CMfwSOu88MK8ARf5NDnRTMj
vJARpvQWl+PxIDwun8hhU7vzsVphJcXYU4vz8WqjpaXIU1vxqpg0WQCKqZFl
sYa4OWVgpTTrHWQLjGkmy+kV.Can3E1TgKr1hVnGErvJJVgMWnBqoHEVaAJr
1hSXMElv5KJg0VPBanXD1TgHz2hPXEEfvZK9fUW3AqtnCVcAGLr8j8rhLXYa
CaSEWvyKrfglLMmWV6JVR6N7ojbEvt8nOtcGRqnq0gQG1oWWf1hGxZHb9tuk
+KopRaW461K8XgYqhxZWWqCw3xSWmdLsjNqfpU5pDSqJOw9lYRUHprfE0PR3
H1XW7kwVh7xwcLesnWG4q0XCetJWF9pJa25gbbqsXVA+zqTmSIyQLgycU26g
hD4fTyuGmjPcex5BEpsZPJBrSsPALoU0uahj1qUJ9zJEyngV+aHrmXkJ87xS
zVrRD2qZUG2Pr9BlNsPzUGb.0eHAt0otLgGW7QUaomqeNDBF0Ah6zMgDTgkY
2DnB7PUoZ+3Qt319TtPurwlz.Y.UBnL0w4HaoFXMl4YsDr5c.67RS3QDSDMy
rJRswIp9GZaSU5vNqYEPqSFPFO.U013Vn.RU8sKGiit2hOOyn9YGrNa4t4eo
oxQ.Epyobaw3Ll05YELYGvP4g63Cc9gZDX+yjxVOzkQ5UybW7haAfKc1ee1U
xxkni8pvc.nIyJEcaU3C6xvIiMv3Lryw3tBL7AEXniUAQw+7YibMjOaieA2f
Q7LYYr9Z1PoBHR.SLRgfUJ.C94uhfiRS5pzqtzXNo4LLgvfp1B05CJifytZT
yXFuyrvAMiwTWCYHJMjATCV5QH4AlDxCTyvWzHqAOP4uSr2siAMGn3dWFrp1
ZK9B2zJCLG5LXi6cBXwiGz9ufFVcoS9oZ7b39M2PphoITyICPVMFuznHVG8C
gHk6100PHeBUbcw7t8HCiDNTG4RHQQyjXLNNxTza.+1UsfysWB9DmKnCMKat
CdJYvLJhR6LaROIBC8jV9uv7U3IwpiZzgwiXV70kJMrA25Vx9DukdXOeTpvY
Qc2IRb661aeJBEofrBRZ+kLb5SRnxqS9C1.2ocYDhRktUxAmueWYTm1m7Ai.
zwkIZcmc7UXm89jMKAtRiVsbchc.eCV6Ab.T50dQFduOCec06a1g6WVOag1c
pcyQexjpxTzGda69UFmxL4RAhWmY.scOiMPl5nwuVL0NizYifl8z8k0OYDLC
6LasIwCae4k00Z.fRyD4HRVRjzBFTSx1V6.nPNIrDLou0U9x1cqWlom7w8ym
qWVxGL1LM4InY+XQe.FSkoCYS3PyAo0G9n1VyACO7zeDs519jmrYvtUKCaxQ
ey23qIad+446pMIdx2GPnrrNAdTKRkGFqybWN+gATO6csMgEqylD1r6bN4mH
Nl2JMkL1kBE7KfO1ugzZ+GVsXcxS6+19CIuZIsdy9dPY.MLwrVHnBuEz6PJn
KADCRCHBKx96XhZhC5SbPehC5sis.KSVO+aFzpg.WJcGv81UNnhK6vUyK9UT
W.caOUNer7xPStCQsgkyEmrXjF.QHakvMigUcJiHuYwvBto1.Lovkmhe9iSr
1ynwqABfVWpi7wcaeExaj8OhVrKYtYqerbCwlAIa+Hxr80yWbPegMRo1amLe
wmxbaosjSifU4WNITGB3p1v3OB9RlPI0gufG9KSzSutbORubfOmncrX0guYf
1FQPZSIoBA1LFZWSHmiIz+UGBlAXupaQqda8pFYRJAxoY2tSQbzRnUnh3ZEU
biJMKv7v1uLWO7EYJIZI1grEx.9f0pjQL3Lnk0csJ90J98uoWwsCC+iM+e0W
j75pCn1BYJrLW1z2MHidCoxAbmAkaRj1ifLHgf6J.FGeIwOuRnImeIzitjzR
GS3W64yTHvAQD2U3.e6ldWYqexyEPQfyXvyqcO9yP836wUPAE3rH+wQHqSDx
1ii3aVbrtbPQaAFKaJcVcz3vkXIj1meShq3zKoNT52zvDwSMMGW1IsVSifGv
798Rok4I5QE4NnPa2NFcSBdeb8Vca8Aub+zcVz0gj2TDcoxcyBf4tWmakL4H
oEZV4zNKO8P+RS5iBHFatUG4BESar4IjWyoTmO4LLgQDomQFcSuiqtGxYXuv
LrK+H5Nlwtyyy5SKTqUmjXPxPAwyS5QVVWUZ1ciljq6NspuVEREK5IfRdaBT
1xCooFoRaBnnQ4LxKZuQ95RC3qrDZsFb66qQqS13oksn3iA3qcXVL81sXG7l
1pkwQhFsYAqVJcZ.RGPKxnjbvmP.jhvSkDAoJxfbxGuLRgTAwPplbHMSPjpH
IRAhhbBYQXtxDKAJE9vJdkxSBSU4DFwCRi3CwQZj7HdRfjZHQheDIoAxjzHg
RZjTIMPrjlIWRiDLwCRl3CQSBgrI0P3jFIcR8DOodxmTOATpkDJUQDkxIiR0
lWZlTJkSLkSMObpg4yd8lxg9JyGoHag9PxjUWhFp0i7yLGSd37OY4gfnFXql
bquhBOPgY1JN6VWAyyls6bmpbLF.l5yM8AhfrDdkQQeX4pWMm2292gHfxSMU
TpaDA0iHquCg5cGh+diMfcFmEQkl.RVMQUIHbjWONRZONdxxJurpxZGYVehh
rSM1ePmCIIioXp0UsWxfo8Nd5rMf55UWnsMfdY212eyGL1soSQjiIhTyPLsF
S1X1c.H6sgAWw6.pUS8fcAJld+aY38uuxyIvbvqoRhjkgbcT8DWO9x5H9lK4
MJR8vADtOmmImPD1jMub3SVNxATLY8p893OGKFpLMtMY0s8pcwgNhJt0dzcU
Xa05q3qyeKfouHVXSI5rtKQcOXZ87PGcRVhS7QyTYwVJPYyxq2rAhsjAyt6H
Yy8z.lTI1QgJ.MIdlggLDZUANIrQ1Rw4O6ZihR+q.e0LAGbBR8KqHn+O5eRK
8HipduxjPknCHcaDt61Nj2552mVhHqNNphlJp90FZlyC5.cHQtq08ovW31YM
oGga10FbSqAtKtatmUBFOuWnr5oTI8EmUWkjmVWkR2xwxJvSmTakBRMosMPd
.MP4En8YUr7tEZmN6BzFogzMiK+ceZYXDJsDQTnX5jcwYetyKTZWgBs0IXuE
ZVD3qCTGIrWUsXiGWwFGl93kXLcUZXUoOhuHsQySk5cajcAZiUozVQesOJ3N
UZtjlSA2d0nKajf0iIicaTJBxDiW8.L2QBiPNyTURvRiAFUwaM5xIIXSojF0
zbUhNkqrMwxtZrkt.jMdiixy8h4YqPSUaxSopQqB5a8Uex5h.7IUgxhx0PSF
mu5CYbbr8pG3hSjXhKNCHWbL02mimyDDrqJRHilHiyDYblHiysOYbhf50Adh
LNcmLNesOHiCTdSYtSlKwDYbZKYbfCa8HwDYbBlLNes0jwA.cXidmHiy.RFG
J8Xj4lXjSaYjCYhQNCIibDSLx4RvHG3XalDymXjyDibtJYjCchQNskQNzIF4
LwHm6QF4vSOYhlXjyDiblXjyDib9QmQNvw9COGibLWbmyHGNvAGUAF4nlXjy
DibtQXjyDuZFSd0HImwqlraMwqlId0buvqF+pGPtpCmqd.09Znj8PH9Vsd.s
qv4CakK.BGMiEq+G2Xx.3EYqqNW7q3hy0oqHrzMtPu1uTvfDU8RB8BLF5i3j
JV7WIqydnqgkqSl2DQ2bmIZJxLSARFy5zvRJ8JVOq15X+hEIqS1Ymm.829WM
GnIKR9iMYmyD5ak+s7GaxNJ+zuR565O1T3q4W+WQ3JK887FF7KsqpRRfdGnS
R1lyqhK5QGPC0cOW36Swum1tdY.1AjPY51cls0tZaK8dnT7YOvRS1f9vgU5F
TFbF.TJf3fRnsmmqhZJ4brtBkmArSDeEQHfdO3KkdAEL3n+hcJePSVtxbBi9
zyuXO3gfOMdV7i49AYhsrSrkchsrWorksgnB9qlwv9DAKW7rinGO4x6TrYiu
wC8WS7awbr1YsYt+QzNzKnm8IDqLaprwhw8E2V3xaXps7O8adEXZBre5Ln.d
Qi5rpISbqS6hlXBvB3DB2GvkCDFBNnL7s7IxpAb422fqtUgnA.r.agXpNq01
gjS3JvboFUOdTVVbAyn+FRu3txVZRkXKX.URT4yyfNYGkeCaG0nR5igToHGp
0CZjDwsrFYdUvGQ+466Of9J5anue5GnpSkvJUNgcd.NkZiJ8fILHTN5VFjSG
T+3eroGQX2wYctShsNgvpa3w9V+6U9XyTBG5uvIPDm14Q+24NQsZiW1TcmUa
4B6PyfJt0YLzc.nR8WWMtuPU58Mp927VOkB0Lndf65rKDjNZbeLN.JCvJmiH
WHRdETKmWeKebYkWHTsgcAHCDSDJzRG6VH+ZmxtgPJs3qogUAwlNw0zvpP3S
4kfAxj.G1G.6rtliWMiQyBxJOBJQMTZ6iWM9lId09cH0xOFB5HuqzRUc3W9i
4AU6xCMe.0lCmcTJoCj7geMeRO5kZGGNCGSOaj6fJG4GRUNCuWmuvtutMp5k
h0c9rQlEe6d.i5l3HkzXdLwAgQxFxpz3UZF0Rb0Bh1nsRhuaX6z5YZ8yOg9T
x50aonOLeygUyWuZ9d8xE9vxcy+xS1Th0UUB9f82OY3Bh40sQzU+Gl+GdlB8
A8P0MKQyWtz8mtgu1uqj2N7ozqwm9Ms+0sa0u79OMeot2ybmOtU+K7r.5bif
HDGK8neMth90n6PVr89aKMjqvefTBg9fP5.qxUCKPNwZMjD79k.mkeRnrtIw
zIRqMQZsIRqcOQZMeXKPpCQdcJdAtramlLh08JFD+VNlqe8wu832sEdIis6s
uuuRnL3.uBLAGHP.A28XaSH2CA21GMTL8H40DcWC8VthVYWJLZ4AiwNFyqhZ
ECp217bk8ttDOP08c7.+UevTfpeDYbOEPP4TLVIB9LkEGY8DnNwwhB3pbhkE
8HKKROrzkSrrnOYYYp1JG10ldfmkCh2niFQJTAVilF6f5KCL5nic6Szq0thK
azkkW4QWVbMFc4cnWSNraaSa2oPMivcUAgbYre4kYSupEBD4saTU1rc09DDK
6TLvTjxKV9mKcYZPA2GpEOlL3n0gSgeGts61BeP.YNNGpGlDEalzUlipGLYU
.lzosdef25cJy0U41ZGnR1EKm168o8deZu2+gZu2yQfiCa+x7cK2iVne+dUl
uUtMmyU8U5iDb9VdChe07F8Yyggkf61d8d.0n2xYzjWYxHblv5Vbc22nM9c9
Ns8a9CpDfJp8Anxm19RiQQli7IzdZi13r6abEBPDLoi2Ex.ghjCkIjtGkH9T
Ziklps79yPK4GEc2c1R.qwqRrzCjNBBmdp9Ki1c8W7OBPcN2UwyHLuXKBD2C
PmtBRyFDRyhmB8QdqEwrdJxGrnaXGY+aZERgOwgSBDow8qdn9FcuyuADcV.3
pizuwcGWkSwM9HtlVp35t+r260iq+t+lNYw80gKGMZhjSofZuwGG5c9D8+Sd
opp34F+2G0Rh6bGUMNBfoAfrb6hW47tOiEYxHPJn51QfdPc8NeeAfStW+QVG
GR4rtqtxl1EV2bVDVesKrrocgEpoTvoKSuUSoT28vpWalBENttoyHFVen5Gr
UNUfdy.XXCAiYY.bOTjdo249Er+Kq9Nxq7xQXqRuh9ZFLJ6G.f8qdkyNTKKn
6OjUbmWP4skFUaZ6Ypi.9n6BSk0aTsffuoK0rAjunT2bTDaoy.SRqhrciyE2
1zzmGDOyI9b7OJgx3ImZswRThrqF6iI5PpJd7KQIPiGR0kyJNicZHvog1BGc
LLn5HWzknEhCUObzqvkAU7Pw9XEfiirqgENDXIwvtwbANDXYgjIRlsO2iC3V
YLbzjHyYgyd0XKagjESo.wUbsS8RjnPAU6TYrqpZSZPk80XuNL6gSYR2QVT9
qtVJqorPlyjQulpHqg2zIWMpZg3G.6hTggIgNG1n2BwgNbcr8CfFG5bkidZm
FE55UF8VnJzpE9n2KGhWCzqqZZcPM8qpIqohVTi1avOzX5LpI5FZOOM6Rbzi
oaLe1cFaUKVnBYiNayjVBElWF4WVYjz8SZgSkwzvonWIgSHctdc7VWwU+7.J
hs4STXJiUCUb8JgquaJessKw0yilcO60G+ZFaVS77s37H9V13zbnbyGDJUcZ
Mb1D0mYekEOKJUY0sQFbbwaU3ye9rh0p4LXR1YNqWxjd8ZKKxyVVbQGx501f
zy1PT0fC4h1vH7PTHEPbfhsSSXtXL6uE91gKjMISJhkgFLf4KBPdrWMP5IBY
6GFcRaOs05x+4b8N1aMTB.MDKbjgpUP7UEfNfC5Em4CYUXgQikU6fq3il1iH
1s1kGKlQU7i2ZnjBruXIY.6Q49NnlKGvdTtr2l+Nhk0gJ41vQje5a6sFJgf6
oPvFRjz2QnrgbZYtu51zAsUzZmCNcoZDUlFD3RXdy9isGg7dy0h7xEvrWmqE
lKFSWKXw81zybnzFvTJHJF4ldVoFrw9mGvwJU3IMIADZj06HWVtqWQX5UioV
l2RDgG6qDA5XDFENTmGakLo28QMpkECTLVBSvfI7zqtNEnHgmBDXcKloRuXT
U4X9JOC4Ln91H3MNPV5hjZrETcFlrWMPM8xTbqnwyZbLqSolPg8.I+kioRt+
xDnkSBPnhANcTiLQFHYh3sL0nH4lefRr+JVIytZvzx7UIiOfCSKylV8wrrFi
ef8tTHL2UCViOD+cndz14XErqD7rqFp1tuy5vFRqzTumq.OjsBZX6AZMyU3F
DGcbmlyuQzY2ubyTzKqzIn9IdBBaVLFiomHdGu+nZE1awC6m3w0N.lFYFS7J
SEui2eTm3z20fSFxQHDuWYWzP1J7dwLC49QPj8k0BhTNS5p.3TrrbaFY2eLG
T4sL17fp7BoqxtjeTk8Vi4.JhuNbHZzgiHJrxfH6BhcQCwd0Po5w81oM0XGG
Q+aafey06NrayERW1hI4ztDt3S3g3gWshjymNJrYCl8rK8T.lJGtcylPCI7D
CldKtuWMNApNTBHTpEuhBE66zUpeZQ+dv0Y7VXwgMF8HkKr73nBNWjUsz+SW
opVqjcjsGlZvWhFLKgHHO756qOrJ+CuLzH6qGXEy+952Sxe5LkV31KGqluvT
mEKTG50CDjbkoNzKE7XL0VQ5M2h3JK8GAhASH9O1kjroSRA1zrELy+L+EQFI
opwS.9OMmb1eweIHeo4+2sEmeg1w8nBphkHkQTAw1UohHQDyen+bBtn2DT22
Q5WvClhO+xmfhvus3bt542O.J94jnJK76kUz2e3k0aed9Z2XiLb7gBCsJdqi
kH9e93HR6OgtFayHGWxflUMbHK2IEyVs7frcEtWpLphAGSphHajDDtrx5nQs
RKrHEN8DSSz0xoF14xbIYb8Qhf0ORtg4b6d4Y552ecdcxd1AIoQzU0K5jyEc
Z2E8JN8QJLHR2j5OnYwmmYPmEy2r4anOb3S6R1+osqWZRY7lQJnFeQnjvgJ1
sGT89l2lu3uzFhz++ZvFf0wRv8.3TdKHnICYp73Ho8Xlk1nF3B9igGyL5VrW
qAsXlIvL+SiUfa3v4eaPPFOTHiGnwnGyLJMRiG+Xxbyo2ydpo7E+W0feDR5o
eoqdEyrdrFhE6rCPIugOR.v2nAYVjZ1xcy+RiFtXRKIBgp8TXXEa.wpbVzFW
TqY.C9ETwFByZFd.MmMbSA7Zx98yeI4bTa818049Dj+5xnzy97.Uuxpn0W6C
EqBermaUUCOfBjLWUVKL3I911Rk8jZDQvXzGLEk9OmjUa5pdnGb.7FyCFrxN
KEFBv544adoHfcxIq1v5RwK6l+r4r9x3lJ5CuuY0++28AKig4HIwg6gA8BYF
yeX09wJlfL.Je5dJ4v5ySajBaiaUwYnvllc99uC00KFnuBUG.6EUrbNuaaE1
E2JZapbjGn8OIO.ghbOrJP.VAfhbZhWbRLXG1VpzmVJGJASw7rqNushG71Zg
cO9jMNs0O3HoGOYRTbOn4n39HjhgPJ8+QS56dVEwG.VwNlPBs9Q4SWY9rHo0
OnhjSnVYhU0XKtPYGMI4rYTSh4FAzIlV3dcsg5S6jp5AHg6CjP6CCvbeTn38
Q2L2i4TX8vygEGfDUt1DLKBD6WXV0iQ9s8sqnN2tHX1Pzv7YjNSzGcMh7jkq
pj8n1o6otIQIDErrFHG8MW0GMNRiCu6AKqEK4LUoITGJ3b0C74AVeWt+V.mZ
jr3Y185i6tSeOMnWhAX2tQAQa81Exbldg7zB2qumC0u1MqbOG5lSjLOTwnQj
9PGymIPhv8vSBZuMhmjZ0CfXaDoWkXJ0IXwQPYTI6dctc5S+dTstvCGiZ4al
oN02eMSkOKL7Dc3dR8zumMonVboinsGHMbvOLBXi93s5aCQd1roMaHhZJN1o
MaJ2p.d7d8sgHpzqEpEQpeBwHH6ybCWnvxfMWcoZuXuZufy4vHJ3pdWuvqUV
nH8fmMvih0HzzGKLCdVM4F0ocVmpuGIb3OyEQKf7Z1K6kFXytRUqdsIpQhHF
ivLQPBZjbH+OfK6bizGaF39X0bTdjWaXPsFn3tielHXu93pYwDoTHc2pyMQV
2ahBsUGiCmT4.1N8csJMLAEQBznPquKokbcj.pXSQvr6h928Y+EEZShB9DQA
Wlnj5nRohBd3EE5f3QM0qk6S6kMph50XDUuXxvq8DKpOdR9DFh9PjNckfsx+
a2hCUNxA6zoUvF92CMOR2Zd4r1MXMQZ2VoES2fLyqxMUOGraiBb9Nhk8Taj0
w1namaGj1mWqvoOhPAwOSSzd4Q4klgp2dTM5bGi2KOKuLCxX8xyBGxyphMnk
fgDVmDOS4BCgROKMAiwLU9624Va.QOtaOIZrW3BtWdVpt2GDAaRtomfVVWP1
86bi0qfc1Kab9oqUpt8ZsRuEcqdRZ33haONxOYvw624Fa3af0Y0LAXuWXXZN
lMfoc27uWCbHw8h1LV4qCOUtXEJAh3C1c3YK.e3whY8Sqi30D0U273VycJkc
WbD50PQSuryMuNy2Bpx58fOzsfL7s0SFAW5Ju.REM+s29bxt8tOt8o9vqy+S
HYcTOZub0F3R64J4C6R97pz2uvdm46V7oUGRVXn0Lbh1EAmimO75V8Cdy6qb
hsMgbNIqy.ou5rMqhT+oXZ+jGHqIyxNIalbPbaylLW5E00FXgrFy6VXSYJVO
03JlQXk25ZHKvZYFfEl.n+L+O+7+aFIhsX
-----------end_max5_patcher-----------
Project 2 – Kinect disco
https://courses.ideate.cmu.edu/18-090/f2018/2018/12/09/project-2-kinect-disco/

My second project is centered around the Kinect, with the addition of an audio-reactive element.

Based on dp.kinect2 and the Kinect demo patch, I created a point cloud from Kinect data. Gesture detection was implemented without machine learning: when the person claps their hands (the distance between their hands drops below a certain threshold), the point cloud changes its representation (draw_mode), e.g. from points to lines to polygons. To define this gesture, simply thresholding the distance is not enough, because one clap should produce only one change in representation, not switch multiple times while the hands are together. Therefore, I incorporated a timer, as follows, which measures the time between the initial clap and the separation of the hands, so that one clap triggers only one bang.
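
The patch does this with a timer; an equivalent edge-triggered sketch of the one-bang-per-clap logic (the threshold value is an assumption):

THRESHOLD = 0.15  # hand distance below this counts as a clap

class ClapDetector:
    def __init__(self):
        self.closed = False  # whether the hands are currently together

    def update(self, hand_distance):
        # fire exactly once, the moment the hands first come together
        if hand_distance < THRESHOLD and not self.closed:
            self.closed = True
            return True  # the single "bang"
        if hand_distance >= THRESHOLD:
            self.closed = False  # hands apart again; re-arm for the next clap
        return False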

In addition, the x, y coordinates of the left and right hands change the color of the point cloud. Initially, I also tried adjusting the camera position to get multiple angles on the point cloud. However, I found there is always a "point of origin" that looks fine when draw_mode is points but turns out deformed when draw_mode is lines or polygons as the point cloud drifts away. Unfortunately, after experimenting for a couple of days, I still could not find a way around it, so I decided to keep the point cloud centered.

As for the background, I created "treadmill-ish" shapes using poly~. They start from the center of the screen and move with increasing z values, which makes the shapes look like they are coming out of the screen; this way, the person's point cloud appears to be moving forward. The poly~ object consists of 20 individual shapes, each staggered by a small amount, with z values scaled to run between -100 and 10 and wrap around to make the stream look continuous.
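
The wrap-around motion of a single background shape reduces to something like this sketch (the speed is an assumption):

Z_MIN, Z_MAX, SPEED = -100.0, 10.0, 0.5

def advance(z):
    z += SPEED                   # move toward the viewer
    if z > Z_MAX:
        z = Z_MIN + (z - Z_MAX)  # jump back so the stream looks continuous
    return z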

The audio-reactive element is a beat (or bass) detection. The audiobang patch sends a bang when it detects low-frequency energy, and the bang triggers an individual shape that, like the background shapes, starts from the center, picks a random x direction, and comes out of the screen.
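
A sketch of the detection idea (the cutoff and threshold are assumptions, not the audiobang patch's values):

import numpy as np

SR, CUTOFF_HZ, THRESHOLD = 44100, 150, 0.1

def bass_bang(block):
    # average magnitude in the low band of one block of audio samples
    spectrum = np.abs(np.fft.rfft(block))
    freqs = np.fft.rfftfreq(len(block), 1 / SR)
    return spectrum[freqs < CUTOFF_HZ].mean() > THRESHOLD  # True = send a bang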

Here is a short demo of my roommate dancing to Sleep Deprivation by Simian Mobile Disco. Enjoy 🙂
