eludwick@andrew.cmu.edu – 18-090, Fall 2019
Twisted Signals: Multimedia Processing for the Arts
https://courses.ideate.cmu.edu/18-090/f2019

eludwick – Project 2 – LEAP Audio Effect Suite
Mon, 09 Dec 2019 – https://courses.ideate.cmu.edu/18-090/f2019/2019/12/09/eludwick-project-2-leap-audio-effect-suite/

I made this patch with the intention of creating a set of audio effects that could be applied to any audio source, whether an mp3 file or live input, and controlled mostly by the LEAP using hand gestures.

In its final form, the patch I made controls reverb, a multi-voiced vocoder, and an LFO that controls the amount of pitch bend the incoming audio receives and how fast it shifts. The patch is slightly finicky in that you need to activate the LEAP data in all the sub-patches before audio will come through. Parts of the sub-patches are patches I found online or in tutorials, but all the LEAP data tracking is my own. I also decided to use Luis Fonsi’s Despacito since we used that in class several times.

When first opened, the patch contains two different input options, a hand-tracking toggle, and access to the sub-patches that contain the individual effects. The mp3 input is by far the easier to work with; the adc~ input works, but not as effectively as I would have liked.

Once you choose an input and press the desired effect's number key, you can open the corresponding sub-patch.

The reverb patch is fairly straightforward. It uses the reverb patch found under Max > Help > Examples. The hand gestures directly affect the basic parameters of a reverb effect. The most noticeable are Decay and Size, while Diffusion and the High Frequency Cutoff are more subtle.
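
Purely as an illustration of that kind of mapping (and not the actual patch logic), here is a rough Python sketch of how LEAP palm position data could be scaled onto those four parameters; the parameter names, coordinate ranges, and output ranges are all assumptions.

def scale(value, in_lo, in_hi, out_lo, out_hi):
    # Linear mapping clamped to the output range, like Max's scale object plus a clip.
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

def palm_to_reverb(palm_x_mm, palm_y_mm):
    # Assumed mapping: palm height drives decay and size, left/right position
    # drives the subtler diffusion and high-frequency cutoff controls.
    return {
        "decay": scale(palm_y_mm, 100, 400, 0.1, 0.99),
        "size": scale(palm_y_mm, 100, 400, 10, 100),
        "diffusion": scale(palm_x_mm, -200, 200, 0.0, 1.0),
        "hf_cutoff_hz": scale(palm_x_mm, -200, 200, 2000, 12000),
    }

print(palm_to_reverb(50, 250))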

The vocoder is named after Gir from Invader Zim. The dry/wet part of the patch is from https://www.youtube.com/watch?v=mi9CjZxk8zs, and the vocoder effect is from https://www.youtube.com/watch?v=4feOFLX6238. I used these because both were easily controlled by the LEAP data, while other pre-made versions required the mouse to control the various parameters.

Lastly, there is the LFO device. The base effect is loosely based on this video: https://www.youtube.com/watch?v=uyzY_ZP54pA. However, I altered it in order to get a different effect. I was trying to replicate something I had heard on a soundtrack, but I ended up with more of a whammy-bar/warble effect.
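
As a rough illustration of the idea (in numpy rather than Max, with arbitrary rate, depth, and base frequency), the warble amounts to a low-frequency oscillator periodically bending the pitch of whatever passes through:

import numpy as np

sr = 44100
t = np.arange(sr * 2) / sr            # two seconds of time
lfo_rate_hz = 3.0                     # how fast the pitch shifts (example value)
bend_semitones = 2.0                  # how far the pitch bends (example value)

lfo = np.sin(2 * np.pi * lfo_rate_hz * t)
ratio = 2.0 ** (bend_semitones * lfo / 12.0)     # semitone offset -> frequency ratio

base_hz = 220.0                       # stand-in for the incoming audio's pitch
phase = 2 * np.pi * np.cumsum(base_hz * ratio) / sr
warble = np.sin(phase)                # a test tone carrying the whammy-bar style warble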

Overall, I am very happy with this. It is fun to use and can easily be modified to control different effects or to add new ones. I plan to keep developing it because I think it could eventually turn into a very useful tool, and this feels like a good point to be at for the first iteration.

https://drive.google.com/open?id=1xMT-od471FLCn6pOWAYY9XWCGeTdMlIO

eludwick – Project 1 – LEAP Keyboard
Wed, 06 Nov 2019 – https://courses.ideate.cmu.edu/18-090/f2019/2019/11/06/eludwick-project-1-leap-keyboard/

This project was made using a Leap Motion sensor, which is designed to track hands. The Max patch I am using tracks just the fingertip data, converts that into a MIDI signal, then outputs it through a cycle~ object to produce sound. The tracking component of the patch comes from the downloadable Leap-Max file that Jesse posted on Slack, and the sound production is based on Jesse's class demo of the Leap Motion sensor.

The way this works is that the patch tracks the x/y/z data of each fingertip: the X data controls pitch, the Y data controls note on/off, and the Z data controls volume. I designed this to act as closely to a piano keyboard as possible, which is why the Y data controls the on/off of the sound; it simulates the action of pressing down a key.
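
Here is a simplified Python sketch of that mapping; the thresholds, coordinate ranges, and note range are placeholders rather than the values used in the patch.

def scale(v, in_lo, in_hi, out_lo, out_hi):
    # Clamp and rescale, like Max's scale/clip combination.
    t = max(0.0, min(1.0, (v - in_lo) / (in_hi - in_lo)))
    return out_lo + t * (out_hi - out_lo)

def fingertip_to_note(x_mm, y_mm, z_mm):
    pitch = int(round(scale(x_mm, -200, 200, 48, 84)))  # X position -> MIDI note number
    key_down = y_mm < 150                                # dropping below a Y threshold = key pressed
    volume = scale(z_mm, -100, 100, 1.0, 0.0)            # Z controls volume (direction and range assumed)
    freq_hz = 440.0 * 2 ** ((pitch - 69) / 12)           # MIDI note -> frequency for the cycle~ oscillator
    return {"on": key_down, "freq_hz": freq_hz, "volume": volume}

print(fingertip_to_note(30, 120, -20))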

https://drive.google.com/drive/folders/1gxu0LY6NpyEr03aOj5RZeaOhtDbv_HCa?usp=sharing

eludwick – Assignment 4 – Beep/Boop in the Machine
Wed, 16 Oct 2019 – https://courses.ideate.cmu.edu/18-090/f2019/2019/10/16/eludwick-assignment-4-beep-boop-in-the-machine/

For this assignment I chose to use spectral processing to analyze incoming audio and re-synthesize it as a sine wave. The patch is pretty straightforward: the audio goes through a pfft~ to analyze its frequency, which then passes through a combination of objects that turns the frequency into an integer. That value is converted into MIDI data, scaled once to bring the generated number into a normal MIDI range, then scaled a second time to generate new frequencies for the cycle~ object.
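
As a loose sketch of that signal path (substituting a plain numpy FFT for pfft~, with an illustrative block size, input, and MIDI range):

import numpy as np

def dominant_freq(block, sr=44100):
    # Estimate the strongest frequency in a block of samples (a stand-in for the pfft~ analysis).
    spectrum = np.abs(np.fft.rfft(block * np.hanning(len(block))))
    return np.fft.rfftfreq(len(block), 1 / sr)[np.argmax(spectrum)]

def ftom(f):
    # Frequency -> MIDI note number, like Max's ftom.
    return 69 + 12 * np.log2(f / 440.0)

def mtof(m):
    # MIDI note number -> frequency, like Max's mtof.
    return 440.0 * 2 ** ((m - 69) / 12)

sr = 44100
block = np.sin(2 * np.pi * 330 * np.arange(1024) / sr)   # placeholder input instead of the "Anton" file
midi = np.clip(ftom(dominant_freq(block, sr)), 36, 96)   # first scaling: keep it in a normal MIDI range
sine_freq = mtof(midi)                                    # second step: a new frequency for cycle~
print(round(midi), round(sine_freq, 1))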

The combination of objects I used to convert the frequency into integers came from the Cycling '74 forums; a link to the thread is below. The exact post is near the bottom and was posted by Jean-Francois Charles. I only used part of the patch and colored the parts I used in blue. The audio I used to generate everything is the "Anton" file that can be found in the Max audio tab.

https://cycling74.com/forums/converting-real-time-audio-into-frequency

https://drive.google.com/drive/folders/1GlmiPqBMVEvIyXNLXFlqrgDLZtQ5MVez?usp=sharing

eludwick – Assignment 3 – Lucier Eats Lucier
Wed, 02 Oct 2019 – https://courses.ideate.cmu.edu/18-090/f2019/2019/10/02/eludwick-assignment-3-lucier-eats-lucier/

For this assignment I used the original Lucier recording that Jesse provided. I did this mostly because I thought it would be interesting to have Lucier speak as if he were inside his own recording.

For my two regular IRs, I went to North Park in Allison Park to an old well called the Fountain of Youth, which has a small domed entryway before it opens further into the well. I recorded my first IR in the entryway. For the second, I wanted to do something completely different from the Fountain IR, so I recorded it in the CFA stairwell.

For my first experimental convolution, I recorded Lucier's "I Am Sitting in a Room" being played in the Fountain of Youth entryway, then took the last cycle of his recording (around the 45-minute mark) and recorded that. I ran the result through Audacity to normalize it and plugged it into the Max patch.
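
The convolution itself happens inside the Max patch; offline, the same idea looks roughly like this scipy sketch, where the file names are placeholders and mono WAV files are assumed.

import numpy as np
from scipy.io import wavfile
from scipy.signal import fftconvolve

sr, dry = wavfile.read("lucier.wav")              # placeholder name for the source recording
_, ir = wavfile.read("fountain_of_youth_ir.wav")  # placeholder name for the impulse response

dry = dry.astype(np.float64)
ir = ir.astype(np.float64)

wet = fftconvolve(dry, ir)                        # convolve the voice with the "room"
wet /= np.max(np.abs(wet))                        # normalize, as was done in Audacity
wavfile.write("lucier_in_fountain.wav", sr, (wet * 32767).astype(np.int16))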

My second experimental convolution was made from the A-Game Synth audio loop from Logic Pro X. For this, I took the loop, cut up about 20 different sections of the audio, then stitched them back together in similarly sized segments to create a new sound wave. I exported that from Logic, loaded it into Audacity, and ran the new audio file through Paulstretch, keeping the stretch factor at 10 but reducing the time resolution down to 0.05 seconds. Then I added a fade-out to better simulate this "room".

I uploaded all the IRs and all the convolutions as two separate files to SoundCloud. On both, the order of playback is: CFA Stairwell, Fountain of Youth, A-Game Synth, and Lucier. All the separate files, as well as the original Lucier file, are in the shared Google Drive folder.

https://drive.google.com/open?id=1ksIiY-WbGaDIanr9L9jDofBgw-HUMIj4

eludwick – Assignment 2: Hot Mess
Wed, 18 Sep 2019 – https://courses.ideate.cmu.edu/18-090/f2019/2019/09/18/eludwick-assignment-2-hot-mess/

This patch is built from Jesse's first example. The original was a delay patch for audio that also incorporated pitch shifting and feedback. I took that, created three instances of it, and plugged in three different mp3 files from the on-board audio list. Then I took the delay part of the patch, created three instances of that as well, and connected the three outs in alternating patterns. The overall result is that each mp3 file has three delays on it: one that affects that individual mp3 and two others that pair the delay effect with one of the other mp3 files.

To use it, make sure all the mp3 files are on the "loop" setting, then toggle them all on. The integer boxes affect the delay length and the amount of pitch shift. Each of the lower patches affects only the delay length. There are several message boxes attached to the tapin~ inlet: "freeze 1" stops all incoming data and keeps only the data that has already gone through, while "freeze 0" resumes the input as it was. The "clear" button wipes any saved data that has already gone through, essentially setting that particular audio route back to a just-started state. The additional delay patches affect pairs of the larger patch sections, pairing 1/2, 1/3, and 2/3 from left to right. Adjusting the delay length here affects both connected signals in the same manner. This allows three separate delay lengths for any given signal and can create rather complex layers of delay, while the "freeze" and "clear" buttons allow for some creative expression and more control over the output.
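
As a toy illustration of the tapin~/tapout~ behavior described above (written in Python rather than signal-rate Max code, with example buffer, delay, and feedback values), a single delay line with feedback plus the freeze and clear controls could be sketched like this:

import numpy as np

class FeedbackDelay:
    # A single delay line with feedback, plus the freeze/clear controls described above.
    def __init__(self, max_delay, delay, feedback=0.5):
        self.buf = np.zeros(max_delay)   # circular buffer, like tapin~'s memory
        self.write = 0
        self.delay = delay               # delay length in samples
        self.feedback = feedback
        self.frozen = False              # "freeze 1" / "freeze 0"

    def clear(self):
        # The "clear" message: wipe everything stored so far.
        self.buf[:] = 0.0

    def process(self, x):
        read = (self.write - self.delay) % len(self.buf)
        y = self.buf[read]
        if not self.frozen:
            # Normal operation: store the input plus the recirculated output.
            self.buf[self.write] = x + y * self.feedback
        self.write = (self.write + 1) % len(self.buf)
        return y

# Example: a one-second buffer with a half-second delay.
d = FeedbackDelay(44100, 22050, feedback=0.4)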

https://drive.google.com/open?id=1U6VTqGjUWpQrW5suiFGNvUgMeyAtKqBz

eludwick – Assignment 1: “Burned” Image
Wed, 04 Sep 2019 – https://courses.ideate.cmu.edu/18-090/f2019/2019/09/04/eludwick-assignment-1-burned-image/

For this assignment I chose to use an image conversion program, Image2Icon, to alter a picture. It takes any picture file and changes the format so that the image can be used as an icon on your personal devices. The feature I focused on was removing a background color from the image. I feel like the overall effect is visually similar to an old-fashioned film projector burning through a film reel.
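
Image2Icon's exact method isn't documented here, but the general idea of knocking out a background color can be sketched with Pillow; the file names and the color tolerance below are placeholders.

from PIL import Image

def remove_background(path, bg=(255, 255, 255), tolerance=30):
    # Make any pixel close to the background color fully transparent.
    img = Image.open(path).convert("RGBA")
    cleaned = [
        (r, g, b, 0) if all(abs(c - t) <= tolerance for c, t in zip((r, g, b), bg))
        else (r, g, b, a)
        for (r, g, b, a) in img.getdata()
    ]
    img.putdata(cleaned)
    return img

remove_background("original.png").save("burned.png")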

https://drive.google.com/open?id=17iIhon3EUlHm9qhTsiu0gADy7dm0e6-t
