gbutler@andrew.cmu.edu – 18-090 Twisted Signals – https://courses.ideate.cmu.edu/18-090/f2016

LaunchControlXL Drum Machine!!!!
https://courses.ideate.cmu.edu/18-090/f2016/2016/12/04/launchcontrolxl-drum-machine/ – Sun, 04 Dec 2016

For my project I used Max to turn my Novation LaunchControlXL (a MIDI device with 8 sliders, 24 knobs, and 24 pads) into a drum machine that can be used for live performance or just for recording and charting out ideas. I turned the 16 pads at the bottom into a step sequencer using a pretty cool system, used the sliders for volume control of the individual channels, and used the knobs for individual drum sound design and master bus effects.
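Max and the transport handle the actual MIDI parsing and timing in the patch, but the core of the 16-pad step sequencer is simple enough to sketch outside of Max. Here is a minimal Python sketch of the idea (the channel names and per-channel level handling are hypothetical stand-ins, not what is in my patch): each pad press toggles one of 16 steps for a drum channel, and a clock walks across the steps, scaling each hit by that channel's slider level.

```python
# Minimal 16-step sequencer sketch; channel names are made up for illustration.
# The real patch keeps this state in Max and drives it from the transport.
NUM_STEPS = 16

class StepSequencer:
    def __init__(self, channels):
        self.steps = {name: [False] * NUM_STEPS for name in channels}  # pad grid
        self.levels = {name: 1.0 for name in channels}  # slider positions, 0..1
        self.position = 0

    def toggle(self, channel, step):
        """Called when a pad is pressed: flip one step on or off."""
        self.steps[channel][step] = not self.steps[channel][step]

    def tick(self):
        """Called once per step by the clock: return (channel, level) hits to play."""
        hits = [(name, self.levels[name])
                for name, row in self.steps.items() if row[self.position]]
        self.position = (self.position + 1) % NUM_STEPS
        return hits

seq = StepSequencer(["kick", "snare", "hat"])
seq.toggle("kick", 0)
seq.toggle("hat", 2)
for _ in range(4):
    print(seq.tick())
```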

I’m very happy with everything I was able to do and with the product I have right now, but there are some things I still want to add to it. For instance, I wanted to build a system that could do polyrhythms, but I was unable to get two transports to run at the same time at different tempos :(. I also want to be able to send MIDI out to the Launch Control to light up the step sequencer at every step, and I want some better/different effects.

Other than that I’m pretty happy with it, and hope it helps me create some cool ideas in the future.

P.S. – I could not figure out how to upload audio files to GitHub, so I just put the patch folder in a Google Drive folder.

https://drive.google.com/open?id=0B-Zy9oeEXbOwV29lSFJwZkVaMGc

  • Gladstone Butler
West Sumatra
https://courses.ideate.cmu.edu/18-090/f2016/2016/11/07/west-sumatra/ – Mon, 07 Nov 2016

For my project I created an experimental performance environment within Max. I combined pre-picked audio and video samples with live performance and live manipulation.

This project was largely inspired by the culture and musicality of Javanese musicians. All of the audio samples I used were field recordings from around West Sumatra. The demo I made begins with an Islamic prayer modulated by an FFT delay, moves toward ritualistic dance music with freqmod parameters, and ends with cow-herding and rice-planting music on a multi-band delay.
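To give a rough picture of what the multi-band delay is doing, here is a simplified Python sketch rather than the actual patch; the band edges, delay times, feedback amount, and dry/wet mix are all made-up values. The signal is split into low, mid, and high bands, each band gets its own delay time and feedback, and the bands are summed back in with the dry signal.

```python
import numpy as np
from scipy.signal import butter, lfilter

def delay_with_feedback(x, delay_samples, feedback):
    """Plain delay line with feedback, written out sample by sample."""
    y = np.zeros(len(x))
    buf = np.zeros(delay_samples)
    for n in range(len(x)):
        delayed = buf[n % delay_samples]      # what was written delay_samples ago
        y[n] = delayed
        buf[n % delay_samples] = x[n] + feedback * delayed
    return y

def multiband_delay(x, sr, edges=(200.0, 2000.0),
                    delays=(0.50, 0.33, 0.21), feedback=0.4):
    """Split into low/mid/high bands and give each band its own delay time."""
    lo_b, lo_a = butter(2, edges[0] / (sr / 2), btype="low")
    mid_b, mid_a = butter(2, [e / (sr / 2) for e in edges], btype="band")
    hi_b, hi_a = butter(2, edges[1] / (sr / 2), btype="high")
    bands = [lfilter(lo_b, lo_a, x), lfilter(mid_b, mid_a, x), lfilter(hi_b, hi_a, x)]
    wet = sum(delay_with_feedback(band, int(sr * t), feedback)
              for band, t in zip(bands, delays))
    return 0.5 * x + 0.5 * wet  # simple 50/50 dry/wet

sr = 44100
field_recording = np.random.randn(2 * sr)  # stand-in for one of the samples
out = multiband_delay(field_recording, sr)
```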

The music I performed live on top of it (in this demo, at least) was meant to enhance the experience rather than stand out too much. It kind of felt like I was just jamming with the Javanese people when I did it. I performed on a vibraphone, which is pretty noticeable in a lot of parts; a djembe; a tambourine; and a gourd with a hole in it that sounds pretty awesome. The live looper I made for this part of the project was a big endeavor.
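The looper itself boils down to a record-then-overdub buffer. Below is a minimal block-based Python sketch of that idea (array indexing stands in for buffer~-style access in Max; the class and its behavior are a simplification, not the patch itself):

```python
import numpy as np

class Looper:
    """First pass sets the loop; later blocks play it back and optionally overdub."""
    def __init__(self):
        self.loop = None
        self.playhead = 0

    def record_first_pass(self, audio):
        self.loop = np.copy(audio)            # loop length = length of the first take

    def process(self, block, overdub=True):
        """Mix one block of live input with loop playback, writing it back if overdubbing."""
        idx = (self.playhead + np.arange(len(block))) % len(self.loop)
        out = block + self.loop[idx]
        if overdub:
            self.loop[idx] += block
        self.playhead = (self.playhead + len(block)) % len(self.loop)
        return out

sr = 44100
looper = Looper()
looper.record_first_pass(np.random.randn(2 * sr))  # a two-second loop
out = looper.process(np.random.randn(512))          # one block of live input
```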

Originally I had planned on doing live DMX light manipulation, but I decided on a video sample of Indonesian people jamming somewhere in the fields of West Sumatra instead. I equipped it with a kaleidoscope and color-manipulation parameters to make it more interesting; the parts of the video that you can make out are really the most beautiful ones. I have been messing around with the color parameters in my run-throughs of the piece, but the video I’ve been recording keeps corrupting. I have included the patch below so anyone can have a go with it and see it firsthand.

Here is the most recent demo performance:

 

Google Drive folder with all the samples and files: https://drive.google.com/drive/folders/0BzJ8e5y3EPeuX1VLTnlYalFDVmc?usp=sharing

Here is the code for the patcher:

Muse
https://courses.ideate.cmu.edu/18-090/f2016/2016/10/17/muse/ – Tue, 18 Oct 2016

For my project I put together a composition that I can perform live with Ableton, Max for Live, and my LaunchControlXL. I recorded some loops of myself playing marimba and vibes, and that’s what the very beginning consists of. The next, blip-bloopy kind of instrument was a product of FM synthesis and the FFT delay we used in class.
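For anyone curious what the FM part amounts to, two-operator FM is just a sine wave whose phase is pushed around by another sine. A minimal sketch (the carrier frequency, ratio, and modulation index here are made-up values, not the settings from the piece):

```python
import numpy as np

def fm_tone(duration, sr=44100, carrier=220.0, ratio=2.0, index=3.0):
    """Two-operator FM: y(t) = sin(2*pi*fc*t + index * sin(2*pi*fm*t))."""
    t = np.arange(int(duration * sr)) / sr
    modulator = np.sin(2 * np.pi * carrier * ratio * t)
    return np.sin(2 * np.pi * carrier * t + index * modulator)

blip = fm_tone(0.25)  # a short blip-bloopy tone
```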

The voice sample is just something I found on the internet. The point of the composition was not really to hear the speech but to hear what was happening to the sounds. My intention was to add effects and modulate the sample in such a way that the concepts of speech and musical sound morph into each other.

The first effect I added was an FFT delay I made simply by writing the samples back into the buffer after they are read out by the index. The second was a filter with the Q pretty much maxed out. Then I added Ableton’s Resonator effect, which added and accentuated harmonics that I played around with throughout the composition. After sending it through another simple delay, I EQ’d it to find the really resonant frequencies, and it kind of rests in that general area for a while. I’ve included a link to the piece and a link to the code for the embedded pfft~ for my homemade FFT delay.
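The idea inside the pfft~ is small: each chunk of spectral data gets fed back into the buffer after it is read, so the spectrum keeps echoing into itself. Here is a rough offline Python equivalent of that structure (the frame size, hop, delay length, and feedback amount are arbitrary stand-ins, and this is an STFT approximation rather than the actual patch):

```python
import numpy as np

def spectral_feedback_delay(x, frame=1024, hop=256, delay_frames=8, feedback=0.6):
    """Delay whole FFT frames and feed them back, then overlap-add the result."""
    window = np.hanning(frame)
    out = np.zeros(len(x))
    history = [np.zeros(frame, dtype=complex) for _ in range(delay_frames)]
    for i in range((len(x) - frame) // hop):
        start = i * hop
        spec = np.fft.fft(x[start:start + frame] * window)
        delayed = history[i % delay_frames]      # frame from delay_frames hops ago
        wet = spec + feedback * delayed
        history[i % delay_frames] = wet          # write back into the "buffer"
        out[start:start + frame] += np.real(np.fft.ifft(wet)) * window
    return out

sr = 44100
voice = np.random.randn(2 * sr)                  # stand-in for the voice sample
processed = spectral_feedback_delay(voice)
```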

Convoluted Marimba
https://courses.ideate.cmu.edu/18-090/f2016/2016/10/03/convolved-marimba/ – Mon, 03 Oct 2016

For my project I used four impulse responses on a recording of myself playing the first page of “Merlin” by Andrew Thomas. The marimbas that we have at CMU sound really bad, and the convolution seemed to make them sound better!

The first sound file contains the impulse responses: a balloon popped in a dry space with the hand recorder placed behind a reverberating tam-tam, a balloon popped at the bottom of the stairs in the new wing of CFA, a heavily LFO’d sine wave playing the pitches that are pretty much the tonal center of the piece, and those same pitches played by a sine wave, but rhythmically.

The second sound file is the recording played start to finish four separate times, with the dry/wet of each convolution reverb effect automated to go from 0 to 100%.

I used the Convolution Reverb Effect from the Max for Live Essentials pack in Ableton Live for this project.
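The Max for Live device does the real work, but the underlying operation is just convolving the dry recording with each impulse response and crossfading between dry and wet. A rough Python sketch of that, with the dry/wet automation as a linear ramp from 0 to 100% over the file (the signals here are placeholders, not my recordings):

```python
import numpy as np
from scipy.signal import fftconvolve

def convolve_with_ramp(dry, impulse_response):
    """Convolve with an IR and ramp the dry/wet mix from 0% to 100% over the file."""
    wet = fftconvolve(dry, impulse_response)[:len(dry)]
    wet *= np.max(np.abs(dry)) / (np.max(np.abs(wet)) + 1e-12)  # rough level match
    mix = np.linspace(0.0, 1.0, len(dry))                       # the automation ramp
    return (1.0 - mix) * dry + mix * wet

sr = 44100
dry = np.random.randn(10 * sr)                                     # stand-in marimba take
balloon_ir = np.random.randn(sr) * np.exp(-np.linspace(0, 8, sr))  # stand-in balloon pop
out = convolve_with_ramp(dry, balloon_ir)
```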

Timeshifted Vortex
https://courses.ideate.cmu.edu/18-090/f2016/2016/09/19/timeshifted-vortex/ – Mon, 19 Sep 2016

For my project I produced a track I called “Timeshifted Vortex.” The whole thing was inspired by an M4L device I made with a simple tapin~/tapout~ delay, controlled by an LFO and a low-pass filter. I used it on a drum loop throughout the whole track, but it is most apparent in the intro.
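The device itself is basically a delay tap whose position is swept by the LFO, with a low-pass filter on the wet signal. Here is a rough Python sketch of that behavior (the base delay, LFO rate and depth, filter cutoff, and mix are made-up values rather than the ones in the device):

```python
import numpy as np
from scipy.signal import butter, lfilter

def lfo_modulated_delay(x, sr, base_ms=120.0, depth_ms=80.0, lfo_hz=0.3,
                        cutoff_hz=2000.0, mix=0.5):
    """Delay line whose tap position is swept by a sine LFO, then low-passed."""
    n = np.arange(len(x))
    delay = (base_ms + depth_ms * np.sin(2 * np.pi * lfo_hz * n / sr)) * sr / 1000.0
    read_pos = n - delay
    lo = np.clip(np.floor(read_pos).astype(int), 0, len(x) - 1)
    hi = np.clip(lo + 1, 0, len(x) - 1)
    frac = read_pos - np.floor(read_pos)
    wet = (1 - frac) * x[lo] + frac * x[hi]      # interpolated read at the moving tap
    wet[read_pos < 0] = 0.0                      # nothing to read before the start
    b, a = butter(2, cutoff_hz / (sr / 2), btype="low")
    return (1 - mix) * x + mix * lfilter(b, a, wet)

sr = 44100
drum_loop = np.random.randn(4 * sr)              # stand-in for the drum loop
out = lfo_modulated_delay(drum_loop, sr)
```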

The first link is a demo of the M4L device with the same drum loop I used in the track. It’s really just me messing around with the parameters for a little bit. The second link is the track.

https://soundcloud.com/gladstonebutler/taped-delay-m4l-device-demo

 

The Simple Delay Feedback Loop
https://courses.ideate.cmu.edu/18-090/f2016/2016/09/07/the-simple-delay-feedback-loop/ – Wed, 07 Sep 2016

For my take on the project I used the “Simple Delay” effect in Ableton on a loop I made with some random sample kit. The delay time was synced to 4:5, the feedback was at 34%, and the effect was 100% wet. I resampled the result over and over again, switching between two channels with the same Simple Delay settings, and repeated the process until it was completely off the grid. I also made some loops with the default 808 kit to keep time, but also for fun.
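To show what the resampling loop does to the audio, here is a rough Python sketch. A 100% wet delay with feedback is just a comb filter, y[n] = x[n-D] + feedback*y[n-D], which stands in for Ableton's Simple Delay here; the 4:5 sync and the switching between two channels are left out, and the delay time and number of passes are made-up values.

```python
import numpy as np
from scipy.signal import lfilter

def simple_delay_100_wet(x, delay_samples, feedback=0.34):
    """100% wet delay with feedback: y[n] = x[n - D] + feedback * y[n - D]."""
    b = np.zeros(delay_samples + 1)
    b[delay_samples] = 1.0                 # the delayed input tap
    a = np.zeros(delay_samples + 1)
    a[0] = 1.0
    a[delay_samples] = -feedback           # the feedback path
    return lfilter(b, a, x)

sr = 44100
loop = np.random.randn(2 * sr)             # stand-in for the sampled loop
delay = int(0.4 * sr)                      # made-up delay time
audio = loop
for _ in range(5):                         # resample the result over and over
    audio = simple_delay_100_wet(audio, delay)
```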

 
