samuelgo – Project 2: F8R
https://courses.ideate.cmu.edu/18-090/f2019/2019/12/09/samuelgo-project-2-f8r/
Mon, 09 Dec 2019

For this project I developed an interactive performance system powered by Max.

The basic premise of the patch is to capture and play back gestural input. Max outputs these gestures as MIDI CC data, which is then converted to CV signals that control a hardware synthesizer.
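For a sense of the data involved, the gesture-to-CC mapping can be sketched in a few lines of Python with the mido library (the CC numbering and port choice are illustrative assumptions; in the patch itself, Max emits the CC messages directly):

```python
# Minimal sketch: map a 0.0-1.0 fader gesture to a 7-bit MIDI CC message.
# The CC-number offset (20) and the default port are illustrative choices,
# not the patch's actual routing.
import mido

def send_fader(port, fader_index, value):
    cc = 20 + fader_index                          # one CC number per fader
    level = int(max(0.0, min(1.0, value)) * 127)   # clamp and quantize
    port.send(mido.Message('control_change', control=cc, value=level))

if __name__ == "__main__":
    with mido.open_output() as port:               # system default MIDI output
        send_fader(port, 0, 0.5)                   # fader 1 to mid-position
```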

Three Max objects play a major role in this patch:

  1. mira.frame – The graphical interface for gestural input is built in a mira.frame object, which mirrors the interface to an iPad connected over Wi-Fi to the computer running Max.
  2. mira.multitouch – The mira.multitouch object collects multitouch information from the iPad hosting the mira.frame interface. Touch state and y-position are the key pieces of information collected in this patch.
  3. mtr – The mtr object, wrapped with some custom logic, records and plays back the gestural input data from the mira.multitouch object (sketched below).
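The record-and-replay logic wrapped around mtr looks roughly like the following plain-Python sketch (a hypothetical stand-in, not part of the patch itself):

```python
import time

class GestureRecorder:
    """Timestamped record/replay of (fader, value) events -- a rough
    model of what mtr does with incoming Max messages."""

    def __init__(self):
        self.events = []          # list of (offset_seconds, fader, value)
        self._t0 = None

    def record(self, fader, value):
        now = time.monotonic()
        if self._t0 is None:
            self._t0 = now        # first event defines time zero
        self.events.append((now - self._t0, fader, value))

    def play(self, send):
        """Replay events through send(fader, value) with original timing."""
        start = time.monotonic()
        for offset, fader, value in self.events:
            time.sleep(max(0.0, offset - (time.monotonic() - start)))
            send(fader, value)
```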

The core engine in this patch can be extended or augmented to support many types of gestural input. In this implementation the graphical interface consists of 8 faders whose values can be set or automated.

The embedded video demonstrates the patch with a live performance.

Project Resources: https://drive.google.com/drive/u/0/folders/1htNu8UGfB6_NB_QtNnzEGnrOOXTfRYm2

samuelgo – Project 1: Experiments in Spectral Delay
https://courses.ideate.cmu.edu/18-090/f2019/2019/11/06/samuelgo-project-1-experiments-in-spectral-delay/
Wed, 06 Nov 2019

While I had originally intended to implement a spectral delay effect from scratch, I decided it would be better not to reinvent the wheel. Instead I reviewed a number of readily available implementations on which I could build, settling on Cycling74’s M4L.spectral.delay~, which is based on Olivier Pasquet’s work.

The basic concept of spectral delay is to implement a delay line in the frequency domain, enabling independent control over delay and feedback parameters for each FFT bin. I experimented with a number of extensions to this basic concept and settled on a few that I thought produced the most musically interesting results.
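The per-bin idea is easy to prototype outside Max. Here is a toy offline sketch in numpy (my own illustration, not Pasquet's implementation; the frames would come from an STFT of the input):

```python
import numpy as np

def spectral_delay(frames, delay_frames, feedback):
    """Per-bin delay line over a sequence of STFT frames.

    frames:       complex STFT frames, shape (n_frames, n_bins)
    delay_frames: integer delay in frames for each bin (>= 1)
    feedback:     per-bin feedback gain in [0, 1)

    Bin k of the output is bin k of the input delayed by delay_frames[k]
    frames, with its own feedback path: the frequency-domain analogue of
    a bank of independent delay lines.
    """
    n_frames, n_bins = frames.shape
    assert delay_frames.min() >= 1, "zero-frame delays are not supported"
    size = int(delay_frames.max()) + 1
    buf = np.zeros((size, n_bins), dtype=complex)       # circular buffer
    out = np.zeros_like(frames)
    cols = np.arange(n_bins)
    for t in range(n_frames):
        delayed = buf[(t - delay_frames) % size, cols]  # per-bin read
        out[t] = delayed
        buf[t % size] = frames[t] + feedback * delayed  # write + feedback
    return out
```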

In Cycling74’s original patch, the delay time and feedback parameters for each FFT bin are configured with multislider objects. Programming these parameters dynamically instead can add movement and texture to the overall effect. One way I extended the original patch was to let these parameters be driven by an FFT analysis of a secondary input signal, scaling the resulting spectral magnitudes to generate delay coefficients. When this option is engaged, the delay time for each FFT bin is updated at the rate of the FFT analysis, creating movement.
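The magnitude-to-delay mapping itself can be as simple as a normalize-and-scale, something like this hypothetical helper (the range and the linear mapping are illustrative choices):

```python
import numpy as np

def delays_from_spectrum(mag, min_frames=1, max_frames=64):
    """Scale a magnitude spectrum into per-bin delay times (in frames):
    normalize the secondary analysis's magnitudes to [0, 1], then map
    them linearly onto the delay range."""
    norm = mag / (mag.max() + 1e-12)            # avoid divide-by-zero
    return (min_frames + norm * (max_frames - min_frames)).astype(int)
```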

In my implementation, the user can select either the primary input or an auxiliary signal to be routed to the secondary FFT analysis that generates the delay time parameters. While considering how else I could exploit this secondary analysis, I experimented with substituting its phase information into the re-synthesis of the main analysis, a technique known as cross synthesis. By swapping the frequency or phase spectrum of one sound with that of another, one can impart qualities of one sound onto the other. This method produces especially interesting results when the amplitude spectrum of a percussive sound is re-synthesized with the phase spectrum of a harmonically rich tonal sound. My implementation also lets the user re-synthesize the spectrally delayed signal with the original signal’s phase information, which can produce interesting timbral results.
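Per frame, the cross synthesis amounts to keeping one analysis's magnitudes and the other's phases; a minimal numpy sketch:

```python
import numpy as np

def cross_synthesize(frames_a, frames_b):
    """Recombine the magnitude spectrum of A with the phase spectrum
    of B, frame by frame: |A| * exp(i * angle(B)). Feeding a percussive
    signal as A and a harmonically rich pad as B imparts the pad's
    phase structure onto the drum's amplitude envelope."""
    return np.abs(frames_a) * np.exp(1j * np.angle(frames_b))
```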

I later reworked the UI for the patch, adding controls for the new functionality I implemented. In the end I believe I improved on the repertoire and utility of the original patch; the processing is capable of subtly augmenting a signal but is easily driven to sonic extremes if that is desired.

I’ve embedded a recording of myself manipulating the patch live, feeding it a drum loop as the primary input and a synth pad as the auxiliary input. The result sounds like a field recording in an electrified rainforest.

Project Resources: https://drive.google.com/drive/u/0/folders/1bAm9KaBNgVL0uBodDaZ_xvt6eol5jGei

samuelgo – Assignment 4: Just Intonation Harmonizer
https://courses.ideate.cmu.edu/18-090/f2019/2019/10/17/samuelgo-assignment-4-just-intonation-harmonizer/
Thu, 17 Oct 2019

For this assignment I implemented a patch that harmonizes incoming audio. This patch was realized with the gizmo~ object in Max; gizmo~ operates within the context of a pfft~ object, analyzing the results of the FFT operation in order to re-synthesize audio that has been shifted spectrally.

The patch allows audio to be harmonized with the following intervals, adhering to just intonation tuning: M/m3, P5, and M/m7. A simple UI lets the user mix the incoming audio with its harmonization, select the quality of the 3rd and 7th intervals, and mute the 5th.
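For reference, the just-intonation ratios behind those intervals can be tabulated directly. The sketch below uses the standard 5-limit values (whether the patch uses, say, 9/5 or 7/4 for the minor seventh is my assumption), with each ratio destined for its own gizmo~ instance as a transposition factor:

```python
# Standard 5-limit just-intonation ratios for the offered intervals;
# the patch's exact choices (e.g. 9/5 vs. 7/4 for m7) are an assumption.
JI_RATIOS = {
    "m3": 6 / 5,    # minor third
    "M3": 5 / 4,    # major third
    "P5": 3 / 2,    # perfect fifth
    "m7": 9 / 5,    # minor seventh
    "M7": 15 / 8,   # major seventh
}

def harmony_ratios(third="M3", seventh="m7", fifth=True):
    """Pitch-shift factors for one harmonization setting; each ratio
    would drive its own gizmo~ instance as a transposition factor."""
    ratios = [JI_RATIOS[third], JI_RATIOS[seventh]]
    if fifth:
        ratios.insert(1, JI_RATIOS["P5"])
    return ratios

# harmony_ratios("m3", "M7")  ->  [1.2, 1.5, 1.875]
```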

The embedded video demonstrates real-time manipulation of the patch processing a monophonic recording of a cello.

The code for this patch can be found at the following link: https://drive.google.com/drive/u/0/folders/1O0wm-jHsYGvnDhIR8q4OJUjityO-iqMK

samuelgo – Assignment 3: Ayyye … Yaaar
https://courses.ideate.cmu.edu/18-090/f2019/2019/10/01/samuelgo-assignment-3-ayyye-yaaar/
Wed, 02 Oct 2019

This week’s assignment submission is brought to you by pirates!

I chose my trusty Amen break sample to serve as the convolution guinea pig for this assignment.

I went hunting for acoustical spaces of note around my neighborhood. The first interesting IR I captured was in a hallway in an apartment building. The hallway had a nice twangy high-frequency resonance, evident in the tail of the captured IR.

The next IR I captured was in the basement of a building. The basement in question was solid concrete, and I was expecting a decent IR from the space, but the positioning of the microphone turned out to dampen the perceived spaciousness, and alas, I was out of balloons …

Next I turned to Audacity in an attempt to mangle my captured IRs into something more experimental. I stumbled upon the idea of applying an envelope filter effect to the hallway IR; the resulting convolution was satisfyingly “funky”.

An hour of experimentation later, I was at a loss for what else to convolve my lovely drum break with. Then it hit me … throw a test signal at it! I generated a logarithmic sine sweep from 500 Hz to 20 kHz, and sure enough the resulting convolution was weird.
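Every render above boils down to the same operation: convolving the break with an IR. A minimal offline sketch with scipy, using placeholder file names:

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import fftconvolve

# Offline convolution of a dry sample with a captured impulse response.
# File names are placeholders; both files are assumed to be mono WAVs
# at the same sample rate.
rate, dry = wavfile.read("amen_break.wav")
_, ir = wavfile.read("hallway_ir.wav")

wet = fftconvolve(dry.astype(float), ir.astype(float))
wet *= np.abs(dry).max() / (np.abs(wet).max() + 1e-12)  # match dry peak level
wavfile.write("amen_hallway.wav", rate, wet.astype(np.int16))
```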

All audio files: https://drive.google.com/open?id=1DACjS4VHza50fQDN9yKv5htPZ3NMyx-7

samuelgo – Assignment 2: Four Head Tape Echo
https://courses.ideate.cmu.edu/18-090/f2019/2019/09/17/samuelgo-assignment-2-four-head-tape-echo/
Wed, 18 Sep 2019

For this assignment I aimed to implement a multi-tap delay with tap offsets, mimicking the multi-head tape echoes of yore.
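As a rough model of the idea, here is an offline multi-tap delay sketch in Python; the head offsets, gains, and feedback routing are illustrative choices, not a transcription of the patch:

```python
import numpy as np

def multi_tap_delay(x, sr, taps, feedback=0.3, mix=0.5):
    """Offline multi-tap delay: one shared delay line read by several
    "playback heads", like heads spaced along a tape loop.

    x:        mono input signal (1-D array)
    taps:     list of (delay_seconds, gain) pairs, one per head (> 0 s)
    feedback: gain from the summed heads back onto the "tape"
    mix:      dry/wet balance in [0, 1]
    """
    offsets = [(int(t * sr), g) for t, g in taps]   # delays in samples
    n_out = len(x) + max(d for d, _ in offsets)
    line = np.zeros(n_out)                          # the "tape"
    line[:len(x)] = x
    out = np.zeros(n_out)
    for n in range(n_out):
        wet = sum(g * line[n - d] for d, g in offsets if n >= d)
        line[n] += feedback * wet                   # regeneration
        dry = x[n] if n < len(x) else 0.0
        out[n] = (1.0 - mix) * dry + mix * wet
    return out

# Example: four heads at quarter-second multiples
# y = multi_tap_delay(x, 44100, [(0.25, 0.9), (0.5, 0.7),
#                                (0.75, 0.5), (1.0, 0.4)])
```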

Mangled sound:

Code:

https://drive.google.com/file/d/1yr-0V6TYl60hNG7rHSMOf_A63oJH6-EI/view?usp=sharing

samuelgo – Assignment 1: Amen Break Recursive Chop Randomization
https://courses.ideate.cmu.edu/18-090/f2019/2019/09/03/samuelgo-assignment-1-amen-break-recursive-chop-randomization/
Wed, 04 Sep 2019

The “Amen Break” is popularly sampled in drum and bass music. In this exercise I sampled the break (from the original song “Amen, Brother” by The Winstons), divided the sample into 16 slices of equal length, and randomly assigned slices to every 16th note in one measure of music at the original tempo of the song. The result is a new loop, musical in nature, composed of randomly distributed temporal elements of the original break.

I then resampled the result of this exercise and executed the same steps again: dividing the sample into slices of equal length, randomly distributing the slices among evenly spaced musical time divisions, resampling the result, and so on. A sketch of one iteration follows.
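One iteration of the process looks roughly like this in numpy (a sketch, not the original patch; note that slices are drawn with replacement, so some may repeat, matching random assignment rather than a shuffle):

```python
import numpy as np

def chop_randomize(x, n_slices=16):
    """Slice a loop into n_slices equal pieces and reassemble one
    measure from randomly chosen slices (with replacement)."""
    trim = len(x) - (len(x) % n_slices)         # make length divisible
    slices = np.split(x[:trim], n_slices)
    order = np.random.randint(0, n_slices, n_slices)
    return np.concatenate([slices[i] for i in order])

# Five iterations of chop -> resample -> chop, as described above:
# loop = amen_break
# for _ in range(5):
#     loop = chop_randomize(loop)
```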

I stopped at iteration 5 because … the process was time-consuming. At any rate, without human intervention at each step, the successive results tended to become more repetitive and less musically satisfying. A composite .WAV file detailing the evolution of the “chopped” break is linked:
