A Cyborg Duet in Ode to Bach
Experimental Sound Synthesis (57-344/60-407), Spring 2017 – https://courses.ideate.cmu.edu/57-344/s2017
Posted March 6, 2017 by jkasbeer@andrew.cmu.edu – https://courses.ideate.cmu.edu/57-344/s2017/a-cyborg-duet-in-ode-to-bach/

Summary
To accomplish the goal of having a computer perform in real time with a human, we "faked" a violin duet (Abby and Nick), accompanied by a MIDI keyboard running through Ableton Live (Joey).

With this, we were able to produce a version of Bach's Partita in D Minor, No. 1 that is a genuinely unique addition to the long tradition of altering and remixing classical music.

[Embedded media: .wav audio, .mov video]

Process
Our ideation naturally began with how we could combine live human performance with live computer performance. From the start, we were lucky enough to have both Abby and her electric violin; Nick is always excited to put his Max abilities to the test, and Joey was quick to volunteer for the MIDI keyboard to fill in any empty space that would naturally exist in our piece. With the human side of the real-time performance settled quickly, we went through a few ideas for the computer side.

At first, we considered using a pedal board to let Abby create, play, and pause loops, but we quickly realized this would put a lot of strain on her, and that the looping could be handled by the computer anyway. We ultimately decided to "fake" a duet using the Max patch Nick made, with Joey on the MIDI keyboard connected to Live.

Max Patch
The pitch and volume of the incoming signal control the playback position and volume of grainstretch, respectively. We use gbr.yin to track the pitch of Abby's violin and a meter object to track the incoming amplitude. After audio is recorded into Silo#0, a timer sends the length of the recording to a scale object attached to the tracked pitch, so that the pitch can accurately control the position of the grain playback.
As a side note, Nick built in a lot of extra functionality that we didn't use (e.g. the transport control and the ability to record and loop the data from the violin).
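The pitch-to-position mapping above can be sketched outside of Max. Below is a minimal Python sketch of that control flow, assuming illustrative values throughout: the pitch range and buffer length are hypothetical stand-ins, not numbers from the actual patch, and `scale` imitates Max's [scale] object.

```python
# Sketch of the grainstretch control mapping: tracked pitch -> playback
# position within the recorded buffer. All ranges here are assumptions
# for illustration, not values from Nick's patch.

def scale(x, in_lo, in_hi, out_lo, out_hi):
    """Linear remapping, like Max's [scale] object (no clamping)."""
    return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

# Hypothetical violin pitch range (open G up to a high E, in Hz),
# and a hypothetical 8000 ms recording length reported by the timer
# after recording into the buffer.
PITCH_LO, PITCH_HI = 196.0, 1319.0
BUFFER_MS = 8000.0

def grain_position(tracked_pitch_hz):
    """Map the tracked pitch onto a playback position (ms) in the buffer."""
    return scale(tracked_pitch_hz, PITCH_LO, PITCH_HI, 0.0, BUFFER_MS)

print(grain_position(196.0))   # lowest pitch -> start of buffer: 0.0
print(grain_position(1319.0))  # highest pitch -> end of buffer: 8000.0
```

In the patch itself, the timer output takes the place of the hard-coded `BUFFER_MS`, so the mapping stays accurate no matter how long the recording into Silo#0 is.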
[source code] https://gist.github.com/nikerk34/814ca8a7e43eca9f5f5b4f1c9fd48a54
[externals] http://ftm.ircam.fr/index.php/Gabor_Modules
http://www.maxobjects.com/?v=objects&id_objet=4758

Presentation and Closing Remarks
The presentation went extremely well aside from a few technical difficulties at the beginning, which came down to simply not having turned the audio on. Other than this hiccup, the keyboard volume for Joey's side of the performance could have been slightly higher, but we received great feedback from the class.

Performance
https://drive.google.com/open?id=0BzxqdpE9VUgJb0VuTUoxVG52THM
[AUDIO COMING SOON]

Credits
Technology and production by Nick Erickson (Max programming), Abby Adams (live violin), and Joey Santillo (live synth); documented by Jack Kasbeer.
