jkorzeni@andrew.cmu.edu – Experimental Sound Synthesis (57-344/60-407, Spring 2017)

Head Test Indian Pattern

Taking Kirk Pearson and BIT’s Indian Head Test Pattern (which I helped compose and produce) as a springboard, I delved into convolution reverb. I took both other pieces I had composed and sounds I had recorded, edited them, and then used them as impulse responses to create this drone-y piece that moves through different sound worlds. The technique that struck me most in both Alexander’s music and the other work he showcased for us was convolution reverb, and I hoped to emulate the beautiful soundscapes I heard.
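I did the convolving inside my editing tools, but the underlying operation is just convolving the dry recording with the chosen impulse response. A minimal offline sketch, assuming mono files and placeholder file names:

```python
import numpy as np
from scipy.signal import fftconvolve
import soundfile as sf   # assumed I/O library

dry, sr = sf.read("dry_source.wav")        # placeholder file names
ir, sr_ir = sf.read("edited_impulse.wav")
assert sr == sr_ir, "resample the IR to match the dry signal first"

# Mix both down to mono for simplicity; a real session would keep channels.
if dry.ndim > 1:
    dry = dry.mean(axis=1)
if ir.ndim > 1:
    ir = ir.mean(axis=1)

# Convolving the dry sound with the impulse response imprints the IR's
# spectral and temporal character (its "space") onto the source.
wet = fftconvolve(dry, ir, mode="full")
wet /= np.max(np.abs(wet))                 # normalize to avoid clipping
sf.write("convolved.wav", wet, sr)
```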

Let’s Go: The Go Step Sequencer and Melody Creator

Technical:

We used OpenCV to process a live stream of a Go board captured with a Logitech HD webcam. The OpenCV program masked the black and white regions of the stream in order to isolate the black and white Go pieces on the board, then used OpenCV’s blob detection algorithm to find the center of each piece. The program also detected the edges of the board to interpolate the location of each intersection, and compared the blob centers to the intersection positions in order to output two continuously updating binary matrices that indicate where all the black and white pieces are on the board. The matrix data was then sent to Max using OSC.
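The real pipeline lives in our OpenCV program and Max patch; the sketch below is only a rough Python reconstruction of the vision-to-OSC step. The 13×13 board size, OSC addresses and port, color thresholds, and hard-coded board corners are all assumptions for illustration.

```python
import cv2
import numpy as np
from pythonosc.udp_client import SimpleUDPClient  # assumed OSC transport library

BOARD_SIZE = 13                                   # assumption: 13x13 board
client = SimpleUDPClient("127.0.0.1", 7400)       # assumed port of Max's [udpreceive]

# Blob detector configured to find bright blobs in a binary mask.
params = cv2.SimpleBlobDetector_Params()
params.filterByColor = True
params.blobColor = 255
detector = cv2.SimpleBlobDetector_create(params)

def stones_to_matrix(frame, lower, upper, corners):
    """Mask one color, blob-detect stone centers, and snap them to intersections."""
    mask = cv2.inRange(frame, np.array(lower), np.array(upper))
    keypoints = detector.detect(mask)
    matrix = np.zeros((BOARD_SIZE, BOARD_SIZE), dtype=int)
    (x0, y0), (x1, y1) = corners                  # board bounding box in pixels
    step_x = (x1 - x0) / (BOARD_SIZE - 1)
    step_y = (y1 - y0) / (BOARD_SIZE - 1)
    for kp in keypoints:
        col = int(round((kp.pt[0] - x0) / step_x))
        row = int(round((kp.pt[1] - y0) / step_y))
        if 0 <= row < BOARD_SIZE and 0 <= col < BOARD_SIZE:
            matrix[row, col] = 1
    return matrix

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners = ((50, 50), (600, 600))              # placeholder for the detected board edges
    black = stones_to_matrix(frame, (0, 0, 0), (60, 60, 60), corners)
    white = stones_to_matrix(frame, (200, 200, 200), (255, 255, 255), corners)
    client.send_message("/board/black", black.flatten().tolist())  # assumed OSC addresses
    client.send_message("/board/white", white.flatten().tolist())
```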

Within the Max patch, the matrix of black pieces was used to control a step sequencer of percussion sounds. The matrix of white pieces was split in half: the left half controlled 7 drone sounds and the right half controlled 6 melody lines. Depending on where the white pieces were placed, drones would sound at different points in the measure and certain notes of the melodies would play.
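As a rough illustration of the black-matrix mapping (the row/column layout and names below are assumptions, not taken from the patch), each column can be read as one step of the sequencer and each row as one percussion sample:

```python
import numpy as np

NUM_STEPS = 13    # assumption: one sequencer step per board column
NUM_SAMPLES = 13  # assumption: one percussion sample per board row

def step_events(black_matrix, step):
    """Return the indices of the percussion samples triggered on this step."""
    column = black_matrix[:, step % NUM_STEPS]
    return np.nonzero(column)[0].tolist()

# Example: a black stone at row 2, column 5 fires sample 2 on step 5.
board = np.zeros((NUM_SAMPLES, NUM_STEPS), dtype=int)
board[2, 5] = 1
print(step_events(board, 5))   # -> [2]
```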

Compositional:

We used Logic Pro X to create all the sounds heard from the board. The percussive sounds came from numerous drum patches designed by Logic, ranging from kick drums to frog noises. The melodic material was based on an A major suspended drone (ADEFA#C#E), synth bass noises, and LFO sound effects. Since the drones were created in a loop, there was a problem with clipping at the end of 4 bars, so we created swells within the drones so they would fade out by the time the loop would begin to clip. The melody ran on a fixed loop as soon as a stone was put down and faded naturally, while the percussion sounds were based on a 3+3+2+2+3 rhythm derived from the 13 spaces of the board.
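As a quick illustration of that additive rhythm (the zero-indexed step numbering is ours), the 3+3+2+2+3 grouping places an accent on 5 of the 13 steps:

```python
# Cumulative onsets of a 3+3+2+2+3 additive rhythm over a 13-step loop.
groups = [3, 3, 2, 2, 3]
onsets = [sum(groups[:i]) for i in range(len(groups))]
print(onsets)        # -> [0, 3, 6, 8, 10]
print(sum(groups))   # -> 13, matching the 13 spaces of the board
```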

Video:

Patch:
https://drive.google.com/open?id=0B3dc0Zpl8OsBNTAwZEJnczF3Q1k

Sound Editing in 360 Degree Animation

I’ve linked a step-by-step guide I wrote on working with 360-degree positional sound in animation. It has fun pictures! I also linked a brief spatial sound test on YouTube.

How to 360 sound design

Project 2: Mbira – Omnichord Duo

Concept:

For our project, we wanted to use the unique instruments that some of the group members owned in a live performance setting. We discovered that both the Omnichord and the Mbira have peaceful, dreamlike qualities, so we decided to pair the two instruments in a dreamlike live performance.


Mbira


Omnichord

Max Patch:

The Max patch is structured to allow the user to create loops of live instrumentals in real time while projecting a granularized version of each loop through a speaker across the room from the dry sound. We chose granular synthesis to further the trancelike direction we wanted the piece to take. The user can control grain parameters such as duration, rate, and slope by adjusting the range for each parameter; for every grain produced, the parameters are randomized within the currently selected ranges, which gives the piece a sense of movement and change. The user can also control the levels of all speakers within the patch and create crossfades between the dry and wet sounds for each loop. We used the grooveduck object to handle the loops and the 03h granular synthesis library, along with the poly~ object, to create the grains. We also used the parallel attribute of the poly~ object to parallelize the computation of each grain and reduce CPU load.
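The grain randomization happens inside the Max patch; the Python sketch below only illustrates the idea of drawing each grain’s parameters from user-set ranges. The parameter names and range values are assumptions, not the patch’s actual settings.

```python
import random

# Assumed ranges (milliseconds for duration/rate, 0-1 for slope); the real
# patch exposes these as adjustable range controls.
ranges = {
    "duration_ms": (40.0, 250.0),
    "rate_ms":     (10.0, 120.0),   # time until the next grain is spawned
    "slope":       (0.1, 0.9),      # position of the envelope peak within the grain
}

def new_grain_params():
    """Draw one set of grain parameters uniformly from the current ranges."""
    return {name: random.uniform(lo, hi) for name, (lo, hi) in ranges.items()}

# Each spawned grain gets its own randomized parameters, which is what gives
# the texture its constant sense of movement and change.
print(new_grain_params())
```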

In terms of changes to make in the future, one would be to run the Mbira through a high-pass filter to remove the physical handling sounds generated by the way it is played. We could also use movement between the speakers and a stronger sense of sound direction to keep the piece from becoming stagnant.
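If we prototyped that filter outside of Max, a minimal offline sketch could look like the following; the 100 Hz cutoff and file names are guesses for illustration, not values we tuned.

```python
import numpy as np
from scipy.signal import butter, sosfilt
import soundfile as sf   # assumed I/O library

# Assumed cutoff: high enough to cut thumps and handling noise from playing
# the Mbira, low enough to keep its tine fundamentals.
CUTOFF_HZ = 100.0

audio, sr = sf.read("mbira_take.wav")          # hypothetical input file
sos = butter(4, CUTOFF_HZ, btype="highpass", fs=sr, output="sos")
filtered = sosfilt(sos, audio, axis=0)
sf.write("mbira_take_hp.wav", filtered, sr)
```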

Link to the Max Files:

https://drive.google.com/drive/folders/0B3dc0Zpl8OsBMk9MeFJPTGRCVzQ?usp=sharing
