Some kind of atmosphere

In this short experiment, my goal was to create an atmosphere with some elements of a familiar setting and others that would cause the listener to question the environment. I used field recordings exclusively, but processed each sample to varying degrees, allowing some elements to remain recognizable while others were transformed into curious effects.

Pure Data: random drone reverb

One element I am interested in including in the Metamorphosis section of the installation is a set of tracks inspired by the flapping of a butterfly's wings. This PD patch attempts to create a couple of "fluttery" drones and randomly reverberate them to create a more dynamic feeling. Download the txt file (randomreverb), change the extension to .pd, and it should open.
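The patch itself is in the download above, but the underlying idea translates outside Pure Data too. Here is a minimal Python sketch, under my own assumptions, of the same two-part recipe: detuned drones with a slow amplitude "flutter", fed through a comb-filter reverb whose feedback is re-randomized at intervals. All parameter values and names are illustrative, not taken from the patch.

```python
# Sketch of the "random drone reverb" idea outside Pure Data.
# Assumptions: two detuned sine drones with an LFO "flutter",
# plus one feedback comb filter whose gain jumps to a new random
# value every half second. All values are illustrative.
import numpy as np
from scipy.io import wavfile

SR = 44100
DUR = 10.0
t = np.arange(int(SR * DUR)) / SR

# Two slightly detuned drones, amplitude-modulated by a slow LFO
# to suggest the flapping of wings.
flutter = 0.5 * (1 + np.sin(2 * np.pi * 7.0 * t))   # 7 Hz tremolo
drone = flutter * (np.sin(2 * np.pi * 110.0 * t) +
                   np.sin(2 * np.pi * 110.7 * t))

# Feedback comb "reverb" with a randomly wandering feedback gain.
delay = int(0.045 * SR)                             # 45 ms delay line
out = drone.copy()
rng = np.random.default_rng(0)
gain = rng.uniform(0.3, 0.85)
for n in range(delay, len(out)):
    if n % (SR // 2) == 0:                          # re-randomize every 0.5 s
        gain = rng.uniform(0.3, 0.85)
    out[n] += gain * out[n - delay]

out /= np.max(np.abs(out))                          # normalize
wavfile.write("random_drone_reverb.wav", SR, (out * 32767).astype(np.int16))
```

A single comb filter is a much cruder reverb than what the patch likely builds, but it shows where the randomness enters the signal path.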

Personal Project: Sonification of an Eye

I have always been fascinated by eyes. The colors, shapes, and textures of a person’s iris are as unique to that individual as their fingerprints, and so identifiable that biometric systems can distinguish one person from another with ease.

For my project, I sought to sonify the iris, to create an experimental composition based on the characteristics of my own eye. First, I needed to photograph my eye. I did this simply using a macro attachment for my phone. With a base image to work with, I then converted the image of my eye from polar to rectangular coordinates. This has the effect of “unwrapping” the circular form of the eye, so that it forms a linear landscape. These “unwrapped” eyes remind me somewhat of spectrogram images; this was one of the observations that inspired this project.
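The post above doesn't say which tool performed the unwrap, but as a hedged sketch of the step, OpenCV's warpPolar can do the polar-to-rectangular conversion in one call. The filename, center, and radius below are illustrative assumptions; in practice you would locate the iris center first.

```python
# Minimal sketch of the polar-to-rectangular "unwrap" step using
# OpenCV. Filename, center, and radius are assumptions.
import cv2

img = cv2.imread("eye_macro.jpg")
h, w = img.shape[:2]
center = (w / 2, h / 2)          # assume the iris is roughly centered
radius = min(h, w) / 2

# warpPolar maps the circular image into (radius, angle) coordinates,
# with radius on the x-axis and the angular sweep on the y-axis.
polar = cv2.warpPolar(img, (512, 1024), center, radius,
                      cv2.INTER_LINEAR + cv2.WARP_POLAR_LINEAR)

# Rotate so the angular sweep runs left to right, like a timeline,
# giving a 1024 x 512 linear "landscape".
unwrapped = cv2.rotate(polar, cv2.ROTATE_90_COUNTERCLOCKWISE)
cv2.imwrite("eye_unwrapped.png", unwrapped)

# The later "re-polarize" step for the video is just the inverse map:
back = cv2.rotate(unwrapped, cv2.ROTATE_90_CLOCKWISE)
repolarized = cv2.warpPolar(back, (w, h), center, radius,
                            cv2.INTER_LINEAR + cv2.WARP_POLAR_LINEAR +
                            cv2.WARP_INVERSE_MAP)
```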

Next, I sought to sonify this image. How could I read it like a score? I settled on using Metasynth, which conveniently contains a whole system for reading images as sound. However, the unprocessed image was difficult to change into anything but chaotic noise, so to develop the different “instruments” in my composition, I processed the original image quite a bit.

Using ImageJ, a program produced by the National Institutes of Health, I was able to identify and separate the various features of my eye into several less complicated images. These simpler images were much easier to transform into sound. Separating the color channels of the image allowed me to create different drones; identifying certain edges, valleys, and fissures allowed me to develop the more "note-like" elements of the composition.
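ImageJ does this interactively, but the same separation can be sketched in a few lines of Python. The thresholds and filenames below are illustrative assumptions, not the exact ImageJ operations used.

```python
# Rough Python equivalent of the layer-separation step done in
# ImageJ: split color channels for the drone layers, and pull out
# edges for the sparser, note-like material. Thresholds and
# filenames are assumptions.
import cv2

img = cv2.imread("eye_unwrapped.png")

# One grayscale layer per color channel (OpenCV orders them B, G, R).
for name, channel in zip(("blue", "green", "red"), cv2.split(img)):
    cv2.imwrite(f"layer_{name}.png", channel)

# Edge layer: Canny picks out the fissures and valleys of the iris.
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, threshold1=60, threshold2=150)
cv2.imwrite("layer_edges.png", edges)
```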

These are just a few of the layers I created for use in my soundscape; there were seven in total. In Metasynth, I was able to import these images and read them like scores. Each layer has a different process, but this is the general premise: I would pass a wave or noise (I chose pink noise for most tracks) into a wavesynth, grainsynth, or sampler. I would define the size of the image and a tone map that divides it into frequencies based on its pixels. For instance, a micro-32 map would divide my 512-pixel-high image into 16 notes. I am still working on an understanding of exactly how Metasynth works, but from this point the synthesizer can read the image like a score. Through experimentation, I developed a loop from each of the seven images I had processed from the photograph of my eye. Each loop represented one rotation around the contours of the eye.
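Metasynth's engine is proprietary, but the basic image-as-score premise can be approximated: divide the image height into bands, assign each band a frequency from a tone map, and let pixel brightness drive that band's amplitude over time. The toy sketch below uses a bank of sine oscillators instead of Metasynth's wavesynth or grainsynth, purely as an illustration; the tone map and timing values are my own assumptions.

```python
# Toy "image as score" reader approximating the premise above:
# rows are grouped into 16 bands, each band gets one note from a
# tone map, and brightness controls amplitude over time. This is a
# deliberate simplification of what Metasynth actually does.
import numpy as np
import cv2
from scipy.io import wavfile

SR = 44100
img = cv2.imread("layer_edges.png", cv2.IMREAD_GRAYSCALE)
h, w = img.shape
n_notes = 16                       # e.g. a 512-pixel-high image -> 16 bands
band_h = h // n_notes

# Illustrative tone map: stacked fourths, low notes at the bottom.
base = 110.0
freqs = base * 2 ** (np.arange(n_notes) * 5 / 12.0)

sec_per_col = 0.05                 # 50 ms of audio per pixel column
samples_per_col = int(SR * sec_per_col)
out = np.zeros(w * samples_per_col)
t = np.arange(len(out)) / SR

for i in range(n_notes):
    band = img[h - (i + 1) * band_h : h - i * band_h, :]
    env_cols = band.mean(axis=0) / 255.0         # brightness -> amplitude
    env = np.repeat(env_cols, samples_per_col)   # step envelope per column
    out += env * np.sin(2 * np.pi * freqs[i] * t)

out /= np.max(np.abs(out)) + 1e-9
wavfile.write("eye_loop.wav", SR, (out * 32767).astype(np.int16))
```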

In Audacity, I took all of these loops and mixed them into an experimental composition. As a visual aid to what is being played, I re-polarized the images to produce the video I played in class. The sweeping arm across the image represents the location on the eye being sonified at that moment. However, since I was creatively mixing between layers in the composition (rather than just fading from one image layer to the next in the video), it wasn't always easy to follow the connection between image and sound. A further improvement to this project would be to create a video where the opacity of each image layer reflects the exact mixing I was doing in the audio track.
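That improvement could even be automated: compute a short-window loudness envelope for each exported audio stem and use it directly as the opacity of the corresponding image layer. A hedged sketch of the envelope extraction follows; the stem filenames and 30 fps frame rate are assumptions.

```python
# Sketch of the proposed improvement: derive each layer's opacity
# from its audio stem's loudness, so the video mix tracks the audio
# mix exactly. Stem filenames and frame rate are assumptions.
import numpy as np
from scipy.io import wavfile

def rms_envelope(path, hop_s=1 / 30):
    """Per-video-frame RMS loudness of a stem, normalized to 0-1."""
    sr, x = wavfile.read(path)
    x = x.astype(np.float64)
    if x.ndim > 1:                    # fold stereo to mono
        x = x.mean(axis=1)
    hop = int(sr * hop_s)             # one value per 30 fps video frame
    frames = [np.sqrt(np.mean(x[i:i + hop] ** 2))
              for i in range(0, len(x) - hop, hop)]
    env = np.array(frames)
    return env / (env.max() + 1e-9)

# One opacity curve per layer; a compositing script can then key
# each image layer's alpha to these values frame by frame.
opacity = {name: rms_envelope(f"stem_{name}.wav")
           for name in ("red", "green", "blue", "edges")}
```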
