Personal Project: FFT Foot Pedal for Classical Guitar

Because the sound produced by a nylon-string guitar decays quickly after a string is plucked, the instrument is not very useful for long sustained notes. Inspired by the sustain pedal present on most pianos, I created a software patch in Max 7 that lets a guitarist use a MIDI foot pedal as an enhanced sustain pedal for guitar. By pressing the foot pedal’s buttons, a guitarist can freeze their current sound and artificially sustain it, layering up to five sustained sounds on top of each other. The guitarist can also use the pedal to slowly fade out all currently sustained sounds, or just the last one they added.

Screenshot of Max Patch

Hardware For this project I attached a bridge pickup to a classical guitar and sent the analog signal through a MOTU 4pre audio interface into my laptop. For the foot pedal, I used a Logidy UMI3 MIDI foot controller.

Software I used Jean Francois Charles’ freeze patch as a starting point for my patch to save a matrix of FFT data for a given sample. This sample is then repeatedly resynthesized using an inverse FFT to produce a drone, with some rev3 reverb added on top. Several drones can be sustained at once by summing their signals. Starting and stopping the drones is done gradually using the line object.
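Since the patch itself is visual, here is a rough Python/numpy sketch of the general spectral-freeze idea: capture one windowed FFT frame, then repeatedly inverse-FFT its magnitude spectrum with fresh random phases and overlap-add the results into a continuous drone. The function names, FFT size, and hop size are my own illustrative choices, not values taken from the actual patch.

```python
import numpy as np

SR = 44100          # sample rate (assumed)
N = 4096            # FFT size (illustrative choice)
HOP = N // 4        # overlap-add hop size
window = np.hanning(N)

def freeze(samples):
    """Capture the magnitude spectrum of one windowed frame of the input."""
    return np.abs(np.fft.rfft(samples[:N] * window))

def drone(magnitude, seconds=5.0):
    """Resynthesize the frozen spectrum over and over with random phases,
    overlap-adding the frames so the result sustains smoothly."""
    n_frames = int(seconds * SR / HOP)
    out = np.zeros(n_frames * HOP + N)
    for i in range(n_frames):
        phases = np.exp(1j * np.random.uniform(-np.pi, np.pi, magnitude.shape))
        frame = np.fft.irfft(magnitude * phases) * window
        out[i * HOP:i * HOP + N] += frame
    return out / np.max(np.abs(out))

# In the patch, several drones are layered by simply summing their signals,
# and each one is faded in or out with a gradual ramp (the line object).
```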

Source Code

Golan Levin Reflection: Visualizing Music Performances

What stood out to me the most from Golan Levin’s lecture was the orchestra visualization that consisted of projecting close-ups of people’s faces as they listened to the orchestra play a piece. When I think of visualizing music, I typically think of an animation that syncs up with the music and is carefully designed to elicit a particular emotional response from the listener. This was very different, and the brilliance of the piece was how organic it was. Audience members could look at the projection and see in detail how the music affects others, comparing their own experience of the music to someone else’s. They weren’t led into a specific emotional response by the visuals; rather, their listening experience was enhanced by footage of natural human responses to the same music. With several faces projected behind the orchestra, audience members can clearly see that everyone experiences the same piece of music differently, and they don’t have to feel like there is a “right” way to experience an orchestral performance.

Nowadays, many orchestras are struggling with ticket sales and are trying to remedy the situation by adding modern twists to concerts, including playing popular music (for instance, the Pittsburgh Symphony’s FUSE concert series) and incorporating visual art into performances. While it’s great that orchestras are making an effort to find a place for themselves in the modern world, focusing too much on adding flashy visuals to a performance can take away from the music. Classical music is beautiful in its own right, and the piece Levin presented in class is a great example of how technology can be used to enhance the experience of classical music instead of overshadowing it.

Also, as an example of how the classical music concert can be modernized without ruining the musical experience, here is how the Toronto Symphony cleverly uses graphic design to help audience members understand the structure of the pieces they are listening to.

Project 1: Leap Motion Controlled Ambisonics

Presentation setup in Media Lab

For our project, we incorporated a Leap Motion into a field recording soundscape composition in order to create a sense of organic control over spatial motion. Our project was created in Max MSP, using the Higher Order Ambisonics Library to place sounds in space over an 8-speaker setup. We also used the aka.leapmotion library to capture palm position data in the XY plane. In the patch, the user can dynamically move the source of each sound clip around the room by moving their hand over the Leap Motion.
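As a rough illustration of the mapping the patch performs (the actual encoding and 8-speaker decoding are handled by the HOA library objects), the Python sketch below converts a palm position into an azimuth and radius and encodes a mono sample into first-order horizontal B-format. The function names and the first-order simplification are my own assumptions, not part of the patch.

```python
import math

def palm_to_polar(x, y):
    """Convert palm position in the horizontal plane to azimuth (radians) and radius."""
    azimuth = math.atan2(y, x)
    radius = min(math.hypot(x, y), 1.0)  # clamp to the unit circle
    return azimuth, radius

def encode_first_order(sample, azimuth):
    """First-order horizontal B-format encoding of a mono sample."""
    w = sample / math.sqrt(2)       # omnidirectional component
    x = sample * math.cos(azimuth)  # front-back component
    y = sample * math.sin(azimuth)  # left-right component
    return w, x, y
```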

The sound clips featured in our project include field recordings of moving robots, ticking clocks, and cooking sounds such as boiling, sizzling, chopping, and pouring. In our sound design we sought to take these everyday sounds and manipulate them into something alien and sci-fi, creating an ambient soundscape.

Compressed stereo mix of our composition:

Max Patch in Presentation Mode
Full Max Patch

Instructions for Use To begin recording palm data, simply connect a Leap Motion to your computer and press the purple button on the Max patch. To select a sound clip, press the corresponding number on your computer keyboard. Once a sound clip is selected it will begin to play, and you can move your hand in the XY plane above the Leap Motion to control the position of the sound in the room. To lock a sound you are moving in place, press the space bar. To lock a sound in place and begin moving another, press the number of the new sound you would like to move. You can start and stop clips by pressing the corresponding green and red buttons (fade-in and fade-out are built in), and adjust the level of each sound clip by moving the corresponding slider.
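For clarity, here is a hypothetical Python sketch of the selection and locking behavior described above; the actual logic lives in the Max patch, and the class and method names are mine.

```python
class ClipController:
    """Only one clip follows the palm at a time; the rest stay where they were locked."""

    def __init__(self, n_clips):
        self.positions = {i: (0.0, 0.0) for i in range(1, n_clips + 1)}
        self.active = None  # clip currently following the palm, or None if all are locked

    def key_number(self, n):
        """Pressing a number selects that clip; any previously moving clip stays locked in place."""
        self.active = n

    def key_space(self):
        """The space bar locks the moving clip in place without selecting a new one."""
        self.active = None

    def palm_update(self, x, y):
        """New palm data from the Leap Motion moves only the active clip."""
        if self.active is not None:
            self.positions[self.active] = (x, y)
```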

Screen recording of the Leap Motion moving sound clips (silent)

Click here for Max patcher code and supporting sound clips

Contributions
Sara Adkins: Recording, assisting sound design, main Max patch development, documentation
Dan Moore: Recording, main sound design, assisting Max patch development, performance
Estella Wang: Recording