I often find that people don’t realize that conducting an ensemble in the marching arts is very different from conducting a wind ensemble or an orchestra. Drum majors are often called the “time-keepers” of a marching band or corps. The hardest part of their job is dealing with the physics of sound in the marching arts’ natural environment, the football field. On top of ignoring sounds that bounce off the concrete in stadiums and the delays coming from performing members who are facing the opposite direction, drum majors have to keep their conducting patterns consistent, precise, and extremely clear.
This is what inspired me to create this project: a tool that lets beginner conductors analyze their patterns to help them on the field. High school students who are just starting to learn the basics and foundations of conducting a marching band tend to rush or drag certain beats of their patterns. For example, in a simple 4/4 pattern, because the pathways between beats have different lengths, beginners naturally tend to be slow on beat 3 and rush into beat 4.
The tool that I created consists of two visuals: one shows where the metronome hits (shown in red), and the other is where the conductor’s beats hit (shown in blue).
I started by coding a metronome. At first, I used an impulse signal, but I ran into problems: not all the impulses appeared in the visual display because the impulse signal was too short. So to create the metronome click, I recorded the sound of a pen tapping against glass and trimmed it to length in Audacity.
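The patch itself is built in Max, but the reason a one-sample impulse fails is easy to see in any language: a click needs some duration to show up in a waveform display. As a rough Python sketch (not the actual patch), a short decaying sine burst works as a visible, audible click:

```python
import numpy as np

def make_click(sr=44100, freq=2000.0, dur=0.03):
    """A short decaying sine burst -- visible in a display, unlike a 1-sample impulse."""
    t = np.arange(int(sr * dur)) / sr
    return np.sin(2 * np.pi * freq * t) * np.exp(-t / 0.005)

def click_track(bpm, beats, sr=44100):
    """Place a click at every beat of the given tempo."""
    spacing = int(sr * 60.0 / bpm)          # samples between beats
    click = make_click(sr)
    out = np.zeros(spacing * beats)
    for i in range(beats):
        start = i * spacing
        out[start:start + len(click)] += click
    return out

track = click_track(bpm=120, beats=8)       # 8 beats at 120 BPM = 4 seconds
```

The decay time and frequency here are arbitrary choices; the recorded pen-on-glass click in the actual patch serves the same purpose.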
For the second display wave, I used a contact microphone to show where the conductor’s beats were landing. This microphone can be connected to a music stand.
From there, all the user has to do is set the tempo they want to conduct at and hit the record button to capture their conducting pattern. The recording is displayed on the two visuals, where the conductor can check whether their beats are consistent and figure out how to fix them.
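The patch compares the two waveforms visually, but the same comparison could be made numeric: detect the conductor’s onsets and measure how far each lands from the metronome grid. A minimal sketch, assuming a simple threshold onset detector (the actual patch does not do this):

```python
import numpy as np

def onset_times(signal, sr, threshold=0.5, refractory=0.2):
    """Crude onset detector: times where |signal| crosses the threshold,
    ignoring re-triggers within a short refractory window."""
    onsets, last = [], -refractory
    for i, s in enumerate(np.abs(signal)):
        t = i / sr
        if s >= threshold and t - last >= refractory:
            onsets.append(t)
            last = t
    return onsets

def beat_errors(onsets, bpm):
    """Offset in seconds of each detected beat from the nearest metronome beat."""
    period = 60.0 / bpm
    return [t - round(t / period) * period for t in onsets]

# Toy example: the 'conductor' lands beat 3 of a 120 BPM bar 70 ms late
sr = 1000
taps = np.zeros(3 * sr)
for t in [0.0, 0.5, 1.07, 1.5]:
    taps[int(t * sr)] = 1.0
errors = beat_errors(onset_times(taps, sr), bpm=120)
```

A positive error means the beat landed late, a negative one early, which matches the rushing/dragging tendencies the tool is meant to expose.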
Here is the link to the patch:
https://drive.google.com/open?id=1HrdEWJYiKBX41kkMyY01yWA7dxYtgG9J
Originally, I started off with the simple idea of taking a few shapes and scaling them up with the peak amplitude of the input audio file. But as I explored further, I began adding multiple objects, creating strings of circles using jit.mo.func. I kept adding and exploring until I realized I could use amplitude to change not only the size of the shapes but also the speed, or frequency, at which they were traveling.
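The patch does this mapping inside Max/Jitter with jit.mo.func, but the core idea is language-independent: compute a per-frame amplitude and map it onto visual parameters. A rough Python analogue (all parameter names and ranges here are made up for illustration):

```python
import numpy as np

def frame_rms(signal, frame_len=1024):
    """Per-frame RMS amplitude of an audio signal."""
    n = len(signal) // frame_len
    frames = signal[:n * frame_len].reshape(n, frame_len)
    return np.sqrt((frames ** 2).mean(axis=1))

def amp_to_params(rms, base_radius=10.0, max_extra=40.0, base_speed=1.0):
    """Map normalized amplitude onto a circle radius and an orbit speed,
    the way the patch drives jit.mo.func parameters from amplitude."""
    peak = rms.max()
    norm = rms / peak if peak > 0 else rms
    radius = base_radius + max_extra * norm
    speed = base_speed * (1.0 + norm)
    return radius, speed

# Toy signal: quiet first half, loud second half
sig = np.concatenate([0.1 * np.ones(4096), 0.8 * np.ones(4096)])
radius, speed = amp_to_params(frame_rms(sig))
```

Louder frames get bigger, faster circles; quieter frames shrink and slow them, which is the firefly-like behavior described below.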
Overall, I thought this was a really fun project to do. The Max patch I created allows users to tweak and diverge from what I made in the video linked below. I just personally liked the look of fireflies flying around.
Here are the max patch and sample video:
https://drive.google.com/open?id=1o2-DXCs2YhIeBwMUNCBw15V470eoDJvF
Here is my result:
I couldn’t translate what was coming out of Max into SoundCloud directly, so I routed the original chime sounds through the left channel and the delays through the right channel. Listening to the patch in Max with only the left earbud, only the right earbud, or both at once each sounds noticeably different.
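The routing itself is simple to sketch outside of Max: dry signal on the left, a stack of decaying delayed copies on the right. A minimal NumPy illustration of that channel split (the delay time, feedback, and tap count are assumptions, not the patch’s actual settings):

```python
import numpy as np

def dry_left_wet_right(mono, sr, delay_s=0.25, feedback=0.4, taps=3):
    """Original signal on the left channel, decaying delay taps on the right."""
    d = int(sr * delay_s)
    right = np.zeros(len(mono) + d * taps)
    for i in range(1, taps + 1):
        # Each tap is delayed i*d samples and attenuated by feedback**i
        right[i * d : i * d + len(mono)] += mono * feedback ** i
    left = np.zeros_like(right)
    left[:len(mono)] = mono
    return np.stack([left, right], axis=1)   # shape (samples, 2)

sr = 8000
chime = np.sin(2 * np.pi * 880 * np.arange(sr) / sr)  # 1 s test tone
stereo = dry_left_wet_right(chime, sr)
```

With one earbud you hear only one of the two layers, which is why each listening configuration sounds different.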
Here is my code to do that:
https://drive.google.com/drive/folders/16lajp5RVWHUJlxE5NR4ropWmR_3XqIRU?usp=sharing
For this assignment, my 4 IR recordings are as follows:
I recorded the first 2 IRs by popping a balloon and recording it with my phone. The 3rd IR is from a video of a crackling fireplace; I loaded the audio into Audacity and added an echo effect with a delay time of 0.2 seconds. The last IR is from my friend’s performance of his solo. I liked that you could hear the audience in the background, and I wanted to see how that would affect the convolution with the original recording.
Here is the link to my 4 IRs, the original recording, and the convoluted results: https://drive.google.com/drive/u/2/folders/17-l876Z0y5hSn8wZCPRb_zWJIGr_Upt9
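Under the hood, convolution reverb is just the convolution of the dry recording with the impulse response. A minimal NumPy sketch of that operation on a toy signal (the real IRs above would be loaded from audio files instead):

```python
import numpy as np

def convolve_ir(dry, ir):
    """Convolve a dry recording with an impulse response and renormalize --
    the same operation a convolution-reverb process performs."""
    wet = np.convolve(dry, ir)
    peak = np.max(np.abs(wet))
    return wet / peak if peak > 0 else wet

# Toy example: a single 'clap' through an IR with a direct sound and two echoes
dry = np.array([1.0, 0.0, 0.0, 0.0])
ir = np.array([1.0, 0.0, 0.5, 0.0, 0.25])
wet = convolve_ir(dry, ir)
```

Every sound in the IR (balloon pop reflections, fireplace crackle, audience noise) gets smeared onto every sample of the dry recording, which is why the fourth IR carries the audience ambience into the result.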
I originally just copied the track and added multiple time delays to generate the time shift. After doing so, I realized I wanted to push it further, so I added a pitch shift to 2 of the delayed copies to create this celestial effect.
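The patch does this in Max, but the structure is easy to sketch in Python: sum delayed copies of the track, pitch-shifting some of them. The pitch shift below is the naive resampling kind, which also changes duration, and all delay times and the shift interval are illustrative assumptions:

```python
import numpy as np

def pitch_shift(signal, semitones):
    """Naive pitch shift by resampling -- raises pitch but shortens duration."""
    ratio = 2 ** (semitones / 12.0)
    idx = np.arange(0, len(signal), ratio)
    return np.interp(idx, np.arange(len(signal)), signal)

def layered_delays(track, sr, delays_s, shift_semitones=7):
    """Sum delayed copies of the track, pitch-shifting every other copy
    for a shimmering, 'celestial' layer."""
    out = np.zeros(len(track) + int(sr * max(delays_s)))
    out[:len(track)] += track
    for i, d in enumerate(delays_s):
        copy = pitch_shift(track, shift_semitones) if i % 2 else track
        start = int(sr * d)
        out[start:start + len(copy)] += copy * 0.5
    return out

sr = 8000
track = np.sin(2 * np.pi * 220 * np.arange(sr) / sr)  # 1 s test tone
mix = layered_delays(track, sr, delays_s=[0.2, 0.4, 0.6])
```

The shifted copies land a fifth (or whatever interval you pick) above the original, smeared across the delays, which produces the shimmering quality described.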
This was the ending result:
I think the effect that the song gave was pretty neat. If someone were to use this code, I’d advise them to stay away from songs with a lot of unpitched percussion instruments. I originally used a different song, but the drum beats and the constant hi-hat stuck out a lot through the time-shift manipulation.
Recently, a friend of mine introduced me to an app called Huji Cam, a retro disposable-camera photo app that adds red-hued filters to your photos as soon as you take them. The filters are completely random; you cannot choose or change them.
As an example to demonstrate what Huji does, I took the following 4 photos of my apartment keys. For comparison, I used my normal phone camera for the first one (left) and took the other 3 using the Huji app. You can see that each of the photos has a different filter:
As I was playing around on the app and taking random photos, I got the idea to layer pictures together to see if I could come up with some kind of abstract art. I walked into the nearest hallway, and, standing at one end, I took a photo using Huji, took a step forward, took another photo, and repeated the process until I reached the end of the hallway.
This is the compilation of the 30 images that I used to create “Down the Hallway” —>
I then tested a few free photo layering apps that I found on the app store and came across Photoblend.
Using this app, I took the first 2 images that I shot in the hallway and blended them together. I then took the edited photo and layered the 3rd photo on top of it. I repeated this process until all 30 photos were blended and layered together.
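The layering was done by hand in Photoblend, but the process is just a left-to-right fold over the photos. A Pillow sketch of the same idea, assuming a simple 50/50 blend at each step (Photoblend’s actual blend math may differ):

```python
from PIL import Image

def blend_stack(images, alpha=0.5):
    """Fold a list of images into one: blend the running result with each
    next photo, the way the hallway shots were layered one at a time."""
    result = images[0]
    for img in images[1:]:
        result = Image.blend(result, img.resize(result.size), alpha)
    return result

# Toy example: three solid-colour frames standing in for hallway photos
frames = [Image.new("RGB", (4, 4), c)
          for c in [(255, 0, 0), (0, 255, 0), (0, 0, 255)]]
combined = blend_stack(frames)
```

One side effect of folding left to right is that later photos dominate: each earlier layer is halved again with every new blend, which is worth keeping in mind when ordering the shots.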
This was the final result:
For your reference, here are some iterations showing the evolution of the photo: