The roles for the project were the following:
Nick Ericson – Max programming
Jack Kasbeer – Research and ideas
Arnav Luthra – Sound Editing & Documentation
Project Summary: To approach the project goal of making a sound space, we decided to literally create a space with an interactive sound piece. Our initial idea was to create a sort of “room” where a webcam tracks the movements of people within a designated space and triggers certain sounds based on their location. After configuring Jitter and a webcam, we found that the space we had was rather enclosed and that the software could only reliably track one person at a time. That limitation gave us the idea of creating an experience of claustrophobia for a single person who enters the space. We mapped our field recordings (each edited in different ways) to different parts of the box, deliberately using a mix of harsh and subtle noises to create a dynamic space.
Recordings:
Arnav: For my recordings, I set up the microphone in two environments: driving around in a car and cooking in my kitchen. In the kitchen I was making noodles, so I was able to get a crackling sound from the noodle package and then the crunch of the noodles as I broke them up. I also got the sound of onions sautéing in a pan. We used these sounds as more textural background noises. From the car I was able to get the sound of a turn signal, some general engine noise, and some ignition sounds. The best sound from the car recordings, however, was the car door closing. I took this sound, looped it, and added ample delay to get a loud, harsh thud that ended up being central to our piece.
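For anyone curious what that edit looks like in code rather than in an audio editor, here is a minimal Python sketch of the same idea: loop a short impact sound and run it through a feedback delay. The sample rate, delay time, feedback amount, and the synthesized stand-in for the car-door thud are all assumptions for the example, not our actual settings.

```python
import numpy as np

SR = 44100  # assumed sample rate

def feedback_delay(signal, delay_s=0.35, feedback=0.6, mix=0.7):
    """Apply a simple feedback delay to a mono signal (values are illustrative)."""
    delay_samples = int(delay_s * SR)
    out = np.copy(signal).astype(np.float64)
    for i in range(delay_samples, len(out)):
        # each sample accumulates a scaled copy of the delayed output (feedback)
        out[i] += feedback * out[i - delay_samples]
    wet = (1 - mix) * signal + mix * out
    return wet / np.max(np.abs(wet))  # normalize to avoid clipping

# stand-in for the recorded car-door thud: a short decaying burst of noise
n = int(0.2 * SR)
thud = np.random.randn(n) * np.exp(-np.linspace(0, 8, n))

# loop it a few times with silence between hits, then add the delay
gap = np.zeros(int(0.8 * SR))
looped = np.concatenate([np.concatenate([thud, gap]) for _ in range(4)])
processed = feedback_delay(looped)
```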
Nick: Some of my best field recordings were the radiator in my girlfriend’s apartment, a rattling fan in a Gates stairwell, and a creaky door spring. My recording method for stationary objects was to place the recorder directly next to the sound source and walk away for a few minutes.
Programming: The patch uses the cv.jit.track object to follow the participant’s position in real space and map it onto nodes. Clicking on the participant’s head as they enter the space gives cv.jit.track its starting point, and it then continuously updates the patch with the location of the participant’s head within the space.
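For readers without Max, the tracking step is roughly what sparse optical flow does: you pick a point and the algorithm follows it from frame to frame. Below is a minimal OpenCV/Python sketch of that behavior, not our actual patch; the camera index, window name, and click handling are assumptions for the example.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # assumed webcam index
tracked_pt = None          # set by clicking the participant's head

def on_click(event, x, y, flags, param):
    global tracked_pt
    if event == cv2.EVENT_LBUTTONDOWN:
        tracked_pt = np.array([[[float(x), float(y)]]], dtype=np.float32)

cv2.namedWindow("track")
cv2.setMouseCallback("track", on_click)

ok, frame = cap.read()
if not ok:
    raise RuntimeError("could not read from webcam")
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if tracked_pt is not None:
        # follow the clicked point from frame to frame (Lucas-Kanade optical flow)
        new_pt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, tracked_pt, None)
        if status[0][0] == 1:
            tracked_pt = new_pt
            x, y = tracked_pt[0][0]
            cv2.circle(frame, (int(x), int(y)), 6, (0, 255, 0), -1)
    prev_gray = gray
    cv2.imshow("track", frame)
    if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```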
We then use the tracked position’s proximity to each node to determine the volume of the looped field recordings, which we mixed and spatialized with the hoa.2d suite.
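As a rough sketch of that proximity-to-volume mapping (the node layout, falloff radius, and gain curve here are assumptions; the real values lived in the Max patch), each loop’s gain could simply fall off with distance from its node:

```python
import math

# hypothetical node positions in normalized camera coordinates (0..1)
NODES = {
    "car_door": (0.1, 0.1),
    "radiator": (0.9, 0.1),
    "noodles":  (0.1, 0.9),
    "fan":      (0.9, 0.9),
}
RADIUS = 0.6  # assumed falloff radius beyond which a loop is silent

def node_gains(px, py):
    """Map the tracked head position to a gain (0..1) for each looped recording."""
    gains = {}
    for name, (nx, ny) in NODES.items():
        dist = math.hypot(px - nx, py - ny)
        gains[name] = max(0.0, 1.0 - dist / RADIUS)
    return gains

# e.g. standing near the car-door corner makes that loop loudest
print(node_gains(0.15, 0.2))
```

In the patch, the resulting gains drive the level of each looping recording before the mix is spatialized.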
Setup: The day before the presentation, we booked the media lab to set up and troubleshoot our project. We mounted a webcam to a beam in the ceiling of the lab and secured it with a hefty amount of masking tape. We had some issues getting the webcam to work properly with Nick’s laptop and Max, but we managed to get everything working in the end. Once everything was running, we experimented with the arrangement of the sounds in the space and made last-minute edits to the recordings.
Presentation and Closing Remarks: Below are the videos we recorded of the brave participants in our project. Luigi’s interaction with the piece was definitely interesting!
Arnav: It was nice to hear in our classmates’ reflections that the car door sound was harsh and jarring, as that was exactly the effect I intended. The comments about having the space’s sounds evolve over time, so that the participant never gets “comfortable,” were really valuable; if I were to continue this project, that is definitely something I would implement.