jhueteso@andrew.cmu.edu – 18-090, Fall 2019
Twisted Signals: Multimedia Processing for the Arts
https://courses.ideate.cmu.edu/18-090/f2019

jhueteso – Project 2: LIDAR live interactive sequencer
https://courses.ideate.cmu.edu/18-090/f2019/2019/12/07/jhueteso-project-2-lidar-live-interactive-sequencer/
Sat, 07 Dec 2019 22:32:14 +0000

For this project I wanted to build something interactive. The end goal was a system people could use to make music in a cooperative, interactive way. Responsiveness and intuitiveness were important requirements: I wanted something where you could figure out what you were expected to do just by moving things around.

For this project I used the LIDAR sensor I am currently using for my bachelor’s research project. A LIDAR sensor creates a 2D map of its surroundings by shooting a laser and timing the reflection (similar to SONAR, hence the name).

2D map of a room
RPLIDAR - 360 degree Laser Scanner Development Kit
Sensor used

Now, to play some music, I divided the 360° rotation into a series of “slices”. Each slice represents a time delay for a note (think of the numbers on a clock face: each one is played when the hand hits it).

Each slice is also given a value encoding the distance to the first object it sees. We now have clean angles and distances we can use in Max.
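As a rough sketch in Python (not the actual Max patch; the slice count and sensor range here are placeholders I picked for illustration), the angle-and-distance binning looks like this:

```python
# Sketch: bin LIDAR readings into sequencer slices.
# N_SLICES and MAX_DIST are assumptions, not values from my patch.
N_SLICES = 16

def bin_readings(readings, n_slices=N_SLICES):
    """readings: list of (angle_deg, distance_mm) pairs.
    Returns, per slice, the distance to the nearest object,
    or None if nothing was detected in that slice."""
    slices = [None] * n_slices
    width = 360.0 / n_slices
    for angle, dist in readings:
        i = int((angle % 360.0) // width)
        if slices[i] is None or dist < slices[i]:
            slices[i] = dist  # keep the closest object per slice
    return slices

# Two readings fall in slice 0; the closer one wins.
slices = bin_readings([(10.0, 1200.0), (12.0, 800.0), (100.0, 2500.0)])
```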

My patch can play drums or melody. In the drums part, the distance to the sensor controls the type of drum played: hihat, snare & hihat, snare, or kick. In the melody section, the distance to the sensor controls pitch: the closer to the sensor, the higher the pitch. I added two filters in parallel to give the user some control over the final sound.
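A hedged sketch of the two distance mappings (the zone boundaries and MIDI range here are made up for illustration, not the values in my patch):

```python
# Sketch of the distance-to-sound mappings. MAX_DIST, the zone
# boundaries, and the MIDI range are illustrative assumptions.
MAX_DIST = 4000.0  # mm

def drum_for(dist, max_dist=MAX_DIST):
    """Closest zone = hihat, farthest zone = kick."""
    zones = ["hihat", "snare+hihat", "snare", "kick"]
    i = min(int(dist / max_dist * len(zones)), len(zones) - 1)
    return zones[i]

def pitch_for(dist, max_dist=MAX_DIST, low=48, high=84):
    """Closer to the sensor = higher MIDI pitch."""
    t = max(0.0, min(dist / max_dist, 1.0))
    return round(high - t * (high - low))
```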

Finally, I added a visual interface that renders a series of cubes representing the note being played.

Link to Drive

Project 1 – Playable MIDI stepper motor instrument
https://courses.ideate.cmu.edu/18-090/f2019/2019/11/06/project-1-playable-midi-stepper-motor-instrument/
Wed, 06 Nov 2019 07:37:54 +0000

A stepper motor is a type of electric motor that rotates a very small, fixed angle every time it receives a pulse. Stepper motors are notoriously loud when driven at audible frequencies, and manufacturers that use them for precision movement go to great lengths to avoid this.

I wanted to turn this inconvenience into a musical instrument. Each stepper motor can play a single note at a time (probably more with some convolution magic), so with three motors we can play many melodies as long as we don’t exceed this limit (more on this during my presentation). This is the setup:

The motors are attached to an acrylic plate that amplifies the vibrations. Contact microphones pick up the sound and feed it into the speaker array and into Max for visualization. The steppers are driven by two Arduino microcontrollers.
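The “one note per motor” idea boils down to stepping each motor at the note’s frequency. Here is a sketch of just the MIDI-to-step-period math (the actual pulse timing happens on the Arduinos, which is not shown):

```python
# Sketch: convert a MIDI note into the step-pulse period that makes
# a stepper motor "sing" that note. Standard MIDI tuning (A4 = 440 Hz).

def midi_to_hz(note):
    return 440.0 * 2 ** ((note - 69) / 12)

def step_period_us(note):
    """Microseconds between step pulses for the given note."""
    return round(1_000_000 / midi_to_hz(note))

# Three motors = up to three simultaneous notes (one per motor).
chord = [60, 64, 67]  # C major triad, just as an example
periods = [step_period_us(n) for n in chord]
```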

Max is used for formatting the MIDI/serial packets and playing sequences, but also for sound visualization. Using jit.mo and pfft I made a series of dodecahedra that change size according to the amplitude of the frequency bins. The color of these figures also changes pseudo-randomly every time a MIDI note is played:

For some reason there is some delay in the video. This doesn’t happen live….
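A sketch of the two visual mappings, assuming normalized bin magnitudes and standard MIDI note numbers (the base size and gain values are made up):

```python
# Sketch of the visualization mappings: FFT bin amplitude -> shape size,
# MIDI note -> pseudo-random (but repeatable) color.
import colorsys
import random

def sizes_from_bins(mags, base=0.2, gain=1.5):
    """Scale each dodecahedron by its bin magnitude (assumed 0..1)."""
    return [base + gain * m for m in mags]

def color_on_note(note):
    """Seed an RNG with the note so each note always maps to a color."""
    rng = random.Random(note)
    return colorsys.hsv_to_rgb(rng.random(), 0.8, 1.0)
```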

Link to Drive

jhueteso – Assignment 4: Cubey Galaxy
https://courses.ideate.cmu.edu/18-090/f2019/2019/10/18/jhueteso-assignment-4-cubey-galaxy/
Fri, 18 Oct 2019 05:33:47 +0000

For this project I reused the cube example that uses the jit.gl.multiple object.

In the example we created in class, we used the output of the pfft patcher to modulate the size of the cubes rendered in the jit.world. I reused that idea but with a different multiple object: in this render, the cubes rotate around an axis, and their distance to that axis is controlled by the output of the pfft.
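In plain code, the “rotate around an axis, with the FFT output pushing each cube away from it” idea might look like this (the axis choice and names are mine, not from the patch):

```python
# Sketch: place n cubes on a circle around the y-axis. Each cube's
# distance to the axis comes from an FFT-derived radius list.
import math

def cube_positions(n, phase, radii):
    """phase: current rotation angle (radians).
    radii: per-cube distances to the axis (e.g. from pfft output)."""
    pts = []
    for i in range(n):
        a = phase + 2 * math.pi * i / n
        r = radii[i % len(radii)]
        pts.append((r * math.cos(a), 0.0, r * math.sin(a)))
    return pts

pts = cube_positions(4, 0.0, [1.0])
```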

Link to Drive: https://drive.google.com/drive/folders/1QDnRvUICqXNfLWNwRO3WBuNb_FXPg5uf?usp=sharing

jhueteso – Assignment 3: Barks, flies & reverb
https://courses.ideate.cmu.edu/18-090/f2019/2019/10/02/jhueteso-assignment-3-barks-flys-reverb/
Wed, 02 Oct 2019 04:52:50 +0000

Original Audio:

The original audio used for this assignment is an extract from Harry Potter and the Goblet of Fire. After the goblet chooses Harry’s name even though he wasn’t eligible, Dumbledore questions him:

(Mr. Potter wishes the film’s Dumbledore was this chill)

Normal impulse responses:

Basketball Court:

Sound of a balloon popping in the UC basketball court. Very nice reverberation thanks to the large size of the room.

Luckily, no one was playing at the time
Output
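Convolution reverb is really just this: every sample of the recorded impulse response becomes a scaled, delayed echo of the dry signal. A minimal direct-convolution sketch (real patches use FFT-based convolution because this naive version is far too slow for audio):

```python
# Sketch: direct (naive) convolution of a dry signal with an
# impulse response. O(len(x) * len(h)) -- for illustration only.

def convolve(x, h):
    """Each sample h[j] adds a copy of x, delayed by j and scaled by h[j]."""
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y
```

Convolving with a single unit impulse leaves the signal unchanged, which is a handy sanity check.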

Porter Hall’s staircase:

So, I really wanted to use a staircase. Hunt Library’s was off limits, at the risk of someone losing their job, so I used Porter Hall’s staircase instead.

Output

Experimental Impulse responses:

Let’s get weird!

Fly buzzing:

I really hate the sound of insects flying next to my face, but I have to admit I like the result. I downloaded an audio file of a fly buzzing next to a microphone. To conserve some of the voice’s properties, I used a very short snippet and applied a fade in and fade out.

Output

The result sounds almost like the voice is whispering. Also, very creepy.

Dog barks:

This audio file has three dog barks, so obviously we are going to get three replicas of our initial signal.

Output
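The “three barks, three replicas” effect is exactly what convolving with a spiky impulse response does: each spike produces one delayed, scaled copy of the input. A toy sketch with made-up delays and gains:

```python
# Sketch: an IR that is just a few spikes produces one echo per spike.
# Delays and gains here are arbitrary illustration values.

def replicas(x, impulses):
    """impulses: list of (delay, gain) spikes. Returns the sum of
    delayed, scaled copies of x -- the same result as convolving x
    with an IR that has those spikes."""
    n = len(x) + max(d for d, _ in impulses)
    y = [0.0] * n
    for d, g in impulses:
        for i, xi in enumerate(x):
            y[d + i] += g * xi
    return y

# Three "barks" at lags 0, 3 and 6 -> three copies of the input.
out = replicas([1.0, 0.5], [(0, 1.0), (3, 0.6), (6, 0.3)])
```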

Link to drive

jhueteso – Assignment 2: Trippy RGB
https://courses.ideate.cmu.edu/18-090/f2019/2019/09/17/jhueteso-assignment-2-trippy-rgb/
Tue, 17 Sep 2019 22:37:06 +0000

When you first launch the patch, nothing special happens. If you send the “open” message to jit.grab, you will see your face in the floating window, probably at something like 10 FPS. Yeah, this thing eats a lot of memory (it now runs at 480×320, which is much better).

Go ahead and press the BIG button in the middle of the patch.

Now you have a total of 6 video channels (RGB + combinations), each delayed by a different amount. However, the magnitude of this delay changes over time. At first you might not even notice it, or you might think it’s a glitchy patch. After 45 seconds the delay reaches its maximum and starts to get shorter and shorter; after another 45 seconds everything is back to normal.
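The 45-seconds-up, 45-seconds-down behavior is a triangle envelope on the delay amount. A sketch (the maximum delay in frames is a guess, not the patch’s value):

```python
# Sketch: triangle envelope driving the delay amount.
# max_delay is an arbitrary illustration value.

def delay_frames(t, period=90.0, max_delay=60):
    """Delay grows for the first half of the period, shrinks for the
    second half, then the cycle repeats."""
    phase = (t % period) / period          # 0..1 within the cycle
    tri = 1.0 - abs(2.0 * phase - 1.0)     # 0 -> 1 -> 0 triangle
    return round(max_delay * tri)
```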

You can press the button at any time to restart the effect.

Link to Drive: https://drive.google.com/drive/folders/11b3q3ZVYIcta1vwJuHKkreVsJlOaHagB?usp=sharing

jhueteso – Assignment 1: Ludwig phoning Beethoven
https://courses.ideate.cmu.edu/18-090/f2019/2019/09/02/jhueteso-assignment-1-ludwig-phoning-beethoven/
Tue, 03 Sep 2019 01:40:29 +0000

Original material:

Beethoven’s 5th symphony is possibly one of the most recognizable pieces of music out there. It is scored for 22 instruments, which give the symphony a broad range of frequencies, from the very high piccolo down to the low bass. It is also important to note its high dynamic range.

System used:

In order to transmit many phone calls over the same channel, the human voice is transmitted using a bandwidth of a little over 2 kHz (a bit less than C5 to C7, for you musicians). Nowadays these filtered signals are also digitized into a very small size and sampled at under 8 kHz (about a fifth of the sampling frequency of a typical MP3). At the other end of the line, the sound is regenerated to complete the transmission.

Surprisingly, this is enough to understand human speech. However, as anyone who has ever been put on hold knows, music does not transmit well at all over the phone. This is because instruments and singers produce frequencies well above the aforementioned range.
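To get a feel for this band-limiting, here is a crude one-pole approximation of the telephone band in Python. This is not my FPGA filter, and the 300–3400 Hz corners are the textbook voiceband, not measured values:

```python
# Sketch: crude telephone-band approximation. One-pole low-pass at the
# upper corner cascaded with a one-pole high-pass at the lower corner.
# Corner frequencies are the standard textbook voiceband, assumed here.
import math

def one_pole_coeff(cutoff_hz, fs):
    return math.exp(-2.0 * math.pi * cutoff_hz / fs)

def voiceband(x, fs=44100, lo=300.0, hi=3400.0):
    a_lp = one_pole_coeff(hi, fs)
    a_hp = one_pole_coeff(lo, fs)
    y, lp, lp_of_lp = [], 0.0, 0.0
    for s in x:
        lp = (1 - a_lp) * s + a_lp * lp            # low-pass at `hi`
        lp_of_lp = (1 - a_hp) * lp + a_hp * lp_of_lp
        y.append(lp - lp_of_lp)                     # high-pass = x - LP(x)
    return y
```

Sustained low frequencies (like a held bass note, or DC in the test below) get stripped out, while midrange content passes through attenuated.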

So, here is the question: what would happen to Beethoven’s 5th if we transmitted it over and over through a phone line? How long would it take for the band-pass filter to swallow the high piccolo? Would the dynamic range survive the normalization?

How was this effect done?

An actual phone call was placed between two phones, and both sides of the transmission were recorded. Using both signals and an adaptive filter implemented on an FPGA (an Intel Cyclone IV), we can recover the coefficients of an infinite impulse response (IIR) filter that very closely mimics the telephone transmission line. For the last part, we send the signal through this filter (once again using the FPGA) and re-sample it using a microcontroller connected to our computer.
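For intuition, here is a much-simplified FIR version of the system identification using the classic LMS algorithm. My actual setup recovers IIR coefficients on the FPGA; this Python sketch with a made-up two-tap “line” just shows the idea of learning a filter from input/output recordings:

```python
# Sketch: LMS adaptive filtering. Given the signal sent into a system (x)
# and the signal that came out (d), learn FIR weights w so that w * x ~= d.
import random

def lms_identify(x, d, n_taps, mu=0.05):
    w = [0.0] * n_taps
    buf = [0.0] * n_taps          # most recent inputs, newest first
    for xn, dn in zip(x, d):
        buf = [xn] + buf[:-1]
        y = sum(wi * bi for wi, bi in zip(w, buf))
        e = dn - y                 # prediction error
        for i in range(n_taps):
            w[i] += mu * e * buf[i]
    return w

# Fake "transmission line": a 2-tap FIR with known coefficients.
rng = random.Random(0)
h = [0.5, -0.3]
x = [rng.uniform(-1, 1) for _ in range(5000)]
d = [h[0] * x[n] + (h[1] * x[n - 1] if n else 0.0) for n in range(len(x))]
w = lms_identify(x, d, n_taps=2)   # should converge toward h
```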

Max 8 was used for normalization and visualization of the resulting signal.

In retrospect, the second recording sounds better than the phone call signal, which leads me to believe that my method for recreating a phone line is not entirely realistic.

Google Drive link: https://drive.google.com/drive/folders/1ILeY6phiYcoI7TqMZHV6Ah8xn96e8Qsb?usp=sharing
