Final Project – Isha Iyer
https://courses.ideate.cmu.edu/18-090/f2017/2017/12/03/final-project-isha-iyer/
Mon, 04 Dec 2017

For my final project I decided to explore more ways of using the Leap Motion sensor to control different elements of drawing. I made a game in which the coordinates of a hand are tracked and translated into both a rotation and a resizing of a square, with the goal of matching a target square. When the squares match closely enough, the game moves on to a new target. I have attached a demo of me playing this.
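
The actual game is a Max patch driven by the Leap Motion, but the core mapping can be sketched in a few lines. The ranges, tolerances, and function names below are illustrative assumptions, not values taken from the patch:

```python
# Hypothetical mapping from a tracked hand to the controlled square.
# Ranges and thresholds are guesses for illustration, not the patch's values.

def hand_to_square(hand_x, hand_y):
    """Map normalized hand coordinates (0..1) to a rotation (degrees) and a scale."""
    rotation = hand_x * 360.0      # left-right position sets the angle
    scale = 0.5 + hand_y * 1.5     # up-down position sets the size
    return rotation, scale

def matches(rotation, scale, target_rotation, target_scale,
            angle_tol=10.0, scale_tol=0.1):
    """Return True when the controlled square is close enough to the target."""
    angle_err = abs((rotation - target_rotation + 180.0) % 360.0 - 180.0)
    return angle_err < angle_tol and abs(scale - target_scale) < scale_tol

rot, sc = hand_to_square(0.5, 0.5)
print(matches(rot, sc, target_rotation=180.0, target_scale=1.25))
```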

I was also very interested in learning more about different uses for the machine learning patch. I trained the Leap Motion to detect three different hand gestures: a palm facing down, a fist, and a “C” for camera. As shown in the demo below, when I make a “C” with my hand, I can take pictures with the camera. I can then use my left hand to distort the captured image. The distortion was influenced by this tutorial.
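
The trained model itself lives in the Max patch, but a minimal stand-in for that kind of gesture classifier is sketched below. The feature names and training values are hypothetical, chosen only to show the idea of nearest-neighbor gesture matching:

```python
import numpy as np

# Hypothetical training data: each row is a feature vector read from the Leap
# Motion (e.g. palm-normal y, grab strength, pinch strength); one row per gesture.
X_train = np.array([
    [-0.95, 0.05, 0.10],   # palm facing down
    [-0.20, 0.95, 0.30],   # fist
    [ 0.10, 0.40, 0.85],   # "C" for camera
])
labels = ["palm_down", "fist", "c_camera"]

def classify(features):
    """Nearest-neighbor gesture classifier (a stand-in for the trained model)."""
    dists = np.linalg.norm(X_train - np.asarray(features), axis=1)
    return labels[int(np.argmin(dists))]

print(classify([0.05, 0.45, 0.80]))   # -> "c_camera": would trigger a photo
```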

Here is a link to all the final files I used for this project, including the machine learning data and model for convenience. I have also included all the gists below.

Draw Game:

ML patch:

Distortion Patch:

Project 1 – Isha Iyer
https://courses.ideate.cmu.edu/18-090/f2017/2017/10/29/project-1-isha-iyer/
Mon, 30 Oct 2017

For Project 1, I used a Kinect to control the motion of a particle system with my hand. I am very interested in different applications of motion tracking, and I think this was a good introduction to help me learn how the Kinect works. Here is a download link to a video showing my project in real time: IMG_1479

I used this tutorial to help me create particles from an image that I would then control using input from the Kinect. To read the Kinect's data I used the output of the dp.kinect2 object, which took me a while to set up initially. I wanted the system to use the Kinect's real-time camera image as the source image, but that did not end up working quite as I wanted, so I stuck to using one preset image.
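
The particle-from-image idea can be summarized as seeding one particle per pixel and steering the whole system toward the tracked hand position that dp.kinect2 reports. The sketch below only illustrates that update rule; the dimensions and constants are assumptions, not values from my patch:

```python
import numpy as np

# Seed particles on a grid of pixel positions (stand-in for the source image).
h, w = 120, 160
ys, xs = np.mgrid[0:h, 0:w]
positions = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
velocities = np.zeros_like(positions)

def step(hand_xy, attraction=0.02, damping=0.9):
    """Advance the particle system one frame toward the tracked hand position."""
    global positions, velocities
    velocities = damping * velocities + attraction * (np.asarray(hand_xy) - positions)
    positions = positions + velocities

step([80.0, 60.0])            # one frame with the hand near the center of the frame
print(positions.mean(axis=0))
```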

Here is the gist for my code:

Assignment 4 – Isha Iyer
https://courses.ideate.cmu.edu/18-090/f2017/2017/10/15/assignment-4-isha-iyer/
Sun, 15 Oct 2017

I used the pfft~ object to add cool effects to an audio file based on a note clicked on the kslider. I then used this tutorial as a template to create visuals for the audio.
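
Since the gist itself is a Max patch, here is a rough outline of the kind of spectral processing a pfft~ subpatch driven by a kslider note might perform; the filter shape, bandwidth, and gain below are assumptions for illustration only:

```python
import numpy as np

sr = 44100          # sample rate
n = 2048            # one FFT frame

def midi_to_hz(note):
    return 440.0 * 2 ** ((note - 69) / 12)

def spectral_emphasis(block, midi_note, bandwidth_hz=60.0, gain=4.0):
    """Boost the spectral bins near the selected note's frequency."""
    spectrum = np.fft.rfft(block)
    freqs = np.fft.rfftfreq(len(block), 1 / sr)
    mask = np.abs(freqs - midi_to_hz(midi_note)) < bandwidth_hz
    spectrum[mask] *= gain
    return np.fft.irfft(spectrum, n=len(block))

block = np.random.randn(n)                      # stand-in for one block of audio
out = spectral_emphasis(block, midi_note=60)    # C4 clicked on the kslider
```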

Here is a link to view my video describing the project on Box.

Here is my gist:

Project Proposal 1 – Isha Iyer
https://courses.ideate.cmu.edu/18-090/f2017/2017/10/01/project-proposal-1-isha-iyer/
Sun, 01 Oct 2017

I have two ideas for projects that I am interested in.

The first involves analyzing the frequencies and rhythms of an audio file to drive special effects on the ceiling lights in the Media Room, so that the lighting corresponds to the tempo and other elements of the audio.

After seeing the video in class last week of a saxophone sound being constructed, I also became interested in learning how to use Fourier transforms to reconstruct the sounds of different musical instruments. I am not sure exactly how this would work, so maybe the first idea would be more doable for Project 1.
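
If I pursue the second idea, the basic loop would probably be: take the FFT of a recorded note, keep its strongest partials, and resynthesize them as a sum of sinusoids. A very rough sketch of that process, using a synthetic test tone rather than a real instrument recording:

```python
import numpy as np

sr = 44100
t = np.arange(sr) / sr
# Stand-in for an instrument recording: a 220 Hz tone with one overtone.
note = np.sin(2 * np.pi * 220 * t) + 0.4 * np.sin(2 * np.pi * 440 * t)

spectrum = np.fft.rfft(note)
freqs = np.fft.rfftfreq(len(note), 1 / sr)
strongest = np.argsort(np.abs(spectrum))[-8:]    # the 8 loudest partials

# Rebuild the sound from just those partials (amplitude, frequency, phase).
resynth = sum(
    (2 * np.abs(spectrum[k]) / len(note)) *
    np.cos(2 * np.pi * freqs[k] * t + np.angle(spectrum[k]))
    for k in strongest
)
```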

Assignment 3 – Isha Iyer
https://courses.ideate.cmu.edu/18-090/f2017/2017/10/01/assignment-3-isha-iyer/
Sun, 01 Oct 2017

I used a recording of Prokofiev’s Romeo and Juliet Suite by the London Symphony Orchestra as my original signal. The two IR “pop” recordings I used were taken outside CFA and in the CFA hallway. There did not seem to be much difference between the two resulting signals. The main difference was that the signal convolved with the pop taken outside CFA was much quieter than the one convolved with the CFA hallway pop. This difference does not seem to come through on this post for some reason.

Original Signal:

CFA exterior followed by convolved original:

CFA hallway followed by convolved signal:

The other two recordings I took were of wind while standing outside and of water being poured from a water bottle into a sink. I then experimented with recordings of fire and glass that I found online.

Fire:

Glass:

Water:

Wind:

The signals that resulted from convolving with glass, water, and wind added interesting effects to the original piece. Convolving with fire turned the original signal into a cacophony.
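
For reference, the convolution-reverb step itself is just a convolution of the dry recording with each impulse response. A minimal sketch of that operation, assuming mono files with placeholder names rather than my actual recordings:

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import fftconvolve

# Placeholder filenames; any mono dry signal and impulse response will do.
sr, dry = wavfile.read("impulses/original.wav")
_, ir = wavfile.read("impulses/cfa_hallway_pop.wav")

dry = dry.astype(float)
ir = ir.astype(float)

wet = fftconvolve(dry, ir)          # convolve the signal with the impulse response
wet /= np.max(np.abs(wet))          # normalize so the result does not clip

wavfile.write("convolved.wav", sr, (wet * 32767).astype(np.int16))
```

The final normalization is also why the loudness difference noted above matters: an impulse response with less energy, like the outdoor pop, produces a much quieter raw convolution until it is rescaled.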

Here is the modified version of 00 Convolution-Reverb that I used for this assignment. I kept all my impulse responses and the original signal in a folder named “impulses”.

Assignment 2 – Isha Iyer
https://courses.ideate.cmu.edu/18-090/f2017/2017/09/17/assignment-2-isha-iyer/
Mon, 18 Sep 2017

I used the concept of time shifting by delaying a sine wave to demonstrate noise cancellation. By changing the number of samples by which the wave is delayed, we can shift it by half its period, so that the shifted wave added to the original sums to zero. My code and a demonstration of the patch are shown below.
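
Outside of Max, the same cancellation can be shown in a few lines: delaying the sine by half a period puts the copy 180 degrees out of phase with the original. The sample rate and frequency below are chosen only so the period is a whole number of samples:

```python
import numpy as np

sr = 44100                  # sample rate
freq = 441.0                # period = exactly 100 samples at this rate
n = np.arange(4400)
original = np.sin(2 * np.pi * freq * n / sr)

delay = int(sr / freq / 2)  # 50 samples = half a period
shifted = np.concatenate([np.zeros(delay), original[:-delay]])

# Past the initial 50-sample warm-up, the two waves cancel almost exactly.
residual = original[delay:] + shifted[delay:]
print(np.max(np.abs(residual)))     # effectively zero, up to floating-point error
```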

Assignment 1 – Isha Iyer
https://courses.ideate.cmu.edu/18-090/f2017/2017/09/02/assignment-1/
Sat, 02 Sep 2017

I used two phones to distort an image I took a couple of years ago. Using one phone, I took a picture of the image displayed on the second phone. Then I used AirDrop to transfer the new photo back to the second phone, displayed it, and repeated the process; after 50 iterations the image had become mostly one color.
