Project 2: Swiss Design Poster Installation – Sarika Bajaj (Mon, 04 Dec 2017)

This past semester, I have been conducting a research project under Prof. Susan Finger to install projection systems around the IDeATe Hunt basement, creating a platform for students in the animation, game design, and intelligent environments minors to publicly display their work. My projects for Twisted Signals have therefore revolved around creating demos specifically for the interactive projection system using Max. My first project, a virtual ball pit, was a good exercise in learning how to use the Kinect but was not a conceptually heavy demo. For my second project, then, I wanted to make a system that would actually teach its users something.

The concept I settled on was a system that lets users interact with the Hunt Swiss Poster collection, an extensive set of extraordinary Swiss design posters housed in Hunt Library that very few students know exists. Originally, I had planned on using the Kinect to let users “draw something” with a colored depth map, which would then be processed to display the closest-matching Swiss design poster. However, in my early prototyping, it became apparent that the interaction was not as obvious as it could be, which was leading to a weaker installation. Moreover, as I have had to borrow all of my equipment from IDeATe for every project, I ran into the issue that every Kinect, as well as my specific computer, was checked out for the time span I needed to work on this project. Therefore, I had to pivot.

While planning the projection installation, we were hit with the news that the Kinect was no longer going to be produced. Since I was forced to work without a Kinect anyway, I decided to create an interesting interaction with just an RGB camera, which thankfully will probably always be produced. Additionally, I realized that, although it was a far more difficult path, the best possible way for users to interact with these Swiss posters was to become a literal part of them, which would mean designing every single poster interaction uniquely. This direction also opens an avenue for other students to participate in the project if they are looking for project ideas of their own.

Therefore, for my Project 2, I created two different Swiss poster exhibits, as well as a very simple UI that an IDeATe staff member can use when turning on the projection system each morning. Each exhibit consists of an interactive display that mimics a Swiss poster’s design, placed next to the original Swiss poster, some information about the poster, and some information about the project.
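As a rough illustration of the kind of RGB-camera interaction the exhibits build on, here is a minimal sketch in Python with OpenCV, standing in for the actual Max/Jitter patches: it pulls a visitor’s silhouette out of the camera feed and recolors it as a flat shape on a plain field, in the spirit of a Swiss poster. The color values, window name, and filter settings are placeholders, not values from the installation.

    # Hypothetical sketch: visitor silhouette composited into a flat-color
    # field, echoing a Swiss design poster. Not the installation's code.
    import cv2
    import numpy as np

    POSTER_RED = (54, 36, 225)  # BGR placeholder for a Swiss poster palette

    cap = cv2.VideoCapture(0)
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)      # foreground = the visitor
        mask = cv2.medianBlur(mask, 9)      # clean up speckle
        canvas = np.full_like(frame, 240)   # off-white background
        canvas[mask > 0] = POSTER_RED       # visitor becomes a flat shape
        cv2.imshow("exhibit", canvas)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()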

First Exhibit: 

Second Exhibit: 

UI Snapshot: 

Gist of Code: 

Project 1 – Sarika Bajaj (Sun, 29 Oct 2017)

This project lets a person use their hand, tracked by the Kinect, to move balls around in a virtual ball pit. Much of this patch is built upon the dp.kinect2 reference patches as well as a reference from https://cycling74.com/tutorials/00-physics-patch-a-day, integrating the two into a Kinect system that uses the closest player’s right hand to move the main movable physics force. Most of the work in this project involved figuring out good bounding boxes in the physical world and in the virtual world, and how the user would actually interact with the Kinect (I wanted the output animation to be very obviously user controlled – almost painfully so). Additionally, I had some fun changing the aesthetics of the actual ball system.
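As a sketch of what that bounding-box mapping boils down to (written in Python rather than Max, with made-up box extents rather than the ones I actually tuned): clamp the tracked hand to a physical box in front of the Kinect, then linearly rescale it into the virtual box the physics force lives in.

    # Placeholder box extents; the real values came from trial and error.
    PHYS_MIN, PHYS_MAX = (-0.6, 0.0, 1.0), (0.6, 1.2, 2.5)    # meters
    VIRT_MIN, VIRT_MAX = (-1.0, -1.0, -1.0), (1.0, 1.0, 1.0)  # world units

    def remap_hand(hand_xyz):
        """Map a Kinect hand coordinate into the virtual scene."""
        out = []
        for v, p0, p1, v0, v1 in zip(hand_xyz, PHYS_MIN, PHYS_MAX,
                                     VIRT_MIN, VIRT_MAX):
            v = min(max(v, p0), p1)         # clamp to the physical box
            t = (v - p0) / (p1 - p0)        # normalize to 0..1
            out.append(v0 + t * (v1 - v0))  # rescale to the virtual box
        return tuple(out)

    print(remap_hand((0.3, 0.6, 1.75)))  # -> (0.5, 0.0, 0.0)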

Video of the system working:

Gist of code: https://gist.github.com/anonymous/9ddab8deb04b40090d8efeb8cd0b5f06

Assignment 4 – Sarika Bajaj (Mon, 16 Oct 2017)

The goal of this patch was to create a rendering that reacts to the amplitude of the microphone’s input. I started from a rendering tutorial I found interesting, located at https://www.youtube.com/watch?v=qf1OGUeIs1s. I began with the video’s original patch and removed all of the audio processing it was doing. After playing around with the rendering part of the patch I had kept, I changed the noise type of the rendering, as well as its scale and appearance. During this process, I discovered a “distortion” input that the rendering originally had set to a fixed value, and decided that this was the input I wanted to depend on the amplitude of the audio input (as it was giving an interesting zoom effect). Thus, I wrote my pfft~ subpatch to filter the input and pass out only the amplitude, which is then scaled down to act as my distortion input.

For this example video, I simply used ambient noise as a catalyst (people walking by and talking), as I’m interested in making renderings that use ambient noise and images from an environment in a way that is obvious yet still interesting. Unfortunately, the YouTube compression ruins the effect quite a bit, but the general visual is preserved. A Google Drive link to the video is located here: https://drive.google.com/open?id=0Byn46tolhCwUUlNzNDVObGppY1k

Github Gist Here: https://gist.github.com/anonymous/f69fd0c33650aeab618f81ad8d37ecfe
*** When I tested the compressed code to check that my file was all right, the rendering for some reason stayed stationary, while my actual patch works fine. For this reason, I am also uploading a zip of my files, in case something went wrong with the copy-compressed feature.
Zip of Files: Assignment 4 – Sarika Bajaj

Assignment 3 – Sarika Bajaj (Mon, 02 Oct 2017)

For my Assignment 3, I convolved a series of impulse recordings with the quintessential audio clip of Dobby (the house elf from the Harry Potter series) being freed from his master, a twenty-second clip I obtained from SoundBoard.com.

Original Audio Clip (Dobby is Free):

My two acoustic-space IR recordings were taken by recording the audio of a balloon popping (with its accompanying reverb). The locations I chose were the women’s bathroom in the basement of Baker Hall and the overlook on the second floor of Baker Hall.
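Conceptually, what the patch computes is just the convolution of the dry clip with each IR. Here is a minimal SciPy sketch of the same operation, assuming mono WAV files with placeholder file names:

    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import fftconvolve

    # Placeholder file names; assumes mono recordings at the same rate.
    sr, dry = wavfile.read("dobby_is_free.wav")    # source clip
    _, ir = wavfile.read("baker_bathroom_ir.wav")  # balloon-pop IR

    wet = fftconvolve(dry.astype(np.float64), ir.astype(np.float64))
    wet = wet / np.abs(wet).max()                  # normalize, avoid clipping

    wavfile.write("dobby_bathroom.wav", sr, (wet * 32767).astype(np.int16))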

IR from Women’s Bathroom in Baker Basement:

Convolution of Dobby and Women’s Bathroom IR:

IR from Overlook in Baker:

Convolution of Dobby and Overlook in Baker:

I then explored a little of what the Dobby recording would sound like when convolved with common soundtrack noises that might be present in a bad movie. Specifically, I chose gurgling water, seagulls cawing, applause, and cricket soundtracks, all downloaded from SoundBible.com.

Gurgling Water IR:

Convolution of Dobby and Gurgling Water:

Seagulls IR:

Convolution of Dobby and Seagulls:

Applause IR:

Convolution of Dobby and Applause:

Crickets IR:

Convolution of Dobby and Crickets:

After making these samples, I started exploring some of the built-in Max examples and ran into one named “convolution workshop.” A bit curious about what it would do, I merged our original convolution reverb patch with this patch. Specifically, I pushed the “Dobby is free” audio and the “applause” IR through the original convolution, and then pushed the result into a source-filter convolution with the “Dobby is free” audio again. The result sounds significantly noisier than the previous results.
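A rough sketch of that chain, in the style of the convolution sketch above (note that Max’s convolution workshop performs a spectral source-filter convolution; plain time-domain convolution stands in for it here):

    from scipy.io import wavfile
    from scipy.signal import fftconvolve

    # Placeholder file names, assumed mono.
    _, dry = wavfile.read("dobby_is_free.wav")
    _, applause_ir = wavfile.read("applause_ir.wav")

    pass_one = fftconvolve(dry, applause_ir)    # "Dobby" * applause
    pass_two = fftconvolve(pass_one, dry)       # result * "Dobby" again
    pass_two = pass_two / abs(pass_two).max()   # renormalize
    # Each pass smears the spectrum further, hence the noisier result.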

Further convolved Dobby and Applause:

The final Max patch used to create the last audio piece can be found here: https://gist.github.com/anonymous/37367ee3fca07b4a057610ee7ff6630d

 

Project 1 Proposal – Sarika Bajaj (Mon, 02 Oct 2017)

For my first project, I would like to use the Kinect, Max, and a projector to create a generative art piece that enables some sort of live interaction. Preferably, I would like to create a piece similar to the one depicted in this picture:

Depending on whether particle interaction, shape manipulation, etc. is easier for me to bring up, I might pivot the project in one of those directions. However, at a minimum, I would like to create a system that produces a real-time interactive piece with the Kinect and Max.

Assignment 2 – Sarika Bajaj (Sun, 17 Sep 2017)
For my time-shifting assignment, I decided to make a “horror movie” webcam filter. It takes in a webcam image and 1) plays normal video until it time-shifts 700 frames back for about 100 frames (to give the unsettling video-playback effect that some horror movies have), and 2) turns the RGB values from the webcam into pure luminance values, then filters the luminance to create a grainy, distorted image.
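A hedged Python/OpenCV sketch of those two steps (the 700- and 100-frame counts mirror the patch; the glitch period, grain strength, and everything else are stand-ins for what the Jitter patch actually does):

    import cv2
    import numpy as np
    from collections import deque

    DELAY, GLITCH_LEN, PERIOD = 700, 100, 1000  # frames; PERIOD is assumed

    cap = cv2.VideoCapture(0)
    history = deque(maxlen=DELAY + 1)  # ring buffer of recent frames
    n = 0

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        history.append(frame)
        # During the "glitch" window, show the frame from ~700 frames ago.
        glitching = (n % PERIOD) < GLITCH_LEN and len(history) > DELAY
        shown = history[0] if glitching else frame
        # Collapse RGB to luminance, then add grain for the distorted look.
        luma = cv2.cvtColor(shown, cv2.COLOR_BGR2GRAY).astype(np.int16)
        grain = np.random.randint(-40, 40, luma.shape, dtype=np.int16)
        out = np.clip(luma + grain, 0, 255).astype(np.uint8)
        cv2.imshow("horror", out)
        n += 1
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()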

Gist of code: gist:6f11dc358eabb3a5d7dc3f2dba39f493

Assignment 1 – Sarika Bajaj (Tue, 05 Sep 2017)

Over the years, Amazon has given me quite a few interesting results under the “Customers who bought this item also bought” tab, some suggestions that have resulted in me impulsively buying more items online and others that have left me just plain confused. Therefore, I thought it would be interesting to explore this suggested-purchases feedback loop.

As I have recently been setting up my new townhouse, I thought I would start with a simple search on Amazon for “office chair.” I recorded the first suggested item in the list and then clicked on it to build the chain. When the suggestions looped, I simply recorded the next suggested item that I had not previously clicked on and continued down the chain. I followed the chain to about 50 items, by which point I had not seen an office chair for quite a while.

The video of the path across Amazon is below:

 
