bumble_b-AugmentedBody

Scotty Dog Simulator

 

I started this project pretty late, so I knew I had to stick with something pretty simple. I had a few ideas about making filters of School of Drama professors and their iconic caricatures, but I didn’t really like those ideas and hated the thought of idolizing them. Plus, it wouldn’t really land well with our class, most of whom don’t know them. I was pretty set on doing a filter, since I had a feeling it’d be simple, so I thought a little longer and landed on our adorable Scotty dog mascot, keeping with the Carnegie Mellon theme I originally had!

Since I really want to learn how to make games (and also needed to add some complexity to a very simple idea), I added a little start screen and some bagpipe background music (that ended up stuck in my head for like a whole freaking day).

I also decided to add a bark sound every time the user opened their mouth, which I accomplished by calculating the distance between a point on the top lip and a point on the bottom lip. That’s definitely my favorite part of the project now.
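In rough p5.js terms (with p5.sound for the bark), that check could look something like the sketch below; the landmark indices and the threshold are assumptions based on a MediaPipe-style face mesh, not the project’s actual values.

```js
// Minimal sketch of the mouth-open bark check (p5.js + p5.sound).
// Assumes `landmarks` is the normalized face-landmark array from the
// tracker; indices 13 and 14 (inner upper/lower lip) are an assumption.
let bark;                    // p5.SoundFile, loaded in preload()
let mouthWasOpen = false;
const OPEN_THRESHOLD = 0.03; // tune by eye; landmarks are normalized 0..1

function preload() {
  bark = loadSound('bark.mp3'); // hypothetical asset path
}

function mouthGap(landmarks) {
  const top = landmarks[13];
  const bottom = landmarks[14];
  return dist(top.x, top.y, bottom.x, bottom.y);
}

function checkBark(landmarks) {
  const open = mouthGap(landmarks) > OPEN_THRESHOLD;
  if (open && !mouthWasOpen && !bark.isPlaying()) {
    bark.play(); // bark once per mouth-open, not once per frame
  }
  mouthWasOpen = open;
}
```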

Something I didn’t think about, and therefore didn’t give myself enough time to implement, was how to scale the filter based on the user’s distance from the screen. Also, if they turn their head, the filter does not rotate with them. When I got the feeling something was wrong, I opened Snapchat and experimented with their filters, seeing how they scaled and rotated perfectly! That’s definitely where this project has fallen short, and I wish I had started earlier to give myself that time.
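For reference, scale and roll could be derived from two stable face landmarks, roughly as in the sketch below; the landmark indices and the reference distance are assumptions to be tuned against the actual tracker, not what Snapchat does.

```js
// Derive a scale factor and roll angle from the two outer eye corners.
const REFERENCE_EYE_DIST = 0.12; // assumed eye distance (normalized) at a "normal" sitting distance

function drawScottyFilter(landmarks, noseX, noseY) {
  const left = landmarks[33];   // assumed: outer corner of one eye
  const right = landmarks[263]; // assumed: outer corner of the other eye

  const eyeDist = dist(left.x, left.y, right.x, right.y);
  const s = eyeDist / REFERENCE_EYE_DIST;                  // bigger when closer to the camera
  const roll = atan2(right.y - left.y, right.x - left.x);  // head tilt in radians

  push();
  translate(noseX, noseY); // anchor the filter on the nose
  rotate(roll);
  scale(s);
  // drawScottyParts(); // hypothetical: ears, snout, etc., drawn around (0, 0)
  pop();
}
```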

Here is an early process photo where my scotty dog kind of looked more like a cat than a dog…

 

starry – AugmentedBody

My project utilized the limbs and body to control the movement of a forest. I wanted to explore movement in inanimate objects, not just a single tree, and I think natural subjects make it easier to build visual complexity, since I could copy-paste the trees to create a forest. By moving their arms back and forth, the user can simulate the movement of branches, and their distance from the camera determines the size of the sun and the visibility of the ground.

I liked how it turned out visually, but I didn’t like the user interaction, since the frame rate was pretty bad and it caused the movement to look very laggy. I also wanted to process the user’s movement somehow to create smoother-looking motion, similar to wind moving through trees, but couldn’t figure out how to do so.
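One common fix for this kind of jitter is exponential smoothing of the keypoints; a rough sketch follows, with the smoothing factor as a guess.

```js
// Keep a "smoothed" copy of each keypoint and move it only a fraction
// of the way toward the raw tracker reading each frame.
// Smaller `amt` values look more like slow wind through branches.
let smoothed = {}; // keypoint name -> {x, y}

function smoothPoint(name, rawX, rawY, amt = 0.15) {
  if (!smoothed[name]) smoothed[name] = { x: rawX, y: rawY };
  smoothed[name].x = lerp(smoothed[name].x, rawX, amt);
  smoothed[name].y = lerp(smoothed[name].y, rawY, amt);
  return smoothed[name];
}

// e.g. const wrist = smoothPoint('leftWrist', rawWrist.x, rawWrist.y);
// then drive the branch sway from `wrist` instead of the raw reading.
```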

video

merlerker-AugmentedBody

My project is quite simple and silly: a “nose isolator” that finds your nose and masks everything else. Bodies are strange, and I appreciate projects that acknowledge that universally-felt, awkward but intimate relationship we have with our bodies, like Dominic Wilcox’s “Tummy Rumbling Amplification Device” [link] and Daniel Eatock’s “Draw Your Nose” [link]. Isolating the nose has the effect of forcing you to confront a body part that you’ve probably felt self-conscious about at some point in your life, and allowing it to become an endearing little creature in itself. Though it’s a simple project and treatment, I feel it’s successful in creating a delightful and different relationship with your body. I’m proud of the conceptual bang-for-buck: it’s an important exercise for me to let go of perfection and overambitious projects that never end.

Originally I was trying to apply the nose isolator to scenes from films, but got frustrated trying to get handsfree.js to run on a <video> source and doing the correct mirroring and translating to get it all to line up. Instead, I created a performance using my own nose that leans into the nose-as-creature idea.
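The masking itself can be done without per-pixel work; below is a minimal sketch of one way to reveal only a patch of webcam around the tracked nose, assuming the nose coordinates arrive from the tracker already converted to mirrored pixel space.

```js
// Blank the canvas, then copy back only a small patch of the webcam
// frame centred on the nose tip.
let video;
const PATCH = 120; // size of the revealed nose patch, in pixels (a guess)

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
}

function drawNoseOnly(noseX, noseY) {
  background(0); // mask everything...
  // ...except a PATCH x PATCH square of video around the nose
  image(video,
        noseX - PATCH / 2, noseY - PATCH / 2, PATCH, PATCH,   // destination
        noseX - PATCH / 2, noseY - PATCH / 2, PATCH, PATCH);  // source region
}
```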

kong-AugmentedBody

My initial idea was to represent the homesickness I was feeling by utilizing the distance between one’s face and hand: the face would represent self and the hand would represent home. I wanted to play with the distance between the face and the hand to foster different interactions. 

While playing around with the connections, I came across the idea of recreating the period cramps I feel, as I am currently on my period. Based on one’s face and hand movements, various lines stretch, shrink, and strangle one another. When another hand enters, the lines switch over to the other side, representing the sudden pains that occur.
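A rough sketch of how such lines could be drawn, assuming the face and hand trackers provide arrays of pixel-space points (the colors and weights are guesses).

```js
// Each line connects a face point to a hand point, so moving the hand
// stretches or slackens the whole bundle.
function drawCrampLines(facePts, handPts) {
  stroke(180, 30, 60);
  for (let i = 0; i < facePts.length; i++) {
    const f = facePts[i];
    const h = handPts[i % handPts.length]; // reuse hand points cyclically
    // thinner strokes as the line stretches, so taut lines look strained
    strokeWeight(map(dist(f.x, f.y, h.x, h.y), 0, width, 3, 0.5, true));
    line(f.x, f.y, h.x, h.y);
  }
}
```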

As this was a quick project, I believe there are myriad ways to extend it. For instance, I could bring in points from the body to not only create more interactions but also present more diverse forms. Further, instead of using lines, I could incorporate shapes such as triangles and make use of gradients to build a 3D object.

duq-AugmentedBody

 

In this project I tried to make a program that would track the positions of your hands (using the code given to us) and attach generative fire to your fingertips on the screen. I used several different factors to try to generate realistic fire, determining how rapidly it ascends, how far it spreads out, and how much smoke it produces. Overall, I am happy with how I was able to turn the idea behind my project into a reality, but the program runs very slowly, making the fire move upward far more slowly than real fire would. I tried to fix this issue by using pixels instead of circles, but it still ran slowly. Something I wish I could have added was the option to have fire come only out of your index finger when you close your other fingers, as this would allow you to draw with the fire, but I really did not know how to begin to implement this.
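That “index finger only” idea could be approximated by comparing fingertip and knuckle distances to the wrist; a sketch, assuming MediaPipe-style 21-point hand landmarks (the indices are from that layout, not this project’s code).

```js
// Wrist at index 0, fingertips at 8/12/16/20, middle joints at 6/10/14/18.
// A finger counts as curled when its tip is closer to the wrist than
// its middle joint is.
function onlyIndexExtended(hand) {
  const wrist = hand[0];
  const d = (p) => dist(p.x, p.y, wrist.x, wrist.y);

  const indexExtended = d(hand[8]) > d(hand[6]);
  const othersCurled = [[12, 10], [16, 14], [20, 18]]
    .every(([tip, pip]) => d(hand[tip]) < d(hand[pip]));

  return indexExtended && othersCurled;
}

// if (onlyIndexExtended(hand)) emitFireAt(hand[8]); // hypothetical emitter
```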

Sneeze-AugmentedBody

My concept for the Augmented Body project was to have an altered way of speaking. I wanted people to type in their words and have those words come out of their mouth without them having to actually voice themselves. I wanted to make this because sometimes writing or typing out words is easier than saying them. When you open your mouth and press a letter on the keyboard, multiple copies of that letter float out of your mouth.

I wanted the things floating out of your mouth to be words, but I ended up doing single letters instead. As I started coding for words, I ran into lots of problems (that could probably be fixed if I had more time). I was not able to get full words as input, since I had a difficult time combining input text boxes with the webcam display (every time I had the input text box on screen, it would cover the webcam, and vice versa). I settled for detecting single key presses and outputting those instead of full words. Now, if the user wants to type a word, they must read it in the floating letters from farthest from the mouth to closest (so, disjointed letters).
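The letter-spawning part could look roughly like this in p5.js; `mouthIsOpen`, `mouthX`, and `mouthY` are assumed to come from the face tracker, and the counts and speeds are guesses.

```js
let letters = []; // {char, x, y, vx, vy}

function keyTyped() {
  if (!mouthIsOpen) return;      // assumed flag from the face tracker
  for (let i = 0; i < 5; i++) {
    letters.push({
      char: key,                 // p5's last typed key
      x: mouthX,                 // assumed mouth position in pixels
      y: mouthY,
      vx: random(0.5, 2),        // drift away from the face
      vy: random(-1, 1),
    });
  }
}

function drawLetters() {         // called from draw()
  textSize(24);
  fill(255);
  for (const l of letters) {
    l.x += l.vx;
    l.y += l.vy;
    text(l.char, l.x, l.y);
  }
  // drop letters once they leave the canvas
  letters = letters.filter((l) => l.x < width + 50);
}
```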

spingbing-AugmentedBody

My artwork is a piece of software that gradually covers the face in jittery dots, which are meant to obscure the face beyond recognition. I used handsfree.js to achieve this.
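A rough sketch of the cover-up logic, assuming the tracker provides an array of normalized face landmarks; the dot color, size, and rate are guesses.

```js
// Each frame a few more dots are committed near random face landmarks,
// and every dot jitters slightly when redrawn, so the face gradually
// disappears under a trembling cloud.
let dots = []; // {x, y} in pixel coordinates
const DOTS_PER_FRAME = 3;

function coverFace(faceLandmarks) {
  for (let i = 0; i < DOTS_PER_FRAME; i++) {
    const p = random(faceLandmarks); // pick a random landmark
    dots.push({ x: p.x * width, y: p.y * height });
  }
  noStroke();
  fill(0);
  for (const d of dots) {
    // small random offset each frame gives the jitter
    circle(d.x + random(-2, 2), d.y + random(-2, 2), 8);
  }
}
```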

Unfortunately, I did not have the time or means to do my performance. I am planning on borrowing lights from the 3rd-floor lending office and using an empty room to record this piece. My plan is for the video to start in complete darkness and slowly increase the light on my face at an angle, at the same rate at which my face is obscured, creating a sort of dynamic contrast between the “uncovering” of my face by the light and the anonymization by the filter. The video would be less than 30 seconds long.

When finished, I think this piece will be fairly strong due to its conceptual implications. My interpretation is that it has to do with performative activism or representation done by the broader media.

Koke_Cacao-AugmentedBody

Play it here: https://kokecacao.me/page/Course/S22/60-212/code/p5/dino/index.html

(Note: No sketches were created in the process of making this project)

Google Dino, but played by blinking your eyes. The purpose of this remake is to force the player to close their eyes in a game that requires actively looking at the screen. It also asks players to take deliberate control of their eyes, whose blinking is semi-automatic in daily life. By playing this version of Google Dino, players get the opportunity to exercise their eyes during a brief internet disconnection, perhaps between long periods of staring at the screen. (Note: the program also has voice input to control the dino’s ducking. However, I did not have the opportunity to test it.)
The improved blink detection is dynamically calibrated so that different devices can use it with little performance difference. The graphics are borrowed from a public repository (with slight modifications) to closely resemble the original version of Google Dino.
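A sketch of what such a self-calibrating blink check could look like; the eye-openness measure and the constants are assumptions, not the project’s actual values.

```js
// `eyeOpenness()` is assumed to return the eyelid gap divided by the
// eye width for the current frame. Instead of a fixed threshold, a
// slowly updated running average of the open-eye value serves as the
// baseline, so different cameras and faces calibrate themselves.
let baseline = null;
const BASELINE_RATE = 0.02;  // how quickly the baseline adapts
const BLINK_RATIO = 0.5;     // blink = openness below 50% of baseline

function isBlinking(openness) {
  if (baseline === null) baseline = openness;
  const blinking = openness < baseline * BLINK_RATIO;
  if (!blinking) {
    // only learn from frames where the eyes look open
    baseline = lerp(baseline, openness, BASELINE_RATE);
  }
  return blinking;
}

// if (isBlinking(eyeOpenness())) dino.jump(); // hypothetical game hook
```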

Solar-AugmentedBody 

https://youtu.be/9mAKcOUulHs

I wanted to play with weightless, fluid visuals that respond to one’s movement, something that can enhance the lightness of a dancer’s motion. Hence, I played with visualizing particles that follow specific points of the body: hands, feet, knees, elbows, and the head. I am happy with the vague silhouette that becomes visible when one walks around, and also with the abstract, fluid forms that result from the swing of an arm or the kick of a leg.
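A rough sketch of this kind of keypoint-following particle system in p5.js; the tracked points are assumed to arrive already in pixel coordinates, and the spawn rates and fade speed are guesses.

```js
// Each frame spawns a few particles at every tracked body point;
// particles drift, fade, and die, which is what leaves the fluid
// silhouette behind a moving body.
let particles = []; // {x, y, vx, vy, life}

function emitFromBody(trackedPoints) {
  for (const p of trackedPoints) {
    for (let i = 0; i < 2; i++) {
      particles.push({
        x: p.x, y: p.y,
        vx: random(-1, 1), vy: random(-1, 1),
        life: 255,
      });
    }
  }
  noStroke();
  for (const pt of particles) {
    pt.x += pt.vx;
    pt.y += pt.vy;
    pt.life -= 4;               // fade out over roughly 60 frames
    fill(255, pt.life);
    circle(pt.x, pt.y, 6);
  }
  particles = particles.filter((pt) => pt.life > 0);
}
```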