Rjpark – Project 12 – Final Project Proposal

For my final project, I wanted to create something that is personally interesting to me, so I’ve decided to create a visual and interactive computer-keyboard dance generator. The objective of my project is to let the user see a dance move generated based on the key they pressed. As you can see below, if the user presses the “f” key, they will see dance moves involving footwork.

For now, I plan on creating one move per key for many keys. From there, I will try to create multiple moves per key so that when the user presses a key, the move shown is chosen at random. This will allow for more diverse dance moves to be generated.
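The key-to-move lookup with random selection could be sketched as below; the move names and keys are hypothetical placeholders, not moves from the actual project:

```javascript
// Hypothetical table mapping each key to a list of dance moves;
// real move names and animations would replace these placeholders.
const MOVES = {
  f: ["footwork shuffle", "heel-toe step", "kick-ball-change"],
  a: ["arm wave", "shoulder roll"],
  s: ["spin", "slide"],
};

// Return a random move for the pressed key, or null if the key is unmapped.
function pickMove(key) {
  const options = MOVES[key.toLowerCase()];
  if (!options) return null;
  return options[Math.floor(Math.random() * options.length)];
}
```

In a p5.js sketch, `keyPressed()` would call `pickMove(key)` and start the matching animation.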

Lan Wei – Project 12 – Proposal

What I want to do for the final project is something about music that also has visual effects. I want the project to be interactive, meaning that people can create their own music (probably without realizing it). The effect I’ve imagined is that in a ‘universe’ canvas, people create planets every time they click, and each zone of the canvas is tied to a different piece of rhythm, so clicking in different areas produces different sound effects. The visual effect of the planets needs some planning. I’m thinking that when a point is clicked, some repulsion is generated from that point and shapes are pushed away from it, forming a planet. It would be nice if the planets could rotate in place in a 3D mode and also oscillate with the volume of the rhythm. Other effects might be added to make the project more interactive and playful. I’m really looking forward to it.
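The zone-to-rhythm mapping could be sketched by dividing the canvas into a grid; the grid size and sound names here are assumptions for illustration, not part of the proposal:

```javascript
// Divide the canvas into a grid of zones; each zone keys one rhythm sample.
// The 3x2 grid and these sound names are hypothetical placeholders.
const COLS = 3, ROWS = 2;
const ZONE_SOUNDS = ["kick", "snare", "hihat", "bass", "chime", "pad"];

// Map a click position to the sound assigned to that zone of the canvas.
function soundForClick(x, y, width, height) {
  const col = Math.min(COLS - 1, Math.floor((x / width) * COLS));
  const row = Math.min(ROWS - 1, Math.floor((y / height) * ROWS));
  return ZONE_SOUNDS[row * COLS + col];
}
```

A p5.js `mousePressed()` handler would call `soundForClick(mouseX, mouseY, width, height)`, play that sample, and spawn the planet at the click point.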

Curran Zhang – Project 12 – Proposal

As an architecture student, I wanted to do something that involves architectural information. Architecture students usually look for precedent studies based on a certain idea. With a collection of different architectural ideas, such as green features, atriums, cubic forms, and landscape design, students can click to see further information. For each collection, I plan to have an animation that draws out an iconic building representing that idea. This would require an archive of different design ideas that shows information useful to architecture students like myself. The picture below shows diagrammatic representations of buildings drawn by Federico Babina. Each drawing is a different representation of works done by other artists, such as Andy Warhol and Mark Rothko. I want to do something similar, combining iconic buildings, animation, and information.

Art meets architecture in Federico Babina’s Archist series
Work by Federico Babina

Shirley Chen – Final Project Proposal

For my final project, I want to create an interactive game in which players create music from different combinations of instruments, beats, and sound effects. The players can select and manage the number of layers of sound and the types of sound they want to put into the performance. It will involve the previous lessons about loading sounds. I got the inspiration from a music app called Incredibox, which I introduced in a Looking Outwards post.
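Managing the selectable layers could come down to a small mixer that tracks which layers are switched on; the layer names below are hypothetical stand-ins for Incredibox-style parts, and in p5.js each active layer would loop a sample loaded with `loadSound()`:

```javascript
// Build a mixer over a fixed set of layer names. Toggling a known layer
// switches it on or off; unknown names are rejected.
function makeMixer(layerNames) {
  const active = new Set();
  return {
    toggle(name) {
      if (!layerNames.includes(name)) return false;
      if (active.has(name)) active.delete(name);
      else active.add(name);
      return true;
    },
    // List active layers in their original declared order.
    activeLayers() {
      return layerNames.filter((n) => active.has(n));
    },
  };
}
```

The draw loop would then start or stop each sample so that exactly the layers in `activeLayers()` are playing.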

Curran Zhang – Looking Outwards – 12

For this Looking Outwards, I was most interested in interactive artworks that use human input and computational design to create something new. The projects that interested me were the Digital Type Wall by SEA Design (2012) and Vanishing Points by Rafael Lozano-Hemmer (2018). The Digital Type Wall is an animation sequence that changes an array of letters into different font types; out of 6000 possible combinations, one is chosen at random. This allows visitors to observe the changes and the various kinds of “language” created by the same letters. Vanishing Points is an interactive piece that changes the vanishing point of the drawing based on the location of the closest viewer. These projects steer me toward something like an artistic game or interactive program that lets the user create amazing drawings.

http://www.lozano-hemmer.com/vanishing_points.php

http://marcinignac.com/projects/digital-type-wall/

Alice Fang – Looking Outwards – 12

Andrea Gysin’s website is an inspiration for Jaclyn’s and my proposed project. Her website is full of examples of interactive type, which flow across the screen and shift with the position of the mouse. Besides the changes driven by mouseX and mouseY in the background, the body of text left-aligned on the page flickers through characters and ‘rotates’ when the mouse hovers, creating a really cool loading-characters effect. Andrea Gysin’s work beyond the construction of her website also includes a program for graphic designers to build simple animated alphabets, along with other tools for creating visuals and installations. A lot of her work is inspiring and along the lines of the typographic interaction that Jaclyn and I are trying to build.


A project along similar but slightly different lines is Amnon Owed’s CAN Generative Typography. Using Processing, he created an alphabet of generated letters with different graphic characteristics. While it does not deal with bodies of text, as Andrea Gysin’s website does, the generative aspect of this video is what I find interesting; it would be really cool if we could apply a generative approach to how the lines or stanzas of the poem in our project appear on the canvas. Owed’s alphabet is not interactive, but a hybrid with the interactions seen in Gysin’s website could produce the results we want.

CAN Generative Typography from Amnon Owed on Vimeo.

A demo video of his generative typography

Project 12: Final Project

This summer, I got sucked into a web-development hole and found a couple of examples of parallaxing that I found extremely exciting. I thought I would take this opportunity to craft the effect entirely in JavaScript and create a paper-cutout look. This is intended to be a page where the user scrolls around and the image changes perspective. I want the sketches to appear to sit on different planes as the user interacts with the webpage; essentially, that means moving each type of sketch at a different rate. Depending on how this translates to code, I’m going to try to recreate these sketches and apply effects so that they look like cutouts of paper.

example of images on the canvas

An additional goal would be to try to get it to react to mouseX and mouseY to control the parallax effect.
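The per-layer movement rates could be computed from a single depth factor; this is a minimal sketch, assuming each layer stores a hypothetical depth value between 0 (far) and 1 (near):

```javascript
// Shift a layer horizontally in proportion to its depth, so near layers
// (depth close to 1) move farther than distant ones as the mouse leaves
// the canvas center. maxShift is the largest offset in pixels.
function parallaxOffset(mouseX, width, depth, maxShift) {
  // Normalize the mouse position to the range [-1, 1] around the center.
  const t = (mouseX - width / 2) / (width / 2);
  return t * depth * maxShift;
}
```

Drawing each paper-cutout layer at `x + parallaxOffset(mouseX, width, layer.depth, 30)` (and the same for y with mouseY) would give the different-planes effect.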

Looking Outwards 12: Parallaxing

I’ve always been interested in JavaScript features on websites, so I’m going to try to create a parallax-effect papercut artwork.

I bring up this example because parallaxing creates an illusion of depth that I think very much enhances the digital experience. (click 0.25x to get the full effect)

As seen here, the artwork is directly interactive with the user’s mouse. I like this because it is a subtle interaction that changes the feeling of the entire webpage.

See the Pen Papercut effect – mouse parallax by Julien Barreira (@JulienBarreira) on CodePen.

Both of these examples were developed with a mixture of HTML, CSS, and JS. I’d love to see if I can build this in pure JavaScript.

Sophia Kim – Looking Outwards 12 – Sec C



“BAD SIGNALS” and “FUZZY BLOB” were both created at the beginning of 2018 by Avseoul. While both projects use WebGL, “BAD SIGNALS” uses the webcam as part of its visuals, and “FUZZY BLOB” uses the microphone as its audio input. I noticed the use of vibrant colors throughout both projects. I admire how the sound transitions are shown visually through changes of color, because they are bold and noticeable. For “BAD SIGNALS,” I noticed how the glitches are responsive to sound; I admire that the glitches are not subtle but exaggerated to show change. “FUZZY BLOB” lets the user interact not only with realtime audio but also with mouse movement (the user can make dents and indentations in the ‘fuzzy blob’). Similarly, “BAD SIGNALS” uses realtime audio to create glitches in the visuals coming from the web camera. I admire both projects because they get the user to interact with the visuals and audio, and because they depend on the user’s interaction (i.e., the sounds made by the user and their environment, and their mouse movement).

Mimi Jiao – Looking Outwards 12 – Section E

Glitched image generated by pixel relocation via sound

I stumbled upon the user avseoul on Vimeo while looking for final-project inspiration. Many of their works are created using Unity 3D and/or creative code, combining realtime audio or video to alter the existing visuals on the screen. Avseoul uses a lot of organic shapes that mimic cells, mountains, and water droplets; with interesting textures and colors added, they create really engaging graphics by integrating sound and visuals. I am really intrigued by their glitched-image-by-sound piece and by Audio Reactive Slit Photo-Scan Test, where an image is taken and constantly reprocessed and altered based on a song that is playing. As the song continues, the image becomes more and more distorted, which is an interesting way of visualizing the changes throughout the song. These alterations are almost a way of visualizing audio, and they nearly cross into the domain of data visualization. I find this a really interesting starting point for exploring how information can be visualized. One thing I would like to see is how this piece translates into three-dimensional space; I think it would be really cool if these changes were incorporated on the Z axis. These works provide a really interesting starting point for Sophia and me, and we hope to branch off these ideas and explore further into WebGL.