Kade Stewart – Project 12 – Proposal

I would like to create a game where you are a paper airplane. Because you have no on-board propulsion, you are constantly trying to fly through hoops that give you more energy. Beyond this, you go through levels that each have their own quirks: one level might reverse the controls, and another might require you to mash buttons instead of pressing the arrow keys. There might be some obstacles for the player to dodge, and there will probably be some on-screen meters showing how much energy you have left and how far you've gone.
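To make the core mechanic concrete, here is a minimal sketch of how the energy system could work, assuming the game is written in p5.js; the plane drifts forward, energy drains every frame, and flying through a hoop refills it. The numbers, the hoops array, and the simple shapes are placeholders rather than final design choices.

var planeY = 200;
var energy = 100;                       // drains over time, refilled by hoops
var hoops = [{x: 500, y: 200}];         // placeholder hoop positions

function setup() {
    createCanvas(600, 400);
}

function draw() {
    background(220);
    energy -= 0.1;                      // no propulsion: energy always drains
    if (keyIsDown(UP_ARROW))   planeY -= 2;
    if (keyIsDown(DOWN_ARROW)) planeY += 2;

    // hoops scroll toward the plane; flying through one restores energy
    for (var i = 0; i < hoops.length; i++) {
        hoops[i].x -= 3;
        ellipse(hoops[i].x, hoops[i].y, 40, 80);
        if (abs(hoops[i].x - 100) < 5 && abs(hoops[i].y - planeY) < 40) {
            energy = min(energy + 30, 100);
        }
    }

    triangle(80, planeY, 120, planeY - 8, 120, planeY + 8);  // the paper plane
    rect(10, 10, energy, 10);           // on-screen energy meter
    text("distance: " + frameCount, 10, 40);
}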

Beyond the game itself, I would like to show the data collected from keypresses, just because I think it would be interesting. It might give the player insight into how they can do better, whether they're pressing keys too often, etc.
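A rough idea of how that keypress data could be collected and shown, again assuming p5.js: every press is tallied in an object and the counts are drawn on screen. The variable names are just placeholders.

var keyCounts = {};   // tally of how many times each key was pressed

function setup() {
    createCanvas(300, 400);
}

function draw() {
    background(255);
    var y = 30;
    for (var k in keyCounts) {
        text(k + ": " + keyCounts[k] + " presses", 10, y);   // live stats display
        y += 15;
    }
}

function keyPressed() {
    if (keyCounts[key] === undefined) {
        keyCounts[key] = 0;
    }
    keyCounts[key] += 1;
}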

Connor McGaffin – Proposal

For my final project I am interested in exploring the intersection of physical aesthetics and music playback. When I go to concerts, one of the biggest factors in how I judge the performance is the quality of the visuals that accompany the artists. I am interested in making this sort of experience more accessible through a music player. I will likely have several different songs to choose from and different visuals to accompany them. I would still like to ground the visuals in some literal manifestation, so I am considering an interface that resembles a turntable.

Sometimes at parties I see people stream music to their Chromecast or Apple TV, which only displays the album cover and song title. I imagine this program being used in that sort of social setting, where it is not necessarily the primary focus of the occasion but accompanies the energy of the room in an appropriate fashion.

Below is a rough digital sketch of what the program may look like. At the center is an abstract representation of a record player, surrounded by graphics that accompany and visualize the music.
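In addition to that visual sketch, here is a hedged code sketch of how the surrounding graphics could be driven by the music itself, assuming p5.js with the p5.sound library: a p5.Amplitude object reports the current loudness, which scales the graphics around a spinning record. The file name "song.mp3" and the layout are placeholders for the actual tracks and turntable design.

var song;
var amp;

function preload() {
    song = loadSound("song.mp3");      // placeholder track
}

function setup() {
    createCanvas(600, 600);
    amp = new p5.Amplitude();          // reports current loudness (0..1)
    song.loop();
}

function draw() {
    background(20);
    var level = amp.getLevel();

    // surrounding graphics pulse with the music
    noFill();
    stroke(255);
    ellipse(width / 2, height / 2, 300 + level * 400, 300 + level * 400);

    // abstract record player in the center, spinning over time
    push();
    translate(width / 2, height / 2);
    rotate(frameCount * 0.05);
    fill(0);
    ellipse(0, 0, 200, 200);           // the record
    line(0, 0, 90, 0);                 // a groove marker to show the rotation
    pop();
}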

Rjpark – Project 12 – Final Project Proposal

For my final project, I wanted to create something that is personally interesting to me, so I've decided to create a visual and interactive computer-keyboard dance generator. The objective of my project is to show the user a dance move generated based on the key they pressed. As you can see below, if the user presses the "f" key, they will see dance moves centered on footwork.

For now, I plan on creating one move per key for many keys. From there, I will try to create multiple moves per key so that when the user presses the key, the moves are randomized. This will allow for more diverse dance moves to be generated.
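Here is a minimal sketch of the key-to-move mapping, assuming the generator is written in p5.js: each key indexes an array of moves, and a random one is chosen per press. The move names, and drawing them as text rather than as animation, are placeholder stand-ins.

// each key maps to an array of moves so a random one can be chosen per press
var moves = {
    f: ["shuffle", "heel toe", "charleston"],      // footwork (placeholder names)
    a: ["wave", "arm roll"],                       // placeholder arm moves
};
var currentMove = "press a key";

function setup() {
    createCanvas(400, 400);
    textAlign(CENTER, CENTER);
}

function draw() {
    background(255);
    text(currentMove, width / 2, height / 2);      // stand-in for the animated move
}

function keyPressed() {
    if (moves[key] !== undefined) {
        currentMove = random(moves[key]);          // randomize among that key's moves
    }
}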

Lan Wei – Project 12 – Proposal

What I want to do for the final project is something about music that also has visual effects. I want the project to be interactive, meaning that people can create their own music (probably without realizing it). The effect I've imagined is a 'universe' canvas where people create planets every time they click, and each zone of the canvas is tied to a different piece of rhythm, so clicking in different areas produces different sound effects. The visual effect of the planets needs some planning: I'm thinking that when a point is clicked, some repulsion is generated from that point and nearby shapes are pushed away, and thus a planet is created. It would be nice if the planets could rotate in place in 3D and also oscillate with the volume of the rhythm. Other effects might be added to make the project more interactive and playful. I'm really looking forward to it.
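As a rough sketch of the clicking behavior, assuming p5.js with the p5.sound library: each click adds a planet, plays a clip chosen by which zone of the canvas was clicked, and pushes nearby shapes away from the click point. The file names ("rhythm0.mp3", etc.) and the simple left-to-right zone split are assumptions for illustration only.

var zoneSounds = [];          // one placeholder clip per vertical strip of the canvas
var planets = [];

function preload() {
    zoneSounds[0] = loadSound("rhythm0.mp3");   // placeholder clips
    zoneSounds[1] = loadSound("rhythm1.mp3");
    zoneSounds[2] = loadSound("rhythm2.mp3");
}

function setup() {
    createCanvas(600, 400);
    noStroke();
}

function draw() {
    background(0);
    for (var i = 0; i < planets.length; i++) {
        fill(200, 200, 255);
        ellipse(planets[i].x, planets[i].y, planets[i].size, planets[i].size);
    }
}

function mousePressed() {
    // which zone was clicked decides which piece of rhythm plays
    var zone = floor(map(mouseX, 0, width, 0, zoneSounds.length));
    zone = constrain(zone, 0, zoneSounds.length - 1);
    zoneSounds[zone].play();

    // push existing planets away from the click point (the "repulsion")
    for (var i = 0; i < planets.length; i++) {
        var d = dist(mouseX, mouseY, planets[i].x, planets[i].y);
        if (d < 100 && d > 0) {
            planets[i].x += (planets[i].x - mouseX) / d * 20;
            planets[i].y += (planets[i].y - mouseY) / d * 20;
        }
    }
    planets.push({x: mouseX, y: mouseY, size: random(20, 60)});
}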

Curran Zhang – Project 12 – Proposal

As an architecture student, I wanted to do something that involves architectural information. Architecture students usually look for precedent studies based on a certain idea. With a collection of different architectural ideas like green features, atriums, cubic forms, and landscape design, students can click to see further information. For each category, I plan to have an animation that draws out an iconic building representing that idea. This would require an archive of different design ideas that shows information useful to architecture students like myself. The picture below shows diagrammatic representations of buildings drawn by Federico Babina; each drawing reinterprets the work of other artists such as Andy Warhol and Mark Rothko. I want to do something similar, combining iconic buildings, animation, and information, as sketched after the image below.

Art meets architecture in Federico Babina's Archist series
Work by Federico Babina
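One possible structure for the archive, sketched under the assumption that it is built in p5.js: an object holds each design idea with an example building and a short blurb, and clicking a label reveals that entry. The entries below are placeholders, and the static text would eventually be replaced by the animated line drawing.

// placeholder archive: each design idea points to an iconic example and a blurb
var archive = [
    {idea: "atrium",     building: "Guggenheim Museum", info: "central void, spiral circulation"},
    {idea: "cubic form", building: "Casa da Musica",    info: "faceted concrete volume"},
];
var selected = -1;

function setup() {
    createCanvas(500, 300);
}

function draw() {
    background(245);
    for (var i = 0; i < archive.length; i++) {
        text(archive[i].idea, 20, 40 + i * 30);              // clickable list of ideas
    }
    if (selected >= 0) {
        // this is where the animated drawing of the building would eventually go
        text(archive[selected].building + ": " + archive[selected].info, 200, 40);
    }
}

function mousePressed() {
    for (var i = 0; i < archive.length; i++) {
        if (mouseY > 25 + i * 30 && mouseY < 50 + i * 30 && mouseX < 150) {
            selected = i;
        }
    }
}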

Shirley Chen – Final Project Proposal

For my final project, I want to create an interactive game in which players create music from different combinations of instruments, beats, and sound effects. The players can select and manage the number of sound layers and the types of sound they want to put into the performance. It will build on the previous lessons about loading sounds. I got the inspiration from a music app that I introduced in Looking Outward, called Incredibox.
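A minimal sketch of how the layering could work, assuming p5.js with the p5.sound loadSound() covered in class: each layer is a looping track that the player toggles on or off with a number key. The file names are placeholders.

var layers = [];                 // looping tracks the player can toggle
var files = ["beat.mp3", "bass.mp3", "melody.mp3"];   // placeholder sounds

function preload() {
    for (var i = 0; i < files.length; i++) {
        layers[i] = loadSound(files[i]);
    }
}

function setup() {
    createCanvas(400, 200);
}

function draw() {
    background(240);
    for (var i = 0; i < layers.length; i++) {
        // filled square = layer currently playing
        if (layers[i].isPlaying()) fill(0); else fill(255);
        rect(40 + i * 100, 80, 60, 60);
    }
}

function keyPressed() {
    var i = int(key) - 1;        // keys 1, 2, 3 toggle the corresponding layer
    if (i >= 0 && i < layers.length) {
        if (layers[i].isPlaying()) layers[i].stop();
        else layers[i].loop();
    }
}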

Project 12: Final Project

This summer, I got sucked into a web development hole and came across a couple of examples of item parallaxing that I found extremely exciting. I thought I would take this opportunity to craft the effect entirely in JS and create a paper-cutout look. This is intended to be a page where the user scrolls around and the image changes perspective. I want to do something similar where these sketches appear to sit on different planes as the user interacts with the webpage; essentially, that means moving each type of sketch at a different rate. Depending on how this translates to code, I'm going to try to recreate these sketches and apply effects to them so they look like cutouts of paper.

Example of images on the canvas

An additional goal would be to try to get it to react to mouseX and mouseY to control the parallax effect.
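Here is a hedged sketch of the core idea in p5.js: each drawing gets a depth factor, and its position is offset by the mouse position scaled by that depth, so nearer layers move more than distant ones. The rectangles stand in for the paper-cutout sketches, and the depth values are arbitrary.

// each layer gets a depth factor: larger depth = closer = moves more
var layers = [
    {depth: 0.1, col: 230},
    {depth: 0.3, col: 180},
    {depth: 0.6, col: 120},
];

function setup() {
    createCanvas(600, 400);
    noStroke();
}

function draw() {
    background(250);
    for (var i = 0; i < layers.length; i++) {
        // offset from the canvas center, scaled by this layer's depth
        var dx = (mouseX - width / 2) * layers[i].depth;
        var dy = (mouseY - height / 2) * layers[i].depth;
        fill(layers[i].col);
        rect(200 + dx, 150 + dy, 200, 100);   // stand-in for a paper-cutout sketch
    }
}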

Alice Fang – Project 12 – Proposal

I am planning to work with my classmate Jaclyn Saik to create an interactive poem. We plan to use one of our favorite poems, "Still I Rise" by Maya Angelou, not only because it's an excellent piece of writing but also because her message feels especially pertinent in today's political and social climate. The poem is 43 lines and 9 stanzas long, and we plan to figure out a way to break it up and display it on separate slides, which the user can move through as they continue to read and interact. We want to create interactions specific to the different lines (or couplets, or stanzas). For example, for the line "I'm a black ocean, leaping and wide, / Welling and swelling I bear in the tide," we plan to animate the text based on the mouse position to imitate waves.
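For the ocean couplet, one possible approach in p5.js (just a sketch, not a final design) is to draw the line letter by letter and offset each letter vertically with a sine wave whose height follows the mouse, so the text wells and swells as the reader moves; the specific numbers are only a starting point.

var line1 = "I'm a black ocean, leaping and wide,";

function setup() {
    createCanvas(600, 200);
    textSize(20);
}

function draw() {
    background(255);
    // wave height follows the mouse's vertical position
    var waveHeight = map(mouseY, 0, height, 2, 30);
    for (var i = 0; i < line1.length; i++) {
        var y = 100 + sin(frameCount * 0.1 + i * 0.4) * waveHeight;
        text(line1.charAt(i), 20 + i * 15, y);
    }
}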

We were inspired by the work of programmer and poet Allison Parrish, who creates a lot of work involving interactive text and generative poetry.

Some sketches and storyboards for the ways users can interact with the lines of text

Mimi Jiao – Project 12 Proposal – Section E

Sophia Kim and I plan on collaborating for this final project. We want to further explore interactive sound implementation and WEBGL. We started off by exploring existing work and discovered code that integrates sound and visuals, using the frequency and amplitude of imported songs to alter an imported image. From looking at how static imported images are altered based on sound in this video, we branched out from static images to more dynamic, generative shapes through WEBGL. We found this interactive particle equalizer really interesting and cool, and we definitely want to play around with geometries. Since our current skill sets are not developed enough to create something as complicated and fleshed out as that particle equalizer, we want to stay confined to shapes like ellipses, boxes, and custom shapes generated by basic math functions like sin, cos, and tan. From this, we want to play with the idea of engaging multiple human senses to create an experience: the mix of visuals and audio gives the audience a more heightened experience, and the sound can make the visuals easier to comprehend. In a way, the visuals will almost be a method of data visualization of the structure of the song.
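As a rough starting point within that scope, here is a hedged sketch assuming p5.js with the p5.sound library: a p5.FFT object reports the energy in a song's bass and treble bands, and those numbers drive the size and spin of a simple WEBGL box. "song.mp3" is a placeholder file, and the particular mapping is an assumption, not a final design.

var song;
var fft;

function preload() {
    song = loadSound("song.mp3");        // placeholder track
}

function setup() {
    createCanvas(600, 600, WEBGL);
    fft = new p5.FFT();
    song.loop();
}

function draw() {
    background(0);
    fft.analyze();
    var bass   = fft.getEnergy("bass");      // 0..255
    var treble = fft.getEnergy("treble");

    // bass drives the size of the box, treble nudges its spin
    rotateX(frameCount * 0.01 + treble * 0.001);
    rotateY(frameCount * 0.01);
    normalMaterial();
    box(100 + bass);
}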

Sophia Kim – Project 12 – Proposal – Sec C

Mimi Jiao and I plan on collaborating for this final project. We want to further explore interactive sound implementation and WEBGL. We started off by exploring existing work and discovered code that integrates sound and visuals, using the frequency and amplitude of imported songs to alter an imported image. From looking at how static imported images are altered based on sound in this video, we branched out from static images to more dynamic, generative shapes through WEBGL. We found this interactive particle equalizer really interesting and cool, and we definitely want to play around with geometries. Since our current skill sets are not developed enough to create something as complicated and fleshed out as that particle equalizer, we want to stay confined to shapes like ellipses, boxes, and custom shapes generated by basic math functions like sin, cos, and tan. From this, we want to play with the idea of engaging multiple human senses to create an experience: the mix of visuals and audio gives the audience a more heightened experience, and the sound can make the visuals easier to comprehend. In a way, the visuals will almost be a method of data visualization of the structure of the song.
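Along the same lines, and staying within the sin/cos/tan scope described above, a custom radial shape can be distorted by the song's amplitude via p5.Amplitude (again a hedged p5.js sketch with a placeholder "song.mp3" file, not a final implementation).

var song;
var amp;

function preload() {
    song = loadSound("song.mp3");          // placeholder track
}

function setup() {
    createCanvas(600, 600);
    amp = new p5.Amplitude();
    song.loop();
}

function draw() {
    background(0);
    var level = amp.getLevel();            // current loudness, 0..1
    translate(width / 2, height / 2);
    noFill();
    stroke(255);
    // a radial shape whose bumps grow with the music's amplitude
    beginShape();
    for (var a = 0; a < TWO_PI; a += 0.1) {
        var r = 150 + sin(a * 6 + frameCount * 0.05) * 200 * level;
        vertex(r * cos(a), r * sin(a));
    }
    endShape(CLOSE);
}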