Sarah Yae – Looking Outwards 12 – Section B

Regarding the project “Drum Kit” (http://ronwinter.tv/drums.html), this is a computational, interactive music project that plays a drum sound corresponding to each specific key on the keyboard. I admire that it sticks to a single theme (drums), but I feel like there could have been an overall volume control for the program, so that users could play background music and use “Drum Kit” at the same time.
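As a rough idea of how the key-to-sample mapping (and the overall volume control I wish it had) could work in p5.js with p5.sound, here is a small sketch; the sample file names and key choices are placeholders, not the actual “Drum Kit” assets:

```javascript
// Rough p5.js + p5.sound sketch of a key-to-drum-sample mapping,
// plus the overall volume slider suggested above.
// The .wav file names are hypothetical placeholders.
let sounds = {};
let volSlider;

function preload() {
    sounds['a'] = loadSound('kick.wav');
    sounds['s'] = loadSound('snare.wav');
    sounds['d'] = loadSound('hihat.wav');
}

function setup() {
    createCanvas(200, 100);
    volSlider = createSlider(0, 1, 0.5, 0.01); // global volume control
}

function draw() {
    background(40);
    masterVolume(volSlider.value()); // scale every sample by the slider
}

function keyPressed() {
    if (sounds[key]) {
        sounds[key].play(); // each key triggers its own drum sample
    }
}
```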

Drum Kit

Regarding the project “Rave” (https://rave.dj/mix), this is a website that allows users to mash up two songs that they upload. It’s pretty cool in that it generates a smooth mashup, but it could be improved by letting users adjust the mashup themselves instead of having it generated entirely automatically.

Rave

The two projects above are very different from each other even though both use music to create something new. Rave focuses on combining existing songs into a new song, while Drum Kit focuses on combining individual sounds into a song.

Hannah Cai—Project 12—Proposal

For my project, I’m planning to do some kind of interactive, generative audio visualization. My most concrete idea as of now is to create a field of particles in WEBGL, with each particle having a sine wave of a random frequency embedded in it. Users would be able to “look around” the viewing field with orbit control, hover over a particle to hear its pitch, and click and drag between particles to form “links.” Linked particles would emit their pitches, allowing the user to build “constellations” of harmonies. I want the particles to feel dynamic and alive, so I’ll probably add some sort of noise to their movement, as well as create reactive interactions, such as having a particle glow when the mouse is over it or when it is linked. This idea will probably require the use of objects, which are one of my weak points; hopefully, completing this project will strengthen my familiarity with them. I’m also almost completely unfamiliar with WEBGL, but I’ve been interested in it for a while, so this will be a good opportunity to explore it.
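To test the core hover-to-hear interaction, here is a rough 2D stand-in, assuming p5.sound: one oscillator per particle, faded in when the mouse is near it. The real version would live in WEBGL with orbit control, and the particle count and frequency range here are arbitrary placeholders:

```javascript
// Minimal 2D stand-in for the hover-to-hear-a-pitch idea, assuming p5.sound.
let particles = [];

function setup() {
    createCanvas(400, 400);
    for (let i = 0; i < 12; i++) {
        let osc = new p5.Oscillator();
        osc.setType('sine');
        osc.freq(random(200, 800)); // each particle gets a random pitch
        osc.amp(0);                 // start silent
        osc.start();
        particles.push({x: random(width), y: random(height), osc: osc});
    }
}

function draw() {
    background(10);
    noStroke();
    for (let p of particles) {
        let hovered = dist(mouseX, mouseY, p.x, p.y) < 15;
        p.osc.amp(hovered ? 0.2 : 0, 0.1); // fade the pitch in/out on hover
        fill(hovered ? 255 : 120);          // "glow" when hovered
        ellipse(p.x, p.y, hovered ? 12 : 8);
    }
}

function mousePressed() {
    userStartAudio(); // most browsers require a user gesture before audio plays
}
```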

If I have time, I’d want to take this idea further and try to gamify it in some way, as well as add more recognizable visuals to it (for example, having the particles form a night sky above a hill).

Elena Deng – Final Project Proposal

For this project, Dani Delgado (from Section E) and I wanted to collaborate and create something that synthesizes user interaction, sound, and generative visuals. Our plan is to start by displaying a still image (we are thinking of a graphic cube) that changes a little bit when the user presses certain keys. The whole interaction would work as follows: each key has an assigned sound and “cube distortion” to it. So, when the user presses a key, a note will play and the visual will change slightly, so that by the time the user has played a full “song,” the visual will have morphed quite a lot. One idea we had was that lower pitches would make the visual retain the change they produced, while higher pitches would make the visual return to the state it was in before the key was pressed.
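A loose sketch of how the key → note + cube-distortion structure might look in p5.js with p5.sound; the key-to-pitch map and the distortion amounts are made up for illustration, not decisions we have made:

```javascript
// Each key press plays a note and warps the cube a little more.
let osc;
let wobble = 0;                            // accumulated distortion
let notes = {a: 60, s: 62, d: 64, f: 65};  // hypothetical key -> MIDI note map

function setup() {
    createCanvas(400, 400, WEBGL);
    osc = new p5.Oscillator();
    osc.setType('triangle');
    osc.amp(0);
    osc.start();
}

function draw() {
    background(20);
    rotateY(frameCount * 0.01);
    // the more the user has "played", the more the cube is distorted
    let d = sin(frameCount * 0.1) * wobble;
    box(100 + d, 100 - d, 100 + d * 0.5);
}

function keyPressed() {
    if (notes[key] !== undefined) {
        osc.freq(midiToFreq(notes[key]));
        osc.amp(0.3, 0.05);       // note on
        osc.amp(0, 0.4, 0.15);    // fade back out shortly after
        wobble += 5;              // each key press warps the cube a bit more
    }
}
```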

Sketches depicting different stages of cube

Cube drawing!

Dani Delgado – Project Proposal

For this project, we (Elena Deng and Dani Delgado) wanted to collaborate and create something that synthesizes user interaction, sound, and generative visuals in order to create an interesting digital experience. Our plan is to start by displaying a still image (we are thinking of it being a graphic cube, but are currently entertaining other possibilities) that changes a little bit when the user presses certain keys. The whole interaction would work as follows: each key has an assigned sound and “cube distortion” to it. So, when the user presses a key, a note will play and the visual will change slightly, so that by the time the user has played a full “song,” the visual will have morphed quite a lot. One idea we had was that lower pitches would make the visual retain the change they produced, while higher pitches would make the visual return to the state it was in before the key was pressed.
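To make the “low notes stick, high notes spring back” behavior concrete, here is a minimal sketch; the pitch threshold, decay rate, and key map are placeholder values, not part of our actual design:

```javascript
// Low pitches add distortion that stays; high pitches add distortion
// that decays back toward the original cube.
let permanent = 0;  // distortion that stays (from low pitches)
let temporary = 0;  // distortion that decays back (from high pitches)
let notes = {z: 48, x: 50, c: 72, v: 74}; // hypothetical key -> MIDI note map

function setup() {
    createCanvas(400, 400, WEBGL);
}

function draw() {
    background(30);
    temporary *= 0.95;            // high-pitch distortion relaxes each frame
    rotateX(frameCount * 0.01);
    rotateY(frameCount * 0.013);
    let d = permanent + temporary;
    box(100 + d, 100, 100 - d);
}

function keyPressed() {
    let midi = notes[key];
    if (midi === undefined) return;
    if (midi < 60) {
        permanent += 8;   // lower pitch: the cube keeps this change
    } else {
        temporary += 20;  // higher pitch: the cube snaps back over time
    }
}
```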

A possible way for the cube to change
another possible way the cube would change

Looking Outward 12, Erin Fuller

The first project I found, the Orbicular Geode Puzzle, was made by Nervous System. It is a computer-generated puzzle representing a slice of an algorithmic geode. Each puzzle is unique, emerging from a computer simulation that creates natural variations in the shape, the pieces, and the image. The way the puzzle is cut, in a dense, maze-like pattern with extreme intertwining and a high piece count, makes it extremely hard to solve.

Orbicular Geode Puzzle, Logic

It is similar to my final project because I am making a generated maze. Though I probably won’t be able to make something as complex as this, it is a nice example of how beautiful a generated puzzle can be.
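Just to convince myself a generated maze is doable, here is a tiny p5.js sketch using the “binary tree” algorithm (each cell opens either its north or its west wall at random). It is far simpler than what Nervous System does, but it still produces a valid maze; the grid size is arbitrary:

```javascript
const CELL = 20;
const COLS = 20;
const ROWS = 20;

function setup() {
    createCanvas(COLS * CELL + 1, ROWS * CELL + 1);
    background(255);
    stroke(0);
    noLoop(); // generate one maze and stop
}

function draw() {
    for (let r = 0; r < ROWS; r++) {
        for (let c = 0; c < COLS; c++) {
            let x = c * CELL;
            let y = r * CELL;
            // each cell opens either its north or its west wall at random;
            // edge cells have no choice, and the top-left cell opens neither
            let openNorth;
            if (r === 0 && c === 0) openNorth = null;
            else if (r === 0) openNorth = false;
            else if (c === 0) openNorth = true;
            else openNorth = random() < 0.5;

            if (openNorth !== true) line(x, y, x + CELL, y);  // keep north wall
            if (openNorth !== false) line(x, y, x, y + CELL); // keep west wall
        }
    }
    // outer south and east boundary
    line(0, ROWS * CELL, COLS * CELL, ROWS * CELL);
    line(COLS * CELL, 0, COLS * CELL, ROWS * CELL);
}
```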

Orbicular Geode Puzzle

My second project, while not highly technical, explores mazes further. Robert Morris, an American sculptor, conceptual artist, and writer, is regarded as one of the most prominent theorists of Minimalism. His Philadelphia Labyrinth was a site-specific art installation. Although it is a contemporary work, I think it’s interesting how long labyrinths, mazes, and puzzles have been part of the human consciousness, going back to ancient times.

Philadelphia Labyrinth, 1974


Anthony Ra – Looking Outwards 12

I already briefly mentioned the two projects I want to talk about in my proposal, so here I will go more in-depth on them.

The first project is the Super Mario series of games by Shigeru Miyamoto. What inspires me is the simplicity of the character design, the landscape, the premise, and the way it is coded, yet the games are popular with a wide audience around the world.

The part of this video I want to emphasize is the second aspect of Miyamoto’s “genius,” which is simplicity. With subtle design and positioning, the game doesn’t need written instructions (like the words I’m writing right now) to teach the player how to play. With minimal objects the user can see which direction to go and what the obstacles are; in Mario, the goombas and koopas move toward the player, signaling that they are harmful.

Flappy Bird

The second precedent is a game with similarly simple graphics, “Flappy Bird,” developed by dotGears. What makes this game intriguing is that parts of its code work in ways very similar to what I have learned this semester. The most obvious connection is the generative landscape assignment, where copies of the same object at various sizes and heights move from right to left, giving the illusion that the static character is moving from left to right. By also implementing changes in velocity and a mouse-press function, I think I can create a simple game that uses the subtle design techniques of both of these projects and is detailed and compelling enough for a suitable final project.
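As a quick proof of concept for those two mechanics, the scrolling obstacles and the velocity/flap input, here is a bare-bones p5.js sketch; all the numbers (gravity, gap size, speeds) are guesses:

```javascript
// Pipes scroll right-to-left (the generative-landscape trick) while a
// bird's velocity is pulled down by gravity and kicked up by a press.
let birdY = 200;
let velocity = 0;
let pipes = [];

function setup() {
    createCanvas(400, 400);
}

function draw() {
    background(120, 190, 230);
    // gravity acts on the static-looking bird
    velocity += 0.4;
    birdY += velocity;
    fill(255, 220, 0);
    ellipse(80, birdY, 25, 25);

    // spawn a new pipe with a random gap every 90 frames
    if (frameCount % 90 === 0) {
        pipes.push({x: width, gapY: random(80, height - 80)});
    }
    fill(60, 160, 60);
    for (let p of pipes) {
        p.x -= 2; // moving the obstacles left reads as the bird moving right
        rect(p.x, 0, 40, p.gapY - 60);
        rect(p.x, p.gapY + 60, 40, height);
    }
    pipes = pipes.filter(p => p.x > -40); // drop pipes once they leave the screen
}

function mousePressed() {
    velocity = -7; // each press/flap overrides the downward velocity
}
```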

project

I am interested in creating an interactive animation. I was inspired by the Patatap project, where the user plays the keyboard as an instrument while also making an animation. I would like to use different sounds and create a different character for each key, building up a chorus of made-up animated characters. I am interested in making this an interactive animation using sound, but I am also interested in making it more “game-like.” I was thinking that once the user has pressed all the keys and revealed the entire chorus of characters, this could unlock phase two of the “game” or interactive animation, where you can control the different noises or songs the chorus sings, or the dance moves they all perform together.
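A rough sketch of the “unlock phase two once every key has been pressed” idea; here the chorus is just the letters a to z, which would be replaced by sounds and animated characters:

```javascript
// Track which keys have been pressed; when all of them have been,
// flip into "phase two".
let pressed = new Set();
const ALL_KEYS = 'abcdefghijklmnopqrstuvwxyz';
let phaseTwo = false;

function setup() {
    createCanvas(520, 200);
    textAlign(CENTER, CENTER);
    textSize(24);
}

function draw() {
    background(phaseTwo ? 230 : 30);
    for (let i = 0; i < ALL_KEYS.length; i++) {
        let k = ALL_KEYS[i];
        fill(pressed.has(k) ? color(255, 120, 160) : 80); // revealed vs hidden
        text(k, 20 + i * 19, height / 2);
    }
}

function keyPressed() {
    let k = key.toLowerCase();
    if (ALL_KEYS.includes(k)) {
        pressed.add(k);               // here a sound + character would trigger
        if (pressed.size === ALL_KEYS.length) {
            phaseTwo = true;          // whole chorus revealed: unlock phase two
        }
    }
}
```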

looking outwards

I am interested in an interactive project. I am inspired by the James Cameron Avatar Exhibition made in 2011 and by Connected Worlds. The Avatar exhibition uses an interactive projection screen where visitors enter Pandora’s climate and socialize with the characters. Luminescent floating jellyfish creatures glide through the planet’s forest, and glowing woodsprites appear when the audience moves suddenly, making the experience feel real, like an act-and-react interaction. Connected Worlds is an interactive connected ecosystem by Design I/O in collaboration with the New York Hall of Science. It is a large-scale, immersive installation composed of six interactive ecosystems spread throughout the Hall of Science. Visitors can use physical logs to direct water from a waterfall flowing across the floor into the different environments, and can use their hands to plant seeds. Plants grow and creatures appear based on the state of the environment (shaped by the visitors), which causes a chain reaction of behaviors.


http://design-io.com/projects/ConnectedWorlds/

https://www.snibbe.com/education-entertainment#/james-cameron-avatar-exhibition/

Hannah Cai—Looking Outwards—12

I was inspired by these projects:


For my project, I was originally thinking of just doing something like the first video, which would basically just be integrating sound into my generative landscape. However, I wanted to do something with generative audio, not just a simple audiovisual like the glowing dot, which I personally don’t find very exciting. I then thought of doing something with particles. Ideally, my end product’s visuals would be something like the second video, with particles that feel dynamic and “alive.” However, instead of just being static recorded segments, I want my project to be interactive, reacting to mouse position or something like that. I also want the user to be able to interact with the view/perspective, so I’m thinking about using WEBGL and orbit control.
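As a first test of that look, here is a rough WEBGL sketch with orbit control and noise-driven particle drift (no audio yet; the particle count and speeds are arbitrary):

```javascript
// A WEBGL particle field: drag to look around with orbitControl(),
// while noise() nudges each particle so the motion feels "alive".
let particles = [];

function setup() {
    createCanvas(500, 500, WEBGL);
    for (let i = 0; i < 200; i++) {
        particles.push({
            x: random(-200, 200),
            y: random(-200, 200),
            z: random(-200, 200),
            seed: random(1000) // gives each particle its own noise path
        });
    }
}

function draw() {
    background(0);
    orbitControl(); // drag to look around the field
    noStroke();
    fill(255);
    let t = frameCount * 0.01;
    for (let p of particles) {
        // noise-driven drift keeps the motion smooth but non-repeating
        let dx = (noise(p.seed, t) - 0.5) * 2;
        let dy = (noise(p.seed + 50, t) - 0.5) * 2;
        let dz = (noise(p.seed + 100, t) - 0.5) * 2;
        push();
        translate(p.x + dx * 30, p.y + dy * 30, p.z + dz * 30);
        sphere(2);
        pop();
    }
}
```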

Alexandra Kaplan – Project 12 – Proposal

For my final project, I am planning on making an interactive TV set. It will come with pre-loaded animations, images, and gifs that can be flipped through by pressing the channel changer on the TV. I think it could also be cool if the animations were different depending on the time of day, much like real television programming; I would split the day into four 6-hour blocks, each with its own set of animations. I also want to incorporate an on/off button to turn the TV ‘off.’ If I have time, I would love to use the computer’s camera to show people’s faces “reflected” on the TV screen when it is off, like the glass screens of real television sets. Also, if I have time, it would be fun to have different sounds for the different buttons on the TV.
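A quick sketch of the two bits of logic involved, splitting the day into four 6-hour blocks with hour() and flipping channels; here the “programs” are just text placeholders and key presses stand in for the on-screen buttons:

```javascript
// Pick a daypart from the clock, flip channels with 'c', toggle power with 'o'.
let channel = 0;
const CHANNELS = 3; // placeholder channel count
let tvOn = true;

function setup() {
    createCanvas(400, 300);
    textAlign(CENTER, CENTER);
    textSize(20);
}

function draw() {
    background(50);
    fill(0);
    rect(50, 30, 300, 220);          // the screen
    if (!tvOn) return;               // "off" just leaves the screen black
    let daypart = floor(hour() / 6); // 0: night, 1: morning, 2: afternoon, 3: evening
    fill(255);
    // each (daypart, channel) pair would get its own animation or gif
    text('daypart ' + daypart + ', channel ' + channel, width / 2, 140);
}

function keyPressed() {
    if (key === 'c') channel = (channel + 1) % CHANNELS; // channel changer
    if (key === 'o') tvOn = !tvOn;                       // on/off button
}
```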

Project Proposal Sketch