Christine Seo – Looking Outwards 12

I was very inspired by the ideas of Real Slow, a project that visualizes music, as well as AV Clash, a project that builds audiovisual compositions from sound and sound effects. Throughout the course, my favorite parts were when we had to incorporate the camera and when we had to use music in our assignments, so I decided to look for inspirations that interact with these two topics. Both projects have very interesting visuals that react to sounds and music. In Real Slow, face detection allows the components of the face to react to the music and its volume, which I found very fascinating. This interactive project was originally inspired by the music of the Australian indie-electronic band Miami Horror: the artist wanted to create a program that fit the mood and tone of their electro track “Real Slow”. As I researched, I found out that the first prototype was developed in openFrameworks, and the idea was then implemented on the web using a JavaScript face-tracking library called clmtrackr and p5.js for the sound visualization.

For AV Clash, I thought it was interesting how the objects interact with one another in response to the sounds. The project allows the creation of audiovisual compositions consisting of combinations of sound and audio-reactive animation loops. Objects can be dragged and thrown, creating interactions (clashes) between them. Both of these projects not only involve interaction between the music and the program, but also interaction with the audience, whether through face detection or mouse input.

Caption: Video documentation of Real Slow (2015), a Face Sound Visualization with music by Nithi Prasanpanich

http://prasanpanich.com/2016/01/01/real-slow/

Caption: Video representation of AV Clash (2010), a project by Video Jack, that creates audiovisual compositions, consisting of combinations of sound and audio-reactive animation loops

https://www.creativeapplications.net/flash/av-clash-flash-webapp-sound/

 

Jessica Timczyk – Looking Outwards 12

A screenshot from the 3D interactive doodle
A screenshot of the first shot of Solace, an interactive animation.

In preparation for making my own final project, which will be an interactive animation, I researched other interactive animations and found these two very interesting projects. The first is ‘Back to the Moon’, a VR and 3D interactive animated doodle by Fx Goby and Google Doodle artist Hélène Leroux, made in 2018 as a tribute to filmmaker Georges Méliès. The doodle allows the viewer to move around and look at the animated environment in 360 degrees. The second project, ‘Solace’, is an interactive film created in 2017 by Evan Boehm and Jeff Noon in collaboration with Nexus Interactive Studios. It takes the viewer through a narrated story in which the viewer can interact, with the mouse, with almost all of the shapes and pictures that make up the story. Unlike the first animation, ‘Solace’ lets the viewer actually interact with the images and shapes that form the characters and other elements of the story, whereas ‘Back to the Moon’ only allows the viewer to move around within the frame. Though I really enjoy the interactivity of ‘Solace’, I am personally partial to the 3D environment of ‘Back to the Moon’ and the story line that accompanies it. I think that ‘Back to the Moon’, though very innovative in itself, missed an opportunity to make the story line and images themselves interactive. Overall, I very much enjoyed both. You can play with both of them in the link or video below.

 

Week 12 – Project Proposal Sara Frankel

For my project, I would like to combine a game with music. I have always loved projects that interact with music. I would be lying if I said I know exactly how my project will look right now, but I am thinking of an animation that is coordinated with a soundtrack: color tones match the “mood” of the piece, and shapes represent its different musical aspects. In addition, I want to take advantage of the user’s keyboard so that they can play a part in the visual experience of the sound. I attached an image of the sketch that I am envisioning. As you can see below, my plan is to have an abstract central image correlated with the piece in both color and shape; this shape will change according to the frequencies emitted by the song. Scattered around this main shape will be simpler shapes that “pulse” and rotate according to different aspects of the piece. The user will be able to play around with all of these shapes and change the song and aspects of it based on the color they choose. This project connects color, sound, and shapes to display music.
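One minimal sketch of the frequency-to-shape idea, in plain JavaScript with hypothetical names: p5.sound’s FFT returns an array of bin amplitudes in the range 0–255, and a helper like this could map a band of that spectrum to a radius for the central shape. This is an illustration under those assumptions, not a fixed design.

```javascript
// Average the amplitude of a band of FFT bins (p5.sound amplitudes run 0-255).
function bandEnergy(spectrum, lo, hi) {
  let sum = 0;
  for (let i = lo; i < hi; i++) sum += spectrum[i];
  return sum / (hi - lo);
}

// Map low-frequency energy to a radius: louder bass pushes the shape outward.
function radiusFromSpectrum(spectrum, baseRadius, scale) {
  const energy = bandEnergy(spectrum, 0, Math.floor(spectrum.length / 4));
  return baseRadius + (energy / 255) * scale;
}
```

In a p5.js draw loop, the spectrum would come from something like `fft.analyze()` each frame, and the returned radius would drive an `ellipse()` or a custom vertex shape.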

Kevin Thies – Project Proposal

An example room with enemies

My proposal for a final project would be a top-down view dungeon crawler game. I found that I was interested in using keyboard inputs as controls, and the natural development from there is a game.

The white circle is a stand-in for the character, and the red circle is where the mouse is aiming

The player would be a wizard, armed with a staff firing magic missiles (I think turtles would be good for a meandering projectile), going room to room vanquishing skeletons. I think rooms would all be square but could spawn with doors. I could make a system to generate rooms and store their data, starting with a random number of doors that decreases until rooms can no longer spawn doors linking to new rooms. The last room would contain the goal, possibly a pile of treasure. If time becomes an issue, the scope of the project could shrink to a single room with waves of enemies, where the goal is to last as long as you can.
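The door-decreasing generation idea could be sketched roughly like this (hypothetical JavaScript, not the actual project code): each new room inherits a door budget one smaller than its parent’s, so the dungeon is guaranteed to stop growing. The injectable `rng` is just so the layout can be tested deterministically.

```javascript
// Generate square rooms on a grid. Each room spends its door budget opening
// doors to new rooms; children get the remaining budget, so it shrinks to zero.
function generateDungeon(startDoors, rng) {
  const dirs = [[0, 1], [0, -1], [1, 0], [-1, 0]];
  const rooms = new Map();                           // "x,y" -> door budget at creation
  const queue = [{ x: 0, y: 0, doors: startDoors }];
  rooms.set("0,0", startDoors);
  while (queue.length > 0) {
    const room = queue.shift();
    for (const [dx, dy] of dirs) {
      if (room.doors <= 0) break;                    // budget spent, room is a dead end
      const key = `${room.x + dx},${room.y + dy}`;
      if (!rooms.has(key) && rng() < 0.5) {          // maybe open a door in this wall
        room.doors -= 1;
        rooms.set(key, room.doors);
        queue.push({ x: room.x + dx, y: room.y + dy, doors: room.doors });
      }
    }
  }
  return rooms;
}
```

A room whose budget reaches zero is a candidate for the treasure room at the end.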

The main inspiration was a Scratch project a friend and I built in our early high school days.

Looking Outwards – 12

The Space Between Us, Santa Monica, CA. 2013
The Chronarium Sleep Lab, The Cathay, Singapore. 2015

The project The Space Between Us is by Janet Echelman, and The Chronarium is by Rachel Wingfield. Both of these projects are similar in that they are human-centered and designed for interaction, creating an experience throughout. I found these projects interesting because they take different approaches to similar concepts. Echelman’s project is situated in an open environment, while Wingfield’s is part of an enclosed space. While the two differ physically, both incorporate lighting and sound to create an immersive audiovisual experience for their audience. Both projects also involve some physical change to the environment by the audience. In The Space Between Us, the audience carved indentations in the sand so that they could sit comfortably and look up at the aerial sculpture. In The Chronarium, the audience lies inside a textile canopy, changing the shape and form of the envelope as they move and turn to find the most comfortable position to rest or sleep.

John Legelis – Final Project Proposal

The goal for my final project is to make an oscillator/synthesizer with a record and playback function. The user interface will resemble the following sketch.

The components of this software are a keyboard and a drum machine. I will aim to automatically quantize the recorded rhythms in order to give the playback a better sound and smooth over difficulties such as keyboard delay.

The keyboard functionality will be mapped to actual keys on the user’s computer, and the pitch-bend function will be actuated by the mouse. The drum pad will be actuated in a similar way, using the number keys. I am a bit apprehensive about how many levels of interaction the user is intended to have in the end, in terms of clickable buttons and so on, but I hope this product will be fully functional as a fun musical instrument.
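The quantization step could work roughly like this (a hedged sketch with illustrative names, not the final design): given the track’s BPM and a grid resolution, snap each recorded timestamp to the nearest grid line, which also absorbs small keyboard delays.

```javascript
// Snap a recorded note time (ms) to the nearest grid line, e.g. 16th notes.
function quantize(timeMs, bpm, subdivisionsPerBeat = 4) {
  const beatMs = 60000 / bpm;                  // one beat in milliseconds
  const gridMs = beatMs / subdivisionsPerBeat; // one grid step
  return Math.round(timeMs / gridMs) * gridMs;
}
```

At 120 BPM a beat is 500 ms, so with 16th-note subdivisions every note lands on a multiple of 125 ms during playback.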

Joanne Lee – Final Project Proposal

I want to create an educational Rube Goldberg game geared towards children ages 5-9. I think that Rube Goldberg games are very helpful in learning cause and effect. I would like to use my virtual Rube Goldberg machine to complete ‘green’ tasks such as watering a plant, turning off the lights, recycling, etc. The purpose of the game is to teach children to understand cause and effect better and also understand the impact we can make when we go green!

In terms of functionality, I’ll have a virtual Rube Goldberg machine with certain parts that are broken. Through trial and error, players will have to pass three stages, each dedicated to a green task. I’m still deciding which tasks I want to use. Once they pass a level, they’ll be shown a blurb about the environmental impact that the stage’s task has on the environment. Below is a rough mock-up of what a level could look like.

In the actual one, I will include images to replace some aspects. I just used rectangles and circles to roughly represent a stage.

Jamie Dorst Looking Outward 12

For this week’s looking outward post, I found two different projects that have inspired my final project. The first one is a travel planner by Stamen Design that helps you plan a road trip, but also tells you what the weather is like along the way so you can plan better. After making your plan, you can drag around the stops on your trip and see how the weather changes if you take a different route. I think this is a good way to interact with weather and make the basic numbers more understandable and approachable.

An image of the trip showing the weather along the way
The other project I found was My Daily Color Palette by Jacobo Zanella. He made an image every day for the entirety of 2010 showing the color palette of his clothes and how much skin was showing. I thought this was a really cool project that would let you see patterns in what you wear and see how those patterns change as the year goes by.
These are all of his color palettes for the month of May in 2010

Jamie Dorst Project Proposal

For this project, I had the idea to create some sort of weather app, but in addition to telling the forecast, it will suggest what to wear based on inputs you’ve given in the past. I’m not totally sure if this project will work, because it seems pretty complicated, involving some sort of basic machine learning and a weather API, but I’m interested to see if I could get it working on some basic level. I know this is something I’d love to have, and even though my final product may not be perfect, I think it’ll teach me a lot. The basic things I’ll need are a way for the user to input data (what they wore and whether it worked well), a way to store, retrieve, and display that data later, and a forecast display.

A simple image of what I imagine my project looking like.
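At its most basic level, the suggestion logic could be a nearest-neighbor lookup over past entries rather than full machine learning. This is a hypothetical sketch with assumed field names (`temp`, `outfit`, `good`), not a committed design: keep the entries the user rated as good, and return the outfit recorded at the temperature closest to the forecast.

```javascript
// Suggest an outfit from past entries the user marked as good,
// picking the one recorded at the temperature nearest the forecast.
function suggestOutfit(history, forecastTemp) {
  const good = history.filter(entry => entry.good);
  if (good.length === 0) return null;          // nothing to learn from yet
  let best = good[0];
  for (const entry of good) {
    if (Math.abs(entry.temp - forecastTemp) < Math.abs(best.temp - forecastTemp)) {
      best = entry;
    }
  }
  return best.outfit;
}
```

The history itself could live in the browser’s local storage, with the forecast temperature coming from whichever weather API the app ends up using.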

Justin Yook – Project Proposal

General idea of project

In the world of dance, concepts and formations are very important because they heavily influence how the choreography looks and feels as a whole. But this part of the choreography process can be difficult because there are many factors involved. My final project is called Stage Master, a tool that choreographers can use to work out formations and staging ideas in an organized manner. Users will first drag and drop an audio file into the program. The program will then break the audio down into sets of eight counts after computing the BPM, or tempo. Next, users can add as many dancers to the canvas as they want by clicking the mouse, and move each dancer around by dragging with the mouse. Every formation will be linked to a corresponding count within a set of eight. A convenient feature is that users can play the music while they arrange the dancers. After finishing all the staging ideas, the program can export screenshots of the formations compiled into a PDF file.
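Once the BPM has been estimated, splitting the song into sets of eight counts is simple arithmetic. Here is a minimal sketch (illustrative names, assuming the BPM is already known) that returns the start time of each eight-count, which formations could then be keyed to:

```javascript
// Given a song's length and BPM, return the start time (in seconds)
// of each set of eight counts, where one count equals one beat.
function eightCountStarts(durationSec, bpm) {
  const beatSec = 60 / bpm;        // one count
  const eightSec = 8 * beatSec;    // one set of eight
  const starts = [];
  for (let t = 0; t < durationSec; t += eightSec) starts.push(t);
  return starts;
}
```

At 120 BPM a count is half a second, so a new eight-count begins every four seconds; scrubbing the music to `starts[i]` would show the formation saved for that set.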