Sarah Choi – Project 12 – Proposal

Gretchen Kupferschmid and I want to collaborate on an interactive 3D map of Pittsburgh that pairs different areas of the city with ambient sounds and highlights the restaurants and coffee shops in each one. Similar to the photograph included below, we want to show contrasting places in Pittsburgh such as Lawrenceville, Squirrel Hill, Oakland, and Downtown. Through an interactive map, our audience could click through these sections to find not only various places to eat and spend time, but also music that we feel matches the atmosphere of each part of the city.

We both strongly believe it’s important to explore the areas around you, especially since Pittsburgh has so much to offer, and many students go through their time here without ever venturing beyond campus.

To divide the work evenly, one of us would focus on building the 3D map of the different areas of Pittsburgh, while the other would make the map interactive, adding the different places across the city along with the music that accompanies them.
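To make the interaction concrete, here is a rough p5.js sketch of how clicking a neighborhood could be detected. The rectangle coordinates and the `regionAt` helper are placeholders of ours, not the final map geometry, and the sound swap is only indicated in a comment.

```javascript
// Hypothetical neighborhood regions; real map shapes and coordinates TBD.
var regions = [
    { name: "Lawrenceville", x: 0,   y: 0, w: 160, h: 200 },
    { name: "Squirrel Hill", x: 160, y: 0, w: 160, h: 200 },
    { name: "Oakland",       x: 320, y: 0, w: 160, h: 200 },
    { name: "Downtown",      x: 480, y: 0, w: 160, h: 200 }
];

// Pure hit-test: which region (if any) contains the point (x, y)?
function regionAt(x, y, regions) {
    for (var i = 0; i < regions.length; i++) {
        var r = regions[i];
        if (x >= r.x && x < r.x + r.w && y >= r.y && y < r.y + r.h) {
            return r.name;
        }
    }
    return null;
}

// p5.js glue (ambient tracks would be loaded in preload with p5.sound):
function mousePressed() {
    var name = regionAt(mouseX, mouseY, regions);
    if (name !== null) {
        // stop the current ambient track and play the one for `name`
    }
}
```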

Ammar Hassonjee – Project 12 – Final Project Proposal

For the final project, I plan to collaborate with Lee Chu to make a series of interactive camera scenes that use the webcam as a background, similar in concept to Snapchat and Instagram filters. Our idea is that in each state of the canvas, our program will identify the person’s body or head and then draw and animate objects that move relative to the person’s body with some level of randomness. In one scene, we imagine the person repelling an object that moves around the screen with their hand. Another could simply draw objects on top of certain parts of the picture using color analysis. The user will be able to cycle through the different scenes. Through this exploration, we hope to create a fun, interesting program that users can interact with for a while without getting bored.

Possible sketch of one scene animated on top of a webcam.
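The hand-repulsion scene could start from something like the step below, assuming we can estimate a hand position from the webcam pixels. The `repelStep` name and its distance falloff are our own placeholders, not a final design.

```javascript
// One repulsion step: push the object straight away from the hand,
// with a push that weakens as the hand gets farther away.
function repelStep(objX, objY, handX, handY, strength) {
    var dx = objX - handX;
    var dy = objY - handY;
    var d = Math.sqrt(dx * dx + dy * dy);
    if (d === 0) return { x: objX + strength, y: objY }; // avoid divide-by-zero
    var push = strength / d; // weaker when farther from the hand
    return { x: objX + (dx / d) * push, y: objY + (dy / d) * push };
}

// In draw(), after estimating handX/handY from the webcam pixels:
//   var next = repelStep(obj.x, obj.y, handX, handY, 100);
//   obj.x = next.x;
//   obj.y = next.y;
```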

Sarah Choi – Looking Outwards – 12

One project I found online was an interactive map by Sara B. It contained offices, a countryside, a playground, a house, and industries, coded as a 3D model that appears to rise out of a tablet and built from images, shapes, and mouse functions. I admired this project because it gave the audience a better understanding of the layout of an area by showing all the different types of places in it.

https://codepen.io/aomyers/full/LWOwpR

Furthermore, another project I looked at was the website of a hotel in Lower Manhattan called “Sister City”. The website’s ideas of simplicity, purposefulness, and mindful design were powered by Microsoft’s Custom Vision Service, which analyzed elements of the Sister City environment from a camera on the roof. With artificial intelligence, the system recognized aspects of nature such as clouds and birds and triggered specific sounds in the installation. These sounds were generated and played in the hotel lobby, bringing a more natural feel to a very new setting. The ambient music shaped the hotel’s ambiance, creating a more relaxed atmosphere overall. This was a very interesting take and immediately mesmerized me.

https://sistercitynyc.com/

The two projects are very different yet both creative, which is what drew me to them from the very beginning. Both ideas pertain to my final project in the sense that Gretchen Kupferschmid and I want to design a 3D interactive map of Pittsburgh with ambient sounds, showing our favorite restaurants and the little shops we love visiting when looking for reasons to get outside the Carnegie Mellon “bubble”.

Lee Chu – Project 12 – Proposal

I plan to collaborate with Ammar Hassonjee on the final project. We are thinking about creating a webcam overlay that detects faces and projects a filter over them, much like Snapchat or Instagram filters. Ideally, by detecting the colors in the webcam frame, we can identify and track a face. While tracking the face, sounds may be triggered by specific actions, and the entire filter could change dynamically with the user’s movements. We will definitely need to look into how face tracking generally works as well.
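A very rough first pass at color-based tracking, before reaching for a real face-tracking library, is to average the positions of pixels near a target color. This is only a sketch under that assumption: `trackColor` is a placeholder name, and real skin-tone detection would need a better color distance than this plain RGB one.

```javascript
// Estimate a tracked point as the centroid of pixels close to a target
// color. `pixels` is a flat RGBA array, as p5's video.pixels provides
// after loadPixels(); `w` and `h` are the frame dimensions.
function trackColor(pixels, w, h, target, tolerance) {
    var sumX = 0, sumY = 0, count = 0;
    for (var y = 0; y < h; y++) {
        for (var x = 0; x < w; x++) {
            var i = 4 * (y * w + x);
            var dr = pixels[i] - target[0];
            var dg = pixels[i + 1] - target[1];
            var db = pixels[i + 2] - target[2];
            if (dr * dr + dg * dg + db * db < tolerance * tolerance) {
                sumX += x;
                sumY += y;
                count++;
            }
        }
    }
    if (count === 0) return null; // nothing matched this frame
    return { x: sumX / count, y: sumY / count };
}
```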

Mari Kubota – Looking Outwards – 12

For my final project I am creating a climate-change-related game. One relevant project I found is Solar Tapestry by Chloe Uden, an artful way of arranging solar panels to create an aesthetic and ecological form of energy. As Chloe Uden describes it, “the Art and Energy collective re-imagines solar technology as an art material for the future.”

Another project related to my final project is Starlit Stratus by Sunggi Park, the first-place winner of the LAGI 2019 Abu Dhabi competition. It is a large-scale public artwork capable of producing clean energy. The canopy’s design is inspired by origami: sections of the triangular geometry are made from conventional rigid photovoltaic material to produce clean electricity during the day, while other sections are made from fabric that can easily fold and unfold. At night, light passes through these fabric geometries to create star-like patterns.

Both of these artworks use solar panels in creative and artistic ways to produce clean energy. Such aesthetic arrangements of clean-energy technology encourage more people to adopt ecological practices.

Monica Chang – Project 12 – Final Project Proposal

For this final project, I want to develop a game called “Capture Red”. The player, a virtual “photographer”, is responsible for capturing birds with the virtual camera by pressing the spacebar. The problem is that the birds fly by far too fast, yet the player has to capture as many as 100 “photos”. Birds of other colors will also appear, and if the player accidentally captures one, they will lose a point.

My sketch of the default screen.

The default/beginning screen will include a frame that resembles, in an exaggerated way, what the player would see looking through the lens. I will also include field audio to incorporate the sounds of the environment. The lens will be pointed toward the sky, where the player will also see moving clouds (I am also thinking of changing the sky throughout the game with different weather patterns and sounds such as wind, rain, and thunder). UPDATE: I have incorporated rain as a weather element on the default screen, with rainSounds as the background noise. When the player clicks the screen, the weather changes from rainy (gloomy) to blue skies (with clouds).

My sketch of when a bird appears!
When the player/photographer captures the bird.

Whenever the player “captures” a bird, the screen will flash and a camera-shutter sound will play. After the capture, the player will see in the top corner that they have gained a point.

The player gains a point!

Once the player’s “CAPTURED” count reaches 100, the game ends, and the player must refresh to start again.
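The scoring rule above could be sketched as a small helper run on each spacebar press. The field names (`isRed`, the `frame` rectangle) are my own shorthand for the viewfinder and bird described above.

```javascript
// Returns the updated score after a spacebar press: +1 for a red bird
// inside the viewfinder frame, -1 for an off-color bird, unchanged on a miss.
function scoreCapture(bird, frame, score) {
    var inFrame = bird.x > frame.x && bird.x < frame.x + frame.w &&
                  bird.y > frame.y && bird.y < frame.y + frame.h;
    if (!inFrame) return score; // missed shot: no change
    return bird.isRed ? score + 1 : score - 1;
}

// In keyPressed(), when key === ' ':
//   score = scoreCapture(bird, viewfinder, score);
```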

Jasmine Lee – Project 12 – Proposal

For my final project, I’d like to do an interactive audiovisual animation. The cursor would be followed by flowing graphic lines that change color based on which nodes have been activated and what sounds are playing. The intention is to create an animation that users can interact with to relax. Nodes are created by clicking anywhere on the canvas. I envision each node being “invisible” (maybe a white circle on a white background) with the same flowing colorful lines circling around it, for a more stylistic approach. When multiple nodes are activated, they become connected, with the lines starting to weave around all of them instead of just one. The music playing would be various soothing sounds, ranging from white noise to rain to crackling fire.

The canvas without any nodes activated.
The canvas with multiple nodes activated and sound playing.
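The node bookkeeping could start as simply as the sketch below; drawing the flowing lines around each node is a separate problem. The helper names, and the choice to connect consecutive nodes in click order, are placeholders of mine.

```javascript
// Nodes live in a global list so draw() can weave lines around them.
var nodes = [];

// Clicking adds an (invisible) node at the click position.
function addNode(nodes, x, y) {
    nodes.push({ x: x, y: y });
    return nodes;
}

// Pairs of nodes to link with weaving lines: consecutive ones, here.
function nodePairs(nodes) {
    var pairs = [];
    for (var i = 0; i < nodes.length - 1; i++) {
        pairs.push([nodes[i], nodes[i + 1]]);
    }
    return pairs;
}

// p5.js glue:
function mousePressed() {
    addNode(nodes, mouseX, mouseY);
}
```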

Paul Greenway – Project 12 – Proposal

This project will be a collaboration between Shariq Shah and me. We plan to create a dynamic data visualization to represent the patterns of weather and their effects on landscapes. As of now, the weather effects we will focus on are wind and water. These effects will be driven by properties based on real-world data and will react to a topography image map. This map will be a gradient-based height map of a topography and may change when the user interacts with the visualization, subsequently shifting the patterns of the weather effects. We also plan to use color gradients and other properties, such as movement speed and direction, to further convey the environmental effects. Some precedents we have been looking at while developing the design are the works of Refik Anadol, Earth by Cameron Beccario, and Drawing Water by David Wicks.
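One way the water effect could react to the height map is for each particle to step toward its lowest neighboring cell, with heights read from the grayscale gradient image. This is a sketch under that assumption; `lowestNeighbor` and the 2D `heights` grid are hypothetical names, not our final data layout.

```javascript
// Given a grid of heights (e.g. brightness values sampled from a
// grayscale height-map image), find the lowest cell in the 3x3
// neighborhood of (x, y); a water particle would step there.
function lowestNeighbor(heights, x, y) {
    var best = { x: x, y: y, h: heights[y][x] };
    for (var dy = -1; dy <= 1; dy++) {
        for (var dx = -1; dx <= 1; dx++) {
            var ny = y + dy;
            var nx = x + dx;
            if (ny < 0 || ny >= heights.length) continue;      // off the grid
            if (nx < 0 || nx >= heights[0].length) continue;
            if (heights[ny][nx] < best.h) {
                best = { x: nx, y: ny, h: heights[ny][nx] };
            }
        }
    }
    return best; // unchanged position means a local minimum: water pools
}
```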

Sean Meng – Project 12 – Proposal

For my final project, I am interested in making a sound visualization that turns invisible sound into something visible. I intend to incorporate a series of designed visual elements into the project. Creating a scene based on the cover of Kanye West’s album 808s & Heartbreak, the project recomposes the visual features of the album art and displays them based on the sound’s frequency and amplitude (e.g., the heart shatters as the song goes on).
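The shatter effect could be driven by p5.sound’s amplitude analysis: `p5.Amplitude.getLevel()` returns the current level in roughly the 0 to 1 range, which can scale how far each heart shard drifts from its home position. The shard fields and `shardPosition` helper below are my own placeholder design.

```javascript
// Each shard drifts outward along its own angle by an amount scaled
// by the current amplitude level (0..1), so the heart shatters more
// as the song gets louder.
function shardPosition(shard, level) {
    var l = Math.max(0, Math.min(1, level)); // clamp to 0..1
    var d = l * shard.maxDrift;              // drift distance in pixels
    return {
        x: shard.homeX + Math.cos(shard.angle) * d,
        y: shard.homeY + Math.sin(shard.angle) * d
    };
}

// In draw(), with `amp = new p5.Amplitude()` set up on the playing song:
//   var level = amp.getLevel();
//   var pos = shardPosition(shard, level);
//   image(shardImg, pos.x, pos.y);
```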

Jai Sawkar – Project 12: Proposal

This project will be done by Taisei Manheim, Carly Sacco, and myself. Our proposal is a smartphone: the screen would hold different apps, and you could click on each one to do different things. One app would resemble Snapchat and allow users to put a few filters on a picture of themselves (using the computer’s camera). Another would resemble a music app and allow users to choose different songs to play. A third would resemble Instagram and allow users to scroll through images and like certain photos. The last would be a clock app that lets users see the time. We are thinking of working on the Snapchat app together, and then each of us working on one of the three remaining apps.

Sketch of our Idea
Initial Code Output of our Idea

Sketch

// Jai Sawkar
// jsawkar@andrew.cmu.edu
// Section C
// Mock Up


// image variables, declared globally so the draw helpers can use them
var snapPic;
var musicPic;
var instaPic;
var clockBG;

function preload() {
    var snapURL = "https://i.imgur.com/Mc39iWj.png?"; //picture of 4 people
    snapPic = loadImage(snapURL); //sets fam pic as the variable
    var musicURL = "https://i.imgur.com/m6NxUGy.png?";
    musicPic = loadImage(musicURL);
    var instaURL = "https://i.imgur.com/qTYtnyQ.png?";
    instaPic = loadImage(instaURL);
    var clockURL = "https://i.imgur.com/eX2G9P3.png?";
    clockBG = loadImage(clockURL);
}

function setup() {
    createCanvas(640, 400);
}

function draw() {
    background(200);

    //PHONE

    push();
    rectMode(CENTER);

    fill(60, 59, 59);
    noStroke();
    rect(width/2, height/2, 159, 323, 15); //phone body

    fill(173, 177, 218);
    rect(width/2, height/2, 148, 308, 15); //screen

    fill(60, 59, 59);
    rect(width/2, 48, 81, 11, 10); //notch

    fill(61, 60, 60);
    //ringer
    rect(239.5, 94, 1.6, 11.9);
    //volume up
    rect(239.5, 123, 1.6, 23.75);
    //volume down
    rect(239.5, 149, 1.6, 23.75);
    //lock
    rect(400.5, 137, 1.6, 38);

    fill(88, 89, 91); //Notch Hardware (Left to Right)

    ellipseMode(CENTER);
    ellipse(289, 48, 5);
    ellipse(301, 48, 5);

    rect(width/2, 48, 19, 3.56, 40);

    ellipse(340, 48, 5);

    pop();

    makeSnapchat();
    makeMusic();
    makeInsta();
    makeClock();
}

function makeSnapchat() {
    push();
    translate(width/2 - 60, 85);
    scale(0.25);
    image(snapPic, 0, 0);
    pop();
}

function makeMusic() {
    push();
    translate(width/2 + 6, 85);
    scale(0.25);
    image(musicPic, 0, 0);
    pop();
}

function makeInsta() {
    push();
    translate(width/2 - 60, 160);
    scale(0.25);
    image(instaPic, 0, 0);
    pop();
}

function makeClock() {
    push();
    translate(width/2 + 6, 160);
    scale(0.25);
    image(clockBG, 0, 0);
    pop();

    push();
    translate(width/2 + 1.5, 155);
    scale(0.15);
    clockMove();
    pop(); //close the transform around the clock arcs
}

function clockMove() {
    angleMode(DEGREES);

    var h = hour();
    var m = minute();
    var s = second();

    push();
    noFill();
    translate(200, 200);
    rotate(-90); //start the arcs at 12 o'clock

    //hour arc
    strokeWeight(5);
    var hStart = map(h % 12, 0, 12, -90, 360);
    arc(0, 0, 220, 220, 0, hStart);

    //minute arc
    strokeWeight(4);
    var mStart = map(m, 0, 60, 0, 360);
    arc(0, 0, 240, 240, 0, mStart);

    //second arc
    stroke(0);
    strokeWeight(2);
    var sStart = map(s, 0, 60, 0, 360);
    arc(0, 0, 260, 260, 0, sStart);

    pop();
}