Lingfan Jiang – Looking Outwards 12

For this week’s Looking Outwards, I am going to write about two projects that I found inspiring for my final project.

“Inside” is one of my favorite games. It is a puzzle-platformer adventure game developed and published by Playdead in 2016. The player controls a boy in a dystopian world, solving environmental puzzles and avoiding death.


The second project that I found interesting is called “Minicade” by Chloe Varelidi, an indie game designer and developer. (year unknown) It is a mobile web app that makes it easy to create games with your friends while learning to code along the way. Each person can add a link to one or more games to a custom playlist and instantly play them as one massive game.

Comparing the two projects above, the most impressive aspect of “Inside” is that the game has no written clues telling you how to play, but because the controls are very simple, players are still able to enjoy it. As for “Minicade”, unlike typical arcade games, players are able to customize the games themselves, which lets people learn while playing.

As for the shortcomings of “Inside”, since there is no guidance system in the game, players who get stuck tend to rely on other people’s successful strategies, which can spoil the gaming experience. As for Minicade, since the number of games is still limited, players might tire of them quickly.

Yiran Xuan – Project 12 – Proposal

My project idea is to recreate the schoolyard clapping game “Shotgun” in JavaScript. The game consists of a series of rounds; within each round, two (or more) players face each other and simultaneously play one of three moves: “Shoot”, “Reload”, and “Shield”. Shielding blocks all incoming bullets and can be used in any round; reloading adds one bullet to the chamber but leaves the player vulnerable; shooting fires a bullet toward the (or an) opponent, but can only be used if the player already has at least one bullet in the chamber. The game ends when one player plays shoot while their target is reloading. This game has a luck element but is also strategic, with good players able to predict when their opponents will be left vulnerable. Another important element of the game is clapping, which helps ensure simultaneous play; typically, players clap their hands on their thighs twice before making their move, establishing a beat.
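The round rules above could be sketched roughly like this; the function name, the state shape, and the choice to treat an out-of-ammo shot as a losing illegal move are all my own assumptions, not a final design.

```javascript
// One round of "Shotgun" between two players a and b, each
// { move: "shoot" | "reload" | "shield", ammo: number }.
function resolveRound(a, b) {
  // Assumption: shooting with an empty chamber is an illegal move that loses.
  if (a.move === "shoot" && a.ammo < 1) return "b wins";
  if (b.move === "shoot" && b.ammo < 1) return "a wins";

  // Apply ammo effects of this round's moves.
  if (a.move === "reload") a.ammo += 1;
  if (b.move === "reload") b.ammo += 1;
  if (a.move === "shoot") a.ammo -= 1;
  if (b.move === "shoot") b.ammo -= 1;

  // A shot only lands if the target is reloading; a shield blocks it,
  // and two simultaneous shots cancel out.
  if (a.move === "shoot" && b.move === "reload") return "a wins";
  if (b.move === "shoot" && a.move === "reload") return "b wins";
  return "continue";
}
```

A draw loop would call this once per beat after collecting both players’ key presses.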

For this project, I intend to represent the two players as two animated dragons spitting fireballs at each other. Each player will have four separate keys: three for the moves and one for “clapping”. I will play a short sound at regular intervals to establish a beat, and players will need to press the “clap” key within a certain time frame or suffer consequences (like a rhythm game). I will need counters to keep track of timing across frames. I will also build a random move-generating AI. For most of the interface, I will draw assets outside the program and import them as images.
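The “clap within a certain time frame” check might look something like the sketch below; the beat period, the tolerance window, and the function name are placeholder assumptions.

```javascript
// Rhythm-game style timing check: was a key press close enough to a beat?
const BEAT_MS = 1000;   // assumed: one beat per second
const WINDOW_MS = 150;  // assumed: how far off-beat a clap may land

function clapOnBeat(pressTimeMs) {
  // Distance from the press to the nearest beat boundary.
  const offset = pressTimeMs % BEAT_MS;
  const dist = Math.min(offset, BEAT_MS - offset);
  return dist <= WINDOW_MS;
}
```

In p5.js, `pressTimeMs` could come from `millis()` inside `keyPressed()`.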


Yoo Jin Shin-Project-12-Proposal

For the final project, I would like to create some sort of interactive display or animation for the home page of my personal website. I saw one home screen that included an interactive color wheel in which each color block led to a different project when clicked. Another included an animation that told a short story of the portfolio owner’s background and works. The first idea would depend largely on mouse positions and images, while the second would be a sort of generative landscape with mostly nonrandom elements.

Sketches

Sophie Chen – Project 12 – Proposal

For my final project, I would like to create animations that react to movement and gestures in front of a camera. I study projection design in the School of Drama, so I’m really interested in the possibilities between a performer and live reactive animations. I hope to create a prototype that can possibly be expanded to a larger scale or used for live performances and installations in the future. I’m not entirely sure whether the final visual result will include the camera input of the users or consist of just the animations alone, but that’s something I will decide as the animations solidify. Since this project will receive its camera input through a webcam, it will likely rely on the viewer being in front of a solid-color background wearing solid-color clothing. The actual animation content will be abstract and simple, and may be created through turtle graphics.
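One simple way to turn webcam frames into a “movement” signal is frame differencing; the sketch below compares two grayscale frames and returns an average motion amount. The frame format (flat arrays of 0–255 values, as you might extract from p5’s `capture.pixels`) is an assumption for illustration.

```javascript
// Average absolute difference between two grayscale frames:
// 0 means nothing moved; larger values mean more motion.
function motionAmount(prevFrame, currFrame) {
  let total = 0;
  for (let i = 0; i < currFrame.length; i++) {
    total += Math.abs(currFrame[i] - prevFrame[i]);
  }
  return total / currFrame.length;
}
```

The animation could then scale its speed or size by this value each frame.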

rough sketch

Alessandra Fleck-Looking Outwards-12

For my final project I want to build an augmented reality application that uses a webcam to change one’s background setting. One project I found that integrates p5.js into an augmented reality platform is called Kalopsia.

The image above shows how flower animations are mapped onto a succulent.

Kalopsia uses scripts similar to those used for webcam facial recognition to project Japanese-inspired drawings. In this application, however, AR becomes more of a tool for sculpting and detailing.

The second project I looked at was created for a company called VariousWays and uses ar.js to create the augmented reality effect. The short video shows how an artist who wants to hand out a business card can include a brief example of their work without having to hand out a full portfolio.

The image above shows an object with the business card owner’s initials on it.

Overall, the purpose of both projects traces back to the desire to bring another dimension to how we perceive our surroundings. Kalopsia seeks to bring beauty to ordinary objects, whereas the augmented reality business cards seek to augment the impact of a card by bringing interactivity to it. Both projects add an interesting detail to an everyday object: something normally perceived in 2D becomes far more expressive with 3D elements. One aspect that I think could be further integrated into both projects is some form of connection from the AR image to a social media platform or other users, so that the AR doesn’t just work on one scale.

More Information:

Project Kalopsia (http://www.zeitgeistbot.com/blog/kalopsia-is-an-augmented-reality-interactive-generative-art-project/)

Business Card Augmented Reality (http://www.zeitgeistbot.com/blog/augmented-reality-business-card/)

Christine Chen-Project 12- Final Project Proposal

For my final project, I want to create a game built around mouse interaction. I will create a tree that constantly has objects falling from it: apples, golden apples, and bird poop. The user’s task is to catch the apples with a basket by controlling the mouse’s x position. If the user catches an apple, they get 1 point. If the user catches a bird poop, the game ends. If the user catches a golden apple, they get a bonus life that will save them from losing the game if they catch a bird poop later on. The background will be a generative one with birds flying around, and its colors will change according to the time of day when the user is playing. I am still thinking of other things to add to the game, such as a naughty kid who comes in every now and then to steal apples, or speeding up the velocity of the falling apples over time.
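The catch rules described above could be sketched as a small state-update function; the state object shape and helper name are my assumptions, not the final implementation.

```javascript
// Update game state when the basket catches a falling object.
// state: { score, lives, gameOver }
function handleCatch(state, objectType) {
  if (objectType === "apple") {
    state.score += 1;                       // regular apple: 1 point
  } else if (objectType === "goldenApple") {
    state.lives += 1;                       // golden apple: bonus life
  } else if (objectType === "birdPoop") {
    if (state.lives > 0) state.lives -= 1;  // a bonus life absorbs one poop
    else state.gameOver = true;             // otherwise the game ends
  }
  return state;
}
```

The draw loop would call this whenever a falling object overlaps the basket’s x range.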

Kai Zhang-Looking Outwards-12


For this week’s Looking Outwards, I’m going to show two examples of platform games that should be helpful as inspiration for the project. The first is called Limbo, a puzzle-platform video game developed by the independent studio Playdead, available on both mobile and PC platforms. In the game, the player guides an unnamed boy through dangerous environments and traps as he searches for his sister. The developer built the game’s puzzles expecting the player to fail before finding the correct solution. Playdead called the style of play “trial and death”, and used gruesome imagery for the boy’s deaths to steer the player away from unworkable solutions.

I really enjoyed playing this game, not only because of its unique gameplay experience, but also because of the aesthetic choices in the scenes, characters, and objects, which suit the game’s atmosphere very well. There is very limited use of color; in fact, there is hardly any color other than black and white in this game. All the objects are made up of silhouettes, which is very appealing to me.

Here is a YouTube video for the game trailer from the developer.


Another game is called Subway Surfers, an endless runner mobile game co-developed by Kiloo and SYBO Games, private companies based in Denmark, which is available on all mobile platforms and PC. Players take the role of young graffiti artists who, upon being caught in the act of applying graffiti to a metro railway site, run down railroad tracks to escape from an inspector and his dog. As they run, they grab gold coins out of the air while simultaneously dodging collisions with trains and other objects, and can also jump on top of the trains to evade capture. Special events, such as the Weekly Hunt, can result in in-game rewards and characters.


Here’s a gameplay video on YouTube.

From the gameplay, we can see the player navigates the runner using mouse positions, while the mobile version uses swiping gestures. As people develop games, they have to think about the best way for players to control their characters: mouse, controller, keyboard, or fingers. So I believe we should think about how we can use different kinds of interaction techniques so the game works best on its platform.

The game also comes with a reward system accompanied by sound effects, which provide positive feedback as people play. Perhaps we can also incorporate that into the game we are going to develop.


Project 12-Project Proposal-Veronica

For the final project, I want to create an animation with a bit of interaction and sound effects. In my studio I am currently working on a thesis project dealing with the environment and the cohabitation/negotiation of boundaries between human and more-than-human species, creating installations for endangered as well as synanthropic species to share urban space. I am inspired by the works of Joyce Hwang and Sarah Gunawan, and I want to create an animation that documents the vision of such installation projects and their impacts on our environment.

I have identified the green spaces along the MLK Jr. East Busway in the Hill District, Pittsburgh, and I want to focus my animation on one scene and 3 different species of animals: bats, black-throated warblers, and raccoons. The animation will show buses and trains passing by at certain intervals, and a key press will add an installation of an artificial “nest” for a certain type of animal. Initially the background sound will be city noises; as more birds are added, the bird sounds will get louder and the city noises quieter. Below is a rough sketch of my ideas.
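The crossfade between city noise and bird song could be driven by a simple function of the bird count; the function name and the cap of 10 birds are assumptions for this sketch.

```javascript
// Map the number of birds on screen to two volume levels in [0, 1].
// At 0 birds the city noise dominates; at 10+ birds only birds remain.
function soundLevels(birdCount) {
  const t = Math.min(birdCount, 10) / 10; // 0 = all city, 1 = all birds
  return { birdVolume: t, cityVolume: 1 - t };
}
```

With p5.sound, these values could be passed each frame to `setVolume()` on the two looping sound files.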


Lingfan Jiang—Project 12—Proposal

For the final project, Kai Zhang and I are planning to build a web game resembling Flappy Bird, the game that was popular on mobile platforms in 2014. The only thing the player needs to do is tap the screen so that the bird flaps its wings and ascends, avoiding falling and obstacles.


The idea of the game will be similar, but the style of the graphics will definitely be more original. Also, in addition to the original game, we are thinking about adding more mechanics. For example, the bird will be able to spit out bullets to destroy the closest obstacle, but the bullets will be limited to a certain number.
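The core flap-and-fall movement could be sketched as a per-frame physics step; the gravity and flap values are placeholders we would tune, not final numbers.

```javascript
// Per-frame update for the bird. y grows downward, as on a canvas,
// so a flap sets a negative (upward) velocity and gravity pulls down.
const GRAVITY = 0.5;       // assumed: added to vy each frame
const FLAP_VELOCITY = -8;  // assumed: instant upward kick on tap

function stepBird(bird, flapped) {
  // bird: { y, vy }
  if (flapped) bird.vy = FLAP_VELOCITY;
  bird.vy += GRAVITY;
  bird.y += bird.vy;
  return bird;
}
```

Calling this from `draw()` with `flapped` set by a mouse press would give the familiar arc-shaped motion.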

Since we decided to do a collaborative project, one person might focus on the generative background and the scoring system, while the other focuses on the general movement of the bird. More detailed plans for collaboration will be described in the next post.

Yingyang Zhou-LookingOutwards-12

I looked specifically for visual presentations of sound (audiovisual art), which I’ll design for my final project. I found many interesting artworks in the generative art realm. One that I appreciated is the work of Leander Herzog.

For this project, sound is played automatically, but you can interact with it too: as you click, a different rhythm of the same sound effect is played. It is amazing because it makes you wonder about the connection between the sound and the visual effect.

https://leanderherzog.ch/bdakfgi/

https://leanderherzog.ch/rain/

This project is not related to sound but is still interesting to me because of the strong effect of the perspective changing as your mouse moves; a similar element could possibly be applied to my final project.

https://leanderherzog.ch/shader/#

Another artist that interest me is Tina Frank.

http://www.tinafrank.net/audiovisual-art/vergence/

The description of this project:

“This 6,5-minute long video of Tina Frank focuses on the threshold of spatial perception. Like a chromographic pendulum yellow-black patterns contract, unfold and overlap. They evoke rapid speed mementos of Brion Gysin’s Dreamachines aswell as Tony Conrad’s The Flicker or of Gestalt Theory from the early 20th century. After an induction period of some minutes the viewer can no longer tell if what he sees are afterimages from the color space or if these psychedelic visions are part of the videosequence.

This experience is intensified by the four-channel-soundtrack from Florian Hecker. Dynamic pulsating rhythms bring narrative cartesian coordinates from front, back, left and right into a permanent oscillation. Binaural stereophonic and quadrophonic arrangements add up to an acoustic whole which consolidates a timebased déjà vu together with an acoustic déjà entendu.”

I’m interested in the layers of sound it displays, and yet the visual is simple but enough to show its relation to the sound.