Test Post

Here is a simple program.

function setup() {
    createCanvas(300, 300);
    background(220);
//    text("p5.js vers 0.9.0 test.", 10, 15);
}

function draw() {
    if (mouseX < width / 2) {            // left side
        if (mouseY < height / 2) {       // top half
            background(255, 0, 0);
        }
        else {                           // bottom half
            background(0, 255, 0);
        }
    }
    else {                               // right side
        if (mouseY < height / 2) {       // top half
            background(0, 0, 255);
        }
        else {                           // bottom half
            background(0, 0, 0);
        }
    }
}

“Masquerade”

Before taking 15-104 at Carnegie Mellon University as a first-year student, I knew about one interactive and computational project: "Masquerade," better known online as MSQRD. MSQRD was once one of the most famous and most in-demand mobile apps, and it served as a foundation for many of the face filters that people now use on FaceTime or while taking a selfie or video. The app was developed entirely by a Belarusian company, Masquerade Technologies. My family is from Belarus, and I was born and raised in that small but beautiful country, so to think that face-filter technology was developed there is simply astonishing, since not everyone even knows that Belarus exists. What I admire most about this project is its creators. Belarus has a complicated and expensive educational system; many people can't pay for it or can't keep up with academics, so much of the country works in physical labor rather than in offices. The creators of MSQRD, however, were ambitious young men who wanted a better life. In 2015 the company participated in the Garage48 Minsk hackathon, and it took only a bit over 48 hours to develop MSQRD for iOS! The key creators were Sergey Gonchar, Eugene Zatepyakin, and Eugene Nevgen; as of January 2016 the team consisted of 11 members in total. Even though the team was rather small, that did not stop it from taking over the App Store: in February 2016 MSQRD became the 9th most popular app, and in March of the same year it was acquired by Facebook.

This face-filter project combined artificial intelligence and computer vision, which pointed to many future possibilities: similar face-recognition techniques later appeared in technologies such as FaceID, introduced on the iPhone in 2017. MSQRD is a great app. It inspires people to create, to develop, and to code no matter what their background is or where they are from: coding is available for everyone!

Project 1: Self Portrait

function setup() {
    createCanvas(400, 600);
    background(0, 186, 247);
}

function draw() {
    // hair: two brown arcs behind the face
    fill(70, 44, 26);
    arc(200, 150, 200, 200, PI, TWO_PI);
    arc(200, 125, 200, 200, PI, TWO_PI);

    // face
    fill(233, 168, 139);
    ellipse(200, 150, 175, 175);

    // hair across the forehead, then eyebrows
    stroke(70, 44, 26);
    fill(70, 44, 26);
    rect(125, 60, 150, 30);
    rect(225, 120, 22.5, 5);
    rect(152, 120, 22.5, 5);

    // eyes: whites, irises, pupils
    fill(255, 255, 255);
    ellipse(165, 140, 17.5, 17.5);
    ellipse(235, 140, 17.5, 17.5);
    fill(35, 163, 102);
    ellipse(235, 140, 10, 10);
    ellipse(165, 140, 10, 10);
    fill(0);
    ellipse(165, 140, 5, 5);
    ellipse(235, 140, 5, 5);

    // mouth with teeth
    fill(239, 139, 129);
    arc(200, 200, 40, 40, TWO_PI, PI);
    fill(0);
    arc(200, 202.5, 30, 30, TWO_PI, PI);
    fill(241, 215, 212);
    arc(200, 202.5, 10, 10, TWO_PI, PI);
    arc(210, 202.5, 10, 10, TWO_PI, PI);
    arc(190, 202.5, 10, 10, TWO_PI, PI);

    // nose and nostrils
    fill(255, 175, 145);
    arc(200, 175, 15, 30, PI, TWO_PI);
    fill(0);
    arc(197.5, 175, 4, 4, PI, TWO_PI);
    arc(202.5, 175, 4, 4, PI, TWO_PI);

    // shoulders at the bottom of the canvas
    fill(46, 56, 66);
    arc(200, 637.5, 200, 800, PI, TWO_PI);
}

I found this self-portrait very challenging, particularly creating hair that looked even remotely human, but I had a lot of fun trying to draw myself.

LO: My Inspiration

The interactive and computational piece of art that inspires me is the video game Fallout: New Vegas.

A screenshot featuring Dinky the Dinosaur, the main attraction of a location in the game called Novac.

The game was created by Obsidian Entertainment, with Joshua Sawyer as game director, Inon Zur as composer, and John Gonzalez as head writer, among roughly 100 other developers. The development team had 18 months and built the game on the widely used Gamebryo engine, an out-of-house engine that they also extended with their own code.

The development team also took a lot of story elements from Project Van Buren (which was meant to be another Fallout game before Black Isle, the studio making it, closed down), along with the series staple of 1950s Cold War aesthetics.

It released to favorable reviews but with a myriad of bugs. After the bug fixes and DLC came out, however, the game slowly became a favorite in the gaming community.

I consider it to be the best game in the Fallout franchise, the best RPG ever made, and my favorite video game.

I find it inspirational because it shows how open-ended you can be when it comes to making computational and interactive pieces of art.

This can be seen in the choices it gives you as you participate in the story and gameplay.

For example, I can be a total pacifist in the game, never hurting any non-playable character, but I can also pick choices that result in the worst possible ending for everyone. I can do the reverse of this and everything in between.

I admire that it takes the interactivity afforded by its nature as computational art to the very extreme. The game has also inspired me to write and create stories of my own, and potentially to create video games and interactive computational art like it.

As for future possibilities, the game was successful, so other Fallout games have been made since. Furthermore, the modding community around this specific title is very active.

Finally, here are the links to the game's wiki pages: one is Wikipedia proper and the other is written by fans of the game.

Project 1: My Self Portrait

function setup() {
    createCanvas(600, 600);
    background(220);
}

function draw() {
    noStroke();

    // hair: brown circles framing the head
    fill(125, 86, 41);
    circle(100, 400, 200);
    circle(500, 400, 200);
    circle(120, 300, 200);
    circle(480, 300, 200);
    circle(180, 200, 200);
    circle(420, 200, 200);

    // face
    fill(247, 228, 205);
    ellipse(300, 300, 300, 400);

    // more hair on top and along the sides of the face
    fill(125, 86, 41);
    circle(250, 160, 200);
    circle(350, 160, 200);
    ellipse(150, 300, 100, 200);
    ellipse(440, 300, 100, 200);
    circle(200, 250, 100);
    circle(400, 250, 100);

    // eyes and mouth
    fill(0, 0, 0);
    ellipse(225, 350, 25, 50);
    ellipse(375, 350, 25, 50);
    circle(300, 400, 100);

    // skin-colored square covers the top of the mouth circle to form a smile
    fill(247, 228, 205);
    square(250, 300, 100);
}

I’m really impressed with some of the portraits you guys made. Mine looks so lame in comparison lol

LO: My inspiration

The project I have chosen was created by a fellow CMU student a few years ago. She composed a musical piece that was performed by an ensemble, but the composition was not like a typical music piece. Her medium involved a tub of water and multiple colors of ink. The ink was dropped by the composer into the water and projected onto a screen for the ensemble to read. The musicians used the projected image as their score, improvising tone and dynamics to match what they saw. I admire this project for its unique use of improvisation and its unconventional style of music notation. Throughout history, artists have tried to notate their scores in exquisite detail so as to allow for replication. I like how this art piece goes in the complete opposite direction; it would be impossible to replicate the performance, no matter how clearly the directions are given. The project was just one part of a larger art exhibit put together by a few CMU classes with the guidance of their professors. Creating this project required an ensemble of musicians and a unique projection setup for the ink and water. The goal of the project was to think outside the box of traditional music performance and composition, and I believe the project achieved this goal. The experimental nature of this piece is exciting and visually appealing, adding a new layer to the music.

website: Subsurface: Site-Specific Sight and Sound 2018, Shambhavi Mishra

Subsurface: Site-Specific Sight and Sound, 2018. Shambhavi uses colorful ink and water as her musical composition and notation.


LO: My Inspiration

Genshin Impact is an incredibly popular RPG that can be played for free on your phone. I love that I can explore its large and beautiful environments even on mobile. Often, video games of this scale are not available on a phone because phones don't have the processing power or there isn't a good enough market. Since I do not own many consoles and my laptop is not good at running games, I love that I am able to experience a detailed game like Genshin. I know that Genshin Impact is developed by a fairly large team, and they release a new update every 6 weeks. These updates involve new playable characters, new places to explore, and frequent events to earn rare items. The Genshin team works hard to produce updates on time and without bugs, which I really appreciate. I have no clue what software is used for Genshin, but it's something they've been able to optimize for multiple platforms, which is super cool. Many speculate that Genshin Impact was inspired by Breath of the Wild; both have a similar stamina mechanic for climbing cliffs and gliders for traveling through the air. I hope that more high-quality games are released for mobile so that I can play them 🙂
https://genshin.mihoyo.com/en/home


Looking Outwards – 01

I am focusing on the work of Hailei Wang, a coder who makes paintings using Python. What I admire most about Wang's art is how she incorporates randomness into her code to create sets of works that are each unique but share the same style and color palette, which makes for really nice collections. Wang does the art as a side project outside of her regular work. She does it alone and uses regular Python code. Wang was inspired by earlier algorithmic art, specifically the Cornu curve, which creates wave patterns. Wang treats this as a passion project, but she sees the potential for generative art to create works of sculpture, fashion, and architecture. A rough sketch of this kind of controlled randomness is included below.

Hailei Wang’s work
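
I don't have Wang's actual Python source, so purely as a rough sketch of the general idea, here is a small p5.js program that keeps a fixed color palette and one Cornu-style curve family while letting random values change the composition on every run (the palette, curve count, and parameter ranges are my own assumptions, not hers):

// Not Wang's code -- just a rough sketch of controlled randomness:
// a fixed palette and one curve family, with random placement each run.
let palette;

function setup() {
    createCanvas(400, 400);
    background(250);
    noFill();
    strokeWeight(1.5);
    // the palette stays the same from run to run, so the "style" is stable
    palette = [color(214, 64, 69), color(240, 179, 90), color(70, 110, 160)];

    // draw a handful of Euler-spiral-like strokes with random parameters
    for (let i = 0; i < 12; i++) {
        stroke(random(palette));
        drawSpiral(random(width), random(height), random(0.00002, 0.0001), random(TWO_PI));
    }
}

// curvature grows with arc length, loosely like a Cornu (Euler) spiral
function drawSpiral(x, y, k, heading) {
    beginShape();
    for (let s = 0; s < 600; s++) {
        vertex(x, y);
        heading += k * s; // turn a little more with every step
        x += cos(heading);
        y += sin(heading);
    }
    endShape();
}

Refreshing the sketch gives a different composition every time, but the shared palette and curve family keep the results feeling like one collection, which is roughly the quality I admire in her work.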

LO-1

A piece of computational art that inspired me to explore the intersection of art and technology is John Maeda's "Reactive Books." These 'books', which he published over the course of several years in the 90s, are pieces of computer software that each react to a different type of input. The first one he published, "Reactive Squares", consists of several black squares that react in different ways to live input from the microphone. I really like the idea of playing with the computer's 'senses' to output media that we can perceive with our own senses. As far as I know, Maeda was the only one working on the original books, and he developed custom software for each of them. After studying computer science at MIT, Maeda went to study at the Tsukuba University Institute of Art and Design in Japan, where he experimented with traditional bookmaking using his knowledge of software engineering. This marriage of very traditional techniques and styles with modern, responsive software engineering results in really intriguing and thought-provoking pieces. A rough p5.js approximation of the reactive-squares idea is sketched below.

John Maeda, Reactive Books, 1994-1999
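
Maeda wrote his own custom software, so the following is not his code; it is only a rough p5.js approximation of the reactive-squares idea, and it assumes the p5.sound library is loaded and the browser grants microphone access:

// Not Maeda's software -- a rough p5.js + p5.sound analogue of squares
// reacting to live microphone input (assumes p5.sound is loaded).
let mic;

function setup() {
    createCanvas(400, 200);
    noStroke();
    mic = new p5.AudioIn(); // listens to the default microphone
    mic.start();
}

function draw() {
    background(255);
    let vol = mic.getLevel(); // current amplitude, roughly 0 to 1
    fill(0);
    // three squares grow with the input level, each a bit more than the last
    for (let i = 0; i < 3; i++) {
        let s = 20 + vol * 300 * (i + 1);
        rect(80 + i * 140 - s / 2, height / 2 - s / 2, s, s);
    }
}

function mousePressed() {
    userStartAudio(); // most browsers only start audio after a user gesture
}

Clapping or speaking makes the squares jump in size, which captures at least a little of the "computer senses" quality that I like in the originals.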

LO: Animation

I'm really interested in animation, so I've been really inspired by shows like Love Death + Robots for their many different styles of animation. Each episode consists of a new story as well as a completely new animation style. This aspect is intriguing because I think it's more unique and more difficult to create: the teams can't reuse characters and digital assets across multiple episodes, unlike a show like Family Guy that has been going on for over a decade. Each episode is actually created by a different crew, and the crews come from many different countries. There are 26 episodes and thus 26 different teams working to create their own unique product. I'm not sure exactly how long the episodes took to make, but they are pretty short, usually under 20 minutes, so I think each episode was maybe a month or two in animation after the script was finalized. I think a majority of the teams used off-the-shelf software to create their episodes, but groups with a more specific animation style might have had to create or adapt software to suit it. Each creative team draws inspiration from its own background and experiences, which really shines through when the episodes are put side by side. I don't think the show is planning to go further, but I'm sure each team has been excited to make the episodes, as they're short and can be really well thought out.

An episode from Love Death + Robots. The characters work really well in their environments, and the animators do a good job of bringing them to life.

Link: Creators: Tim Miller, David Fincher