mjnewman Project-04, Section A

sketch

//Meredith Newman
//Section A
//mjnewman@andrew.cmu.edu
//Project-04-LineArt

function setup() {
    createCanvas(400, 300);
    background(135, 12, 3);
}

function draw() {
	//variables used to vary spacing between lines
	//for lines going from the top of the canvas to the right
	var x1StepSize = 10;
	var y1StepSize = 15;

	//for lines going from the right of the canvas to the bottom
	var x2StepSize = 24;
	var y2StepSize = 16;

	//for lines going from the bottom to the left of the canvas
	var x3StepSize = 22;
	var y3StepSize = 12;

	//for lines going from the left to the top
	var x4StepSize = 11;
	var y4StepSize = 12;

	for (var spacing = 0; spacing < 30; spacing++) {
		//as mouse moves across width of canvas, grayscale changes
		stroke(map(mouseX, 0, width, 0, 255));

		//equation for lines that go from top to right of canvas
		line(x1StepSize * spacing, 0, width, y1StepSize * spacing);

		//equation for lines that go from right to the bottom
		line(width, y2StepSize * spacing, (x2StepSize * -spacing) + width / 2, height);

		//equation for lines that go from bottom to the left
		line(x3StepSize * spacing, height, 0, y3StepSize * spacing);

		//equation for lines that go from left to the top
		line(0, (y4StepSize * spacing) + width / 2, (x4StepSize * -spacing) + width, 0);
	}
}
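The grayscale in the loop above comes from p5.js's map(), which linearly remaps a value from one range to another. As a plain-JavaScript sketch of that remapping (remap is a hypothetical helper name, not part of p5.js):

```javascript
// A sketch of the linear remapping p5.js performs in
// map(value, inMin, inMax, outMin, outMax); "remap" is a hypothetical name.
function remap(value, inMin, inMax, outMin, outMax) {
  // Normalize value to 0..1 within the input range, then scale to the output range.
  return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// A mouseX of 200 on a 400-pixel-wide canvas maps to mid-gray.
console.log(remap(200, 0, 400, 0, 255)); // 127.5
```

This is why the stroke sweeps smoothly from black to white as the mouse crosses the canvas.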

After creating three of the “curves,” my code started to remind me of Saul Bass’s opening credits for Vertigo. So I tried to place the fourth set of lines so that the curve echoed the eye that is synonymous with the Vertigo opening sequence. In addition, I set the background to a darker red that also echoes the opening sequence. The sequence is embedded below:

 

mecha-lookingoutwards-04

Patatap was created by designer Jono Brandel in collaboration with Lullatone and published on March 26, 2014. Described as a “portable animation and sound kit,” Patatap generates sounds created by the Lullatone team, Shawn and Yoshimi, with corresponding graphics at the push of a key. As both a graphic designer and a computer programmer, Jono wanted to create a program that introduced synesthesia, more specifically visual music, to creators.

What drew me to this project specifically was the precise way that sound and graphics are combined. When I was first introduced to this website a few years ago, I did not even consider the fact that it had to have been coded. With the knowledge I have now, I have even more respect for the designers and the project itself. I am also inspired by Jono’s ability to purposefully exemplify the concept of synesthesia through Patatap.

Bettina-Project04-SectionC

sketch

// yuchienc@andrew.cmu.edu
// Section C
// project 4--stringart

function setup() {
  createCanvas(400,300);
}

var bgColor = '#f8ada3';
var fuschia = '#d06870';
var lightBlue = '#86b5c6';
var ochre = '#d0b268';

function draw() {
  background(bgColor);
  fuschiaCurve(mouseX);
  lightBlueCurve(mouseX);
  ochreCurve(mouseX);
}

function fuschiaCurve(mouseX) {
  var x1=mouseX;
  var y1=0;
  var x2=10;
  var y2=300;
  var x3=250;
  var y3=0;
  var cmx=constrain(mouseX,75,500);
  for (var i = 0; i < 400; i += 20) {
    stroke(fuschia);
    line(x1,y1,x2,y2);
    line(x2,y2,x3,y3);
    x1+=.5;
    x2+=10;
    x3*=.8;
  }
}

function lightBlueCurve(mouseX) {
  var x1=10;
  var y1=300;
  var x2=250;
  var y2=0;
  var x3=x2+20;
  var y3=300;
  var cmx=constrain(mouseX,100,500);
  for (var i = 0; i < 300; i += 50) {
    stroke(lightBlue);
    line(x1,y1,x2,y2);
    line(x2,y2,x3,y3);
    x1+=20;
    x2+=cmx/25;
    x3=x2+20;
  }
}

function ochreCurve(mouseX) {
  var x1=150;
  var y1=300;
  var x2=300;
  var y2=0;
  var cmx=constrain(mouseX,100,500);
  for (var i = 0; i < 260; i += 20) {
    stroke(ochre);
    line(x1,y1,x2,y2);
    x1+=cmx*.05;
    x2+=cmx*.05;
  }
}

In my previous projects, I sketched a clear visual and then figured out the code to execute it. This time, I wanted to embrace the nature of code and draw something directly in p5 without sketching ideas beforehand. I used print statements to find the values of x and y at certain points in the loop so I could use those values in other curve functions. I spent the most time adjusting the increments to x and y so the composition came out balanced. Some process pictures are below.


I also added an interactive portion, building off of last week’s assignment, so the positions of certain coordinates varied with mouseX.
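The interactive portion relies on p5.js's constrain(), which clamps mouseX to a range before it feeds into the loop increments. A minimal plain-JavaScript sketch of that clamping (clamp is a hypothetical name standing in for constrain):

```javascript
// A sketch of p5.js's constrain(n, low, high), which clamps a value to a
// range; "clamp" is a hypothetical name for the same operation.
function clamp(n, low, high) {
  return Math.min(Math.max(n, low), high);
}

// With constrain(mouseX, 100, 500), mouse positions outside the range
// stop affecting the curve:
console.log(clamp(50, 100, 500));  // below the range, clamped up to 100
console.log(clamp(250, 100, 500)); // within the range, unchanged
```

Clamping this way keeps the curves from collapsing or flying off-canvas at extreme mouse positions.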

dchikows – Section C – Looking Outward – 04

Ryoji Ikeda is an extremely popular music composer in Japan. He focuses on the “essential characteristics of sound itself and that of visuals as light by means of both mathematical precision and mathematical aesthetics.” One of Ikeda’s works is an installation called The Transfinite. Accompanying the aggressive, static-like electronic music are barcode-like patterns that move across a screen 54 feet wide by 40 feet tall. I believe Ikeda created the music and visuals with loops: he must have created different sounds, turned them into functions, and called them in loops. I am drawn to his work because it is so all-encompassing. Just from a video you can tell that, in the world he creates, people are a mere speck.

The Transfinite

Ryoji Ikeda’s website

The Transfinite

Ziningy1- Section C – Looking Outward 4

Magenta is a project led by the Google Brain Team to explore how machines with computational abilities can generate music. Magenta encompasses two goals. It is first a research project to advance the state of the art in music, video, image, and text generation. Much has already been done with machine learning to understand content, for example speech recognition and translation. Unlike other artists who use computation to generate music, the songs from Magenta are generated solely by machine intelligence using machine learning models. Magenta’s first song is a simple melody primed with four notes: the algorithm is given four notes (C, C, G, G) to begin with, and it comes up with an original melody from there. I personally find it very impressive that artificial intelligence with deep learning models can already accomplish simple content generation and creativity, a stark contrast to the stereotype that machines are only capable of non-creative, systematic tasks.
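As a toy illustration only (not Magenta's actual deep learning model), the idea of priming a generator with four notes and letting it continue the melody can be sketched in plain JavaScript as a random walk over a scale:

```javascript
// Toy melody continuation: start from a primer (e.g. C, C, G, G) and keep
// stepping at most one scale degree up or down. This is a stand-in sketch,
// not how Magenta's learned model works.
var scale = ['C', 'D', 'E', 'F', 'G', 'A', 'B'];

function continueMelody(primer, length) {
  var melody = primer.slice();
  var index = scale.indexOf(primer[primer.length - 1]);
  for (var i = 0; i < length; i++) {
    // Move -1, 0, or +1 scale degrees, clamped to the scale's range.
    index = Math.min(scale.length - 1,
                     Math.max(0, index + Math.floor(Math.random() * 3) - 1));
    melody.push(scale[index]);
  }
  return melody;
}

console.log(continueMelody(['C', 'C', 'G', 'G'], 8));
```

Magenta replaces the random step with a trained model's predictions, which is what makes its continuations sound like music rather than wandering.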

Here is a demo from Magenta:

hdw – Looking Outwards 4

“77 Million Paintings” is a digital sound and art work by Brian Eno. It was made by compiling 296 of Eno’s past works into generative code that not only combines up to four of them visually at a time, but also generatively pairs music with the artwork. The title refers to the number of different combinations the code can produce: 77 million. The work is meant to highlight Eno’s experiments with light and generative music. Eno has also shown the work in art installations at various museums around the world. He was inspired by minimalist musicians such as Philip Glass and Steve Reich.

https://vimeo.com/638631
Example of some of his works.

Brian Eno’s work comes in a CD format with two discs: the first contains software that generates randomized music and images, and the second contains an interview.

Randomly generated images from his code.

rgroves – Looking Outwards 04

In the summer of 2016, Icelandic ambient rock band Sigur Ros livestreamed a 24-hour journey around Route One, Iceland’s beautiful coastal ring road. The whole video was accompanied by new music from the band; new music, that is, in collaboration with a generative music program called BRONZE. BRONZE was created in 2011 by Mike Grierson, a scientist at Goldsmith University, and musician Gwilym Gold. It takes a recorded piece of music and can infinitely regenerate it in unique transfigurations. As the original recording is played over and over on this platform, it is impossible to tell where each reiteration starts and ends, because segments may be played in a different order, instruments may be amplified or eliminated, some sections shortened and others elongated, and so on. The music becomes ephemeral; as Gold puts it, “the chances of hearing the same version of the track versus the chances of winning the lottery don’t even compare.”

This experiment worked extremely well with Sigur Ros’s ethereal sound. The entire spectacular 24-hour journey is available online, but here are the middle 9 hours!

And here is a link to an album by Gwilym Gold, which is available only on BRONZE; no permanently recorded version exists, so you truly can’t hear the music the same way twice. You do have to download a mobile app in order to listen.

http://bronzeformat.com/

Project 03 -Dynamic Drawing

sketch

function setup(){
    createCanvas(640,480);
}

function draw(){
    //as mouse moves to right, time in day advances in frame
    if (mouseX <= 160) {
        background(204,255,255); //early morning sky
    } else if (mouseX <= 320) {
        background(153,255,255); //midday sky
    } else if (mouseX <= 480) {
        background(0,0,153); //evening dusk
    } else {
        background(64,64,64); //nighttime
    }

    noStroke();
    fill(0,153,0);
    rect(0,300,640,200); //grass
    fill(225);
    rect(175,125,300,200); //hunt library base
    fill(205);
    rect(175,125,300,15); //horizontal bars
    rect(175,175,300,15);
    rect(175,225,300,15);
    rect(175,275,300,15);
    

    if (mouseX<=480){
        fill(200);
        rect(175,125,20,200); //vertical bars
        rect(215,125,20,200);
        rect(255,125,20,200);
        rect(295,125,20,200);
        rect(335,125,20,200);
        rect(375,125,20,200);
        rect(415,125,20,200);
        rect(455,125,20,200);
    }else{
        fill(204,0,0); //red
        rect(175,125,20,200);
        fill(255,128,0); //orange
        rect(215,125,20,200);
        fill(255,255,0); //yellow
        rect(255,125,20,200);
        fill(0,255,0); //green
        rect(295,125,20,200);
        fill(0,0,255); //blue
        rect(335,125,20,200);
        fill(0,255,255); //light blue
        rect(375,125,20,200);
        fill(102,0,204); //purple
        rect(415,125,20,200);
        fill(255,0,255); //pink
        rect(455,125,20,200);

    }

    fill(250);
    rect(300,250,50,75); //hunt door
    rect(287.5,250,75,5); //awning
    
    //this is the scottie doggo now

    fill(0);
    rect(341,375,150,55,25,25); //body
    ellipse(362,435,40,20); //left foot
    ellipse(470,435,40,20); //right foot
    rect(470,365,15,20,85,0,55,55); //tail
    ellipse(356,360,60,60); //head
    rect(315,365,50,25,5,5); //muzzle
    ellipse(330,365,30,20); //head-muzzle connector
    rect(315,385,5,10,25,5,0,5); //beard
    rect(320,385,5,10,25,5,0,5);
    rect(325,385,5,10,25,5,0,5);
    rect(330,385,5,10,25,5,0,5);
    triangle(355,327.5,385,327.5,375,355); //ear
    fill(255);
    ellipse(340,350,10,10); //eye
    fill(0);
    ellipse(337.5,347.5,5,5); //pupil
    fill(215);
    ellipse(317.5,362.5,10,7); //nose

    fill(0,102,204);
    ellipse(200,435,80,25); //food bowl
    rect(160,420,80,20,25,25);


    if(mouseY<300){
        fill(255);
        push();
        rotate(radians(40)); //left bone
        rect(400,200,25,10,10,10,10,10);
        ellipse(400,200,10,10);
        ellipse(400,210,10,10);
        ellipse(425,200,10,10);
        ellipse(425,210,10,10);
        pop();
    }else{
//bone 2
        fill(255);
        push();
        rotate(radians(40)); //left bone
        rect(400,200,25,10,10,10,10,10);
        ellipse(400,200,10,10);
        ellipse(400,210,10,10);
        ellipse(425,200,10,10);
        ellipse(425,210,10,10);
        pop();
        push();
        rotate(radians(-30)); //right bone
        rect(-30,470,25,10,10,10,10,10);
        ellipse(-30,470,10,10);
        ellipse(-30,480,10,10);
        ellipse(-5,470,10,10);
        ellipse(-5,480,10,10);
        pop();
    }
}
    

For this project I decided to let the mouse determine the time of day (as it is dragged to the right, it becomes night, shown by the darkening sky and how Hunt lights up), and as the mouse moves closer to the food bowl, the scottie dog gets more food. My process started with creating the simple drawing, then adding if statements to change the daylight as well as to feed the dog.
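The time-of-day logic above can be sketched as a single plain-JavaScript function (skyColor is a hypothetical helper, not part of the original sketch) that returns the background RGB for each quarter of the 640-pixel-wide canvas:

```javascript
// A sketch of the chained mouseX conditions as one function: each quarter
// of a 640px canvas maps to a sky color. "skyColor" is a hypothetical name.
function skyColor(mouseX) {
  if (mouseX <= 160) {
    return [204, 255, 255]; // early morning sky
  } else if (mouseX <= 320) {
    return [153, 255, 255]; // midday sky
  } else if (mouseX <= 480) {
    return [0, 0, 153];     // evening dusk
  }
  return [64, 64, 64];      // nighttime
}

console.log(skyColor(100)); // early morning color
```

Using `<=` on each boundary also guarantees every mouseX gets a color, so the background is always drawn.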

aranders-lookingoutwards-04

Sonic Pendulum is a soundscape created by Yuri Suzuki Design Studio and QOSMO in 2017. The artificial intelligence behind Sonic Pendulum uses the atmosphere around it to create harmonious sound. These sounds are created using speakers and pendulums; the pendulums let the Doppler effect help dictate the music. Every sound it creates is a response to what is around it, so the harmonies never repeat and continuously change. I admire this project for its interactive and mellifluous qualities. The environment can easily soothe a stressed person (I wish I could benefit from its atmosphere at this moment). The project uses a deep learning algorithm trained on compositions, whose output changes with the people and noises around it. The project embodies the artist’s idea of order emerging from chaos.

link