LookingOutwards-10

There isn’t a specific piece of computational music tied to this launchpad, but I chose it because it is a new method (at least for me) of creating music with technology. People can compose and edit music using a launchpad where each pad has its own volume and pitch. Depending on how the user sets up the mode, a pad can serve as the bass or carry the main melody. Even though this video is not creating original music, people can edit or build a chorus that corresponds to the song. I also find it fascinating how music can be mapped onto those pads and edited or created on the spot by the composer.

LO-10

For this week’s looking outwards, I took a look at KNBC by Casey Reas. It takes news broadcast footage and translates it into a collage of pixels, which is then projected onto a wall. I found it a really interesting, meta look at the way we consume information so quickly and abundantly in our day-to-day lives. The work was done in Processing, of which Casey Reas is a co-founder.

Interestingly, a clear narrative is still presented in the work: even when the information has been distorted beyond what is cognitively recognisable, we can still see one story end and transition into another altogether.

Overall, I really enjoyed the visual aesthetics of the piece, and how sound plays a large role not only in its presentation but also in how we come to interpret and understand information, which is a core piece of the artwork’s intent.

Project 10 – Sonic Story

My story features my cat and my dog exchanging noisy toys. I just thought it was hilarious to clip these pics of their heads out and animate them. I think the childish way everything was drawn/pieced together adds to the silliness of the story.

sketch
//TLOURIE
//SECTION D
//STORYLINE: GIRLCAT AND BRIDIE MEET AND EXCHANGE TOYS WITH ONE ANOTHER. THEN THEY PLAY WITH THE TOYS
var count = 0;

var girlcat;   //images
var bridie;
var maraca;
var ball;

var livingroom;

var squeak;    //sounds
var maracashake;
var meow;
var bark;


var bridieX = 520;
var bridieY = 200;

var ballX = 595;
var ballY = 290;

var maracaX = 150;
var maracaY = 300;

function preload() {
    // call loadImage() and loadSound() for all media files here
    girlcat = loadImage("https://i.imgur.com/Epqj4LE.png");
    bridie = loadImage("https://i.imgur.com/i0HfI2s.png?1");
    maraca = loadImage("https://i.imgur.com/tp9MlSK.png");
    ball = loadImage("https://i.imgur.com/YUkpldR.png");

    livingroom = loadImage("https://i.imgur.com/1omFhoF.jpg");


    squeak = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/squeak.mp3");
    maracashake = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/maracasingle.wav");
    meow = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/singlemeow.wav");
    bark = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/singlebark.wav");

}


function setup() {

    createCanvas(600, 400);

    useSound();
    frameRate(2);

}


function soundSetup() {
    squeak.setVolume(0.5);
    maracashake.setVolume(0.5);
    meow.setVolume(0.5);
    bark.setVolume(0.5);
    
}


function draw() {
    // play a sound cue when the story reaches specific frames
    switch (count) {
        case 7: bark.play(); break;
        case 16: meow.play(); break;
        case 25: bark.play(); break;
        case 34: meow.play(); break;
        case 43: squeak.play(); break;
        case 47: maracashake.play(); break;
        case 48: maracashake.play(); break;
        
    }

    background(200);
    // scale the images to fit the scene
    livingroom.resize(width, height);
    girlcat.resize(175, 175);
    bridie.resize(150, 150);
    maraca.resize(150, 75);
    ball.resize(75, 75);


    image(livingroom, 0, 0);
    image(girlcat, 70, 200);
    image(bridie, bridieX, bridieY);
    image(maraca, maracaX, maracaY);
    image(ball, ballX, ballY);

    // Bridie and her ball walk in from the right
    if (count < 6){
        bridieX -= 25;
        ballX -=25;
        //bridieY += 20;

    }

    // Bridie's speech bubble
    if (count > 6 & count < 15){
        fill(255);
        noStroke();
        textSize(15);
        ellipse(300, 175, 300, 75);
        triangle(300, 200, 350, 200, 375, 250);
        fill(0);
        
        text('hey girlcat, is that my maraca?', 200, 175);
    }
    // Girlcat's reply bubble
    if (count > 15 & count < 24){
        fill(255);
        noStroke();
        textSize(15);
        ellipse(300, 175, 300, 75);
        triangle(270, 200, 330, 200, 225, 250);
        fill(0);
        
        text('yeah, it is. is that my squeaky ball?', 200, 175);
    }

    // Bridie offers a trade
    if (count > 24 & count < 33){
        fill(255);
        noStroke();
        textSize(15);
        ellipse(300, 175, 300, 75);
        triangle(300, 200, 350, 200, 375, 250);
        fill(0);
        
        text('it sure is. wanna trade?', 200, 175);
    }
    // Girlcat agrees
    if (count > 33 & count < 37){
        fill(255);
        noStroke();
        textSize(15);
        ellipse(300, 175, 300, 75);
        triangle(270, 200, 330, 200, 225, 250);
        fill(0);
        
        text('alright', 200, 175);
    }
    // the toys slide across the room to their new owners
    if (count >= 37 & count < 42){
        ballX -= 50;
        maracaX += 50;

    }
    // the ball bounces and squeaks, then the maraca lifts and shakes
    if (count == 42) {
        ballY += 25;
    }
    if (count == 43) {
        ballY -=25;
    }
    if (count == 46) {
        maracaY += 25;
    }
    if (count == 47) {
        maracaY -=25;
    }

    count++;

}

LO – 10

Pazhutan Ateliers is a computational music education and production project by duo M. Pazhutan and H. Haq Pazhutan. The course topics listed on the website include (but are not limited to) electronic/computational music, music appreciation, and sound art.

The particular project I looked at was “Cy-Ens,” short for cybernetic ensemble. To quote the project page, “Cy-Ens is our computer music project with the ambition of expanding the potentials of understanding the aesthetics of computational sound and appreciation of logic, math and art.” The album consists of 15 to 30 minute tracks of ambient, computer-generated noise. The creation of the work involved open-source audio software and programming languages, as well as various physical MIDI controllers such as knobs, sliders, and percussion pads. The concept of the project is to create abstract sound compositions by allowing them to arise from mathematical patterns.
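To get a feel for what “sound arising from mathematical patterns” can mean in practice, here is a minimal p5.js sketch (not code from Cy-Ens, just my own illustration) that uses the p5.sound library and drives an oscillator’s pitch with a sine function of frameCount:

var osc;                 // a p5.Oscillator from the p5.sound library
var playing = false;

function setup() {
    createCanvas(400, 100);
    osc = new p5.Oscillator('sine');
}

function mousePressed() {
    // browsers require a user gesture before audio can start
    if (!playing) {
        userStartAudio();
        osc.start();
        osc.amp(0.2);
        playing = true;
    }
}

function draw() {
    background(220);
    // the "mathematical pattern": a slow sine of frameCount mapped onto 220-440 Hz
    var f = 330 + 110 * sin(frameCount * 0.02);
    osc.freq(f);
    // draw the same curve so the pattern is visible as well as audible
    fill(0);
    circle(frameCount % width, map(f, 220, 440, height - 10, 10), 4);
}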

Project 10: Visual Story – Home Before The Storm

For my sonic visual story, my goal was to create a scene out of simple shapes that I was familiar with and could easily control in code, and to build a story with image and sound.

Visuals

I did some research online about how artists create different scenes in p5.js (not with sound, only with visuals) so I could start thinking about the sequence of my story. I made simple sketches and planned which parts were static and which were movable.

That helped me break my code into smaller functions, such as sun(), and keep the code as simple to comprehend as possible.
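As an illustration, here is a minimal sketch of that structure (the shapes and function names are placeholders, not my actual scene): each static or movable part of the scene lives in its own small function, and the movable part takes a parameter so it can change over time.

function setup() {
    createCanvas(480, 360);
}

function draw() {
    background(135, 206, 235);   // sky
    sun();
    house();
    man(frameCount);             // the movable part is driven by frameCount
}

function sun() {
    noStroke();
    fill(255, 204, 0);
    circle(400, 60, 80);
}

function house() {
    fill(180, 80, 60);
    rect(300, 220, 120, 100);
    triangle(290, 220, 430, 220, 360, 160);
}

function man(step) {
    fill(40);
    circle(step % width, 300, 20);          // head
    rect(step % width - 5, 310, 10, 40);    // body
}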

Story:

The general story is about a man trying to reach home before it starts raining. As the cloud cover increases, he rushes home and rings the doorbell.

Sounds

I looked for short sound clips, no more than three seconds long, and successfully set up a local server and preloaded them into the code.
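Here is a hedged sketch of how that loading works; the clip names and the local-server address are placeholders, and useSound() / soundSetup() come from the course sound template:

var doorbell;   // placeholder clip names
var rain;

function preload() {
    // the clips are served from a local server, so these URLs are placeholders
    doorbell = loadSound("http://localhost:8000/doorbell.wav");
    rain = loadSound("http://localhost:8000/rain.wav");
}

function setup() {
    createCanvas(480, 360);
    frameRate(10);
    useSound();          // course template helper that enables p5.sound
}

function soundSetup() {  // called by the course template once audio is ready
    doorbell.setVolume(0.5);
    rain.setVolume(0.5);
}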

Towards the final image

Initially, I was triggering all of the motion and sound simultaneously, meaning everything started at the same frameCount. Eventually I used my sketches to plan the story and have objects appear one after another. I also reused the walking man from last week’s assignment, which taught me how to manage the variables of several objects and shapes without runtime errors. It was helpful to revisit, tweak, and recontextualise the old code!
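Here is a small, self-contained sketch of that staggering idea (the shapes and frame thresholds are placeholders): each stage of the story is gated on frameCount, so things appear one after another instead of all at once.

function setup() {
    createCanvas(480, 360);
    frameRate(10);
}

function draw() {
    background(135, 206, 235);
    // stage 1: the sun is there from the start
    fill(255, 204, 0);
    circle(400, 60, 80);
    // stage 2: a cloud drifts in after 3 seconds (30 frames at 10 fps)
    if (frameCount > 30) {
        fill(255);
        ellipse(min(frameCount, 200), 80, 120, 50);
    }
    // stage 3: the man starts walking toward the house after 6 seconds
    if (frameCount > 60) {
        fill(40);
        circle(min(frameCount - 60, 300), 300, 20);
    }
}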

Process Video

In the initial version, my project was very noisy and glitchy, but after several iterations, I was able to improve it significantly.

Mid-Process Trial Video:

Final Video after making edits:

Note: I have not embedded the code because the sounds are on a local server and I wasn’t able to load them into the p5.js web editor.

LO-10: 1935 by Florian Hecker

“What do machines hear that humans cannot?”

For this week’s Looking Outwards, I looked up several artists and composers who work with sound as part of their artwork. For all of the artists I came across, sound is a material to experiment with and a means of artistic expression. Several sound artists, such as Rie Nakajima, build objects as part of their installations that create sound effects the audience can experience. Sound artists design sound for specific spaces and environments, which determines the quality and style of the sound art.

While reflecting on the differences between electronic music and sound art, I learnt that many sound artists use environments and physical objects to make sound, but computer music is unique in that sense. I researched the computer musician Florian Hecker, specifically his work 1935. I chose this work because the overall effect of the sound does justice to the medium it is created in.

1935 by Florian Hecker:

The final project is a soundscape that varies in modulation as different data is used as input.

The video’s description explains how the sound itself embodies the listening behavior of machines. Hecker shows how sounds generated from different inputs to the computer can reveal measures of abstraction and scales of resolution. He also tends to personify machines, asking, “What do machines hear that humans cannot?” In doing so he truly creates the effect of a different, non-human type of listening, suggesting that computers listen differently than we do.

Florian Hecker is a media artist affiliated with the Edinburgh College of Art and the MIT art program, and he shows his work and installations at leading art galleries.

Blog link: http://florianhecker.blogspot.com/

Project 10 – Sound Story

sketch
// Storyline: 4 fruits (an orange, a banana, a red apple, and a green apple)
    // are inside a fruitbowl together and the fruit keep disappearing!!
    // I'm still having trouble with setting up a local server
    // Right now these sounds are attached to the web server chrome method
    // But it doesn't seem to be working and I don't know why
var x; //x position of the fruits
var y; //y position of the fruits
var dx = 4; //starting speed the fruits fly away at for x
var dy = 4; //starting speed the fruits fly away at for y
var orangeScream;
var bananaScream;
var redAppleScream;
var greenAppleScream;

function preload() {
    orangeScream = loadSound ("http://127.0.0.1:8887/orangeScream.wav");
    bananaScream = loadSound ("http://127.0.0.1:8887/bananaScream.wav");
    redAppleScream = loadSound ("http://127.0.0.1:8887/redAppleScream.wav");
    greenAppleScream = loadSound ("http://127.0.0.1:8887/greenAppleScream.wav");
}


function setup() {
    createCanvas(500, 300);
    frameRate(1);
    useSound();
}


function soundSetup() { 
    orangeScream.setVolume(0.25);
    bananaScream.setVolume(0.25);
    redAppleScream.setVolume(0.25);
    greenAppleScream.setVolume(0.25);
}

function orange(x, y) {
    fill(255, 137, 0);
    noStroke();
    circle(x, y, 50);
}

function banana(x, y) {
    fill(255, 255, 0);
    noStroke();
    beginShape();
    curveVertex(x, y);
    curveVertex(x, y);
    curveVertex(x + 20, y + 45);
    curveVertex(x - 30, y + 80);
    curveVertex(x - 5, y + 45);
    curveVertex(x, y);
    curveVertex(x, y);
    endShape(CLOSE);
}

function apple(x, y) {
    noStroke();
    beginShape();
    curveVertex(x, y);
    curveVertex(x, y);
    curveVertex(x + 15, y - 10);
    curveVertex(x + 25, y);
    curveVertex(x + 20, y + 40);
    curveVertex(x, y + 30);
    curveVertex(x - 20, y + 40);
    curveVertex(x - 25, y);
    curveVertex(x - 15, y - 10);
    curveVertex(x, y);
    curveVertex(x, y);
    endShape(CLOSE);
}


function draw() {
    background(0, 0, 255);
    // draw the fruit bowl
    fill(205, 0, 255);
    noStroke();
    arc(250, 150, 300, 300, TWO_PI, PI, CHORD);
    // reset the fly-away offset each time a new fruit starts to disappear
    if (frameCount == 9 || frameCount == 17 || frameCount == 25) {
        dx = 4;
        dy = 4;
    }
    // when a fruit disappears, it screams (once, at the start of its turn)
    if (frameCount <= 8) {
        if (frameCount == 1) {
            orangeScream.play();
        }
        orange(140 - dx, 140 - dy);
    } else if (frameCount <= 16) {
        if (frameCount == 9) {
            bananaScream.play();
        }
        banana(200 - dx, 90 - dy);
    } else if (frameCount <= 24) {
        if (frameCount == 17) {
            redAppleScream.play();
        }
        fill(255, 0, 0);
        apple(270 + dx, 120 - dy);
    } else {
        if (frameCount == 25) {
            greenAppleScream.play();
        }
        fill(0, 255, 0);
        apple(340 + dx, 120 - dy);
    }
    dx *= 2;
    dy *= 2;
}

Looking Outwards 10 – Computational Music

The project I am looking at this week is Charlie Puth’s Attention (it’s a song). Charlie Puth is not a classically trained musician, so a lot of his self-produced music is made with computational software. For instance, for the verse of Attention, he recorded himself humming the melody into the voice memos app on his phone. After uploading it to Pro Tools, he could choose the instrument he wanted to play that melody, then go into a graph editor and change how the melody sounded without ever needing to pick up an instrument. He was also able to change the quality of the sound to subconsciously achieve a different emotional effect; for example, he added tape cracks to the background to give the impression of analog music. I admire this project because it widens the sphere of accessibility for making music. As a person who has always picked up instruments without ever truly succeeding at them, this is very appealing to me.

http://www.avid.com/pro-tools#Music-Creation

LO: Computer Music

Iamus: Hello World! (first piece composed by Iamus)

While I was doing some exploring in computer music, I stumbled upon Iamus. Iamus is a computer that can write contemporary classical music scores; it needs only eight minutes to create a full composition in different musical formats. Iamus is taught the basics of human composition, such as the limitations of what can be played. It is constantly evolving as more source material is added to the software, similar to how a musician only grows with more practice. Iamus is inspired by evolution, as it picks and alters the source material to create complex musical pieces. So far it can only compose contemporary classical music, but it has the potential to evolve and compose other genres. It is strange to think that a computer can compose music at the same level as the composers we admire. However, there is the question of whether artificial intelligence can match the authenticity of humans, especially the drive and passion that composers translate into music.

Iamus: http://melomics.com/iamus
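To make the evolutionary idea concrete, here is a toy p5.js sketch of my own (not Iamus’s actual algorithm, which is not public in detail): it repeatedly picks a note from some source material, alters it, and keeps the variant only if it scores better under a stand-in “fitness” measure.

var melody = [60, 62, 64, 65, 67, 69, 71, 72]; // source material: MIDI note numbers

// a stand-in "fitness" score that simply prefers smaller leaps between notes
function smoothness(notes) {
    var score = 0;
    for (var i = 1; i < notes.length; i++) {
        score -= abs(notes[i] - notes[i - 1]);
    }
    return score;
}

function setup() {
    createCanvas(400, 200);
    noLoop();
    for (var gen = 0; gen < 200; gen++) {
        var variant = melody.slice();            // copy the current melody
        var i = floor(random(variant.length));   // pick one note
        variant[i] += floor(random(-3, 4));      // alter it slightly
        if (smoothness(variant) > smoothness(melody)) {
            melody = variant;                    // keep the better variant
        }
    }
}

function draw() {
    background(240);
    // draw the evolved melody as a row of bars
    for (var i = 0; i < melody.length; i++) {
        rect(20 + i * 45, 180, 30, -(melody[i] - 50) * 4);
    }
}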

Project 10: Sonic Story

sketch
/*
Bon Bhakdibhumi
bbhakdib
Section D
*/
var dayBarn;
var nightBarn;
var chicken;
var sun;
var moon;
var crowing;
var morningSound;
var wakingUp;
var switchSound;
var chickenSleeping;
var nightSound;
var snoring;
var angle = 100;
var frameCounter = 0;
var rotationCounter = 0;
function preload () {
    dayBarn = loadImage("https://i.imgur.com/aGadSXz.png");
    nightBarn = loadImage("https://i.imgur.com/6cfic4A.png");
    chicken = [loadImage("https://i.imgur.com/4ZuBHL3.png"), 
                loadImage("https://i.imgur.com/YVhA3Pg.png")];
    sun = loadImage("https://i.imgur.com/64xX4qN.png");
    moon = loadImage("https://i.imgur.com/O8y3IzI.png");
    crowing = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/435507__benjaminnelan__rooster-crow-2.wav");
    morningSound = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/530265__dominictreis__morning-transition-music.wav");
    wakingUp = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/401338__ckvoiceover__yawning.wav");
    switchSound = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/414438__inspectorj__light-pulley-switch-on-c.wav");
    chickenSleeping = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/233093__jarredgibb__chicken-buck-96khz.wav");
    nightSound = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/200167__ebonny__cicadas.wav");
    snoring = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/409015__marisca16__old-man-snoring.wav");
}

function setup() {
    createCanvas(400, 400);
    frameRate(15);
    useSound();
}

function soundSetup() {
        morningSound.setVolume(0.1);
        crowing.setVolume(0.5);
        wakingUp.setVolume(0.3);
        switchSound.setVolume(0.9);
        chickenSleeping.setVolume(0.5);
        nightSound.setVolume(0.9);
}

function draw() {
    imageMode(CENTER);
// draw daytime
    if (rotationCounter % 2 == 0) {
        background(145, 203, 229);
        push();
        translate(215, 225);
        rotate(radians(angle));
        image(sun, 80, 200, 400, 400);
        pop();
        image(dayBarn, 200, 200, 400, 400);
        image(chicken[0], 80, 240, 200, 200);
    } else {
// draw nighttime
        background(51, 60, 99);
        push();
        translate(215, 225);
        rotate(radians(angle));
        image(moon, 80, 200, 400, 400);
        pop();
        image(nightBarn, 200, 200, 400, 400);
        image(chicken[1], 80, 240, 200, 200);
    }
    if (angle == 275) {
// reset sun or moon rotation
        angle = 110;
        rotationCounter ++;
    }
    angle ++;
//morning sounds
    if (angle == 125 & rotationCounter % 2 == 0) {
        crowing.play();
    }
   if (angle == 145 & rotationCounter % 2 == 0) {
        morningSound.play();
    }
    if (angle == 170 & rotationCounter % 2 == 0) {
        wakingUp.play();
    }
//night sounds
    if (angle == 115 & rotationCounter % 2 == 1) {
        nightSound.play();
        switchSound.play();
    }
    if ((angle == 120 || angle == 150) & rotationCounter % 2 == 1) {
        chickenSleeping.play();
    }
    if (angle == 122 & rotationCounter % 2 == 1) {
        snoring.play();
    }
}

For this project, I decided to make a story that illustrates a day at a barn.