Minjae Jeong – Looking Outwards – 10 – Computational Music

For this week’s Looking Outwards, I found a TEDx talk by Ge Wang, who makes computer music. He uses a programming language called ChucK, and what surprised me most at the beginning of his lecture was that I had expected the software to be something like Logic Pro X or Cubase, which are professional production programs, but he was literally “coding” to generate a sound. Although the basic demonstration was very simple, with this technology the Stanford Laptop Orchestra performs a piece of music with each laptop as an instrument. One of the most attractive things about computational music to me is the ability to generate any sound, and with that ability, computational music can create any music or sound that the “composer” wants to express.
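Wang’s point that you can literally code a sound can be illustrated in plain JavaScript (an analogy only, not actual ChucK syntax): a tone is nothing more than samples computed one by one.

```javascript
// Illustrative sketch: computing a sine tone sample-by-sample, the
// kind of low-level control a language like ChucK gives you.
// (Plain JavaScript, not ChucK code.)
function sineTone(freq, sampleRate, numSamples) {
    var samples = [];
    for (var i = 0; i < numSamples; i++) {
        // the phase advances by freq/sampleRate of a cycle per sample
        samples.push(Math.sin(2 * Math.PI * freq * i / sampleRate));
    }
    return samples;
}

var tone = sineTone(440, 44100, 44100); // one second of A440
```

In a browser, an array like this could be copied into a Web Audio buffer for playback; the point is that the sound exists as numbers the program computed directly.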

Nadia Susanto – Project 10 – Sonic Sketch

sketch

// Nadia Susanto
// nsusanto@andrew.cmu.edu
// Section B
// Project-10-Interactive Sonic Sketch


function preload() {
    //loaded image from imgur
    TigerWoodsImg = loadImage("https://i.imgur.com/ETVJsHl.jpg");
    golfhitSound = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/golfhit.wav");
    tigerRoarSound = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/tigerroar.wav");
    golfBallCupSound = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/golfballincup.wav");
    cheeringSound = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/cheering.wav");
}


function setup() {
    createCanvas(600, 600);
    TigerWoodsImg.resize(600, 600);
    useSound();
}


function soundSetup() { // setup for audio generation
    golfhitSound.setVolume(1);
    tigerRoarSound.setVolume(1);
    golfBallCupSound.setVolume(1);
    cheeringSound.setVolume(1, 3);
}


function draw() {
    background(200);
    image(TigerWoodsImg, 0, 0);
}

function mousePressed() {
    //sound of a golf ball being hit when clicked on the caddy
    if (mouseX > 400 && mouseX < 500 && mouseY > 400) {
        golfhitSound.play();
    }
    else {
        golfhitSound.pause();
    }

    //sound of cheering when clicked on the crowd behind the green
    if (mouseY < 300 && mouseY > 150) {
        cheeringSound.play();
    }
    else {
        cheeringSound.pause();
    }

    //sound of a tiger roar when clicked on Tiger Woods
    if (mouseX < 300 && mouseX > 200 && mouseY > 350) {
        tigerRoarSound.play();
    }
    else {
        tigerRoarSound.pause();
    }

    //sound of a golf ball going in the hole when clicked on flag
    if (mouseX < 330 && mouseX > 300 && mouseY > 250 && mouseY < 320) {
        golfBallCupSound.play();
    }
    else {
        golfBallCupSound.pause();
    }
}

In the spirit of Tiger Woods winning his 82nd PGA Tour title this past weekend, I wanted to use a picture of him at Augusta National for the Masters and incorporate multiple sounds. I included a tiger’s roar when you click on Tiger Woods, the sound of a golf ball being hit when you click on Tiger’s caddy on the right, crowd cheering when you click on the many people at the back of the green, and the sound of a golf ball dropping into the hole when you click on the yellow flag on the green.
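Each of those click regions is a rectangle test on mouseX and mouseY; a small helper like this hypothetical inRegion() (not part of the original sketch) makes the bounds easier to read:

```javascript
// Hypothetical helper - not in the original sketch. Returns true when
// the point (x, y) falls inside the rectangle from (x1, y1) to (x2, y2).
function inRegion(x, y, x1, y1, x2, y2) {
    return x >= x1 && x <= x2 && y >= y1 && y <= y2;
}

// e.g. the caddy region from the sketch: x in [400, 500], y in [400, 600]
var onCaddy = inRegion(450, 500, 400, 400, 500, 600);
```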

Julia Nishizaki – Looking Outwards – 10

“Hello, World,” Iamus’s first complete composition, 2011

This week, I chose to look into the San Francisco startup Melomics Media and their computational system for automatically composing music. Melomics has created two “computer-musicians,” Iamus and Melomics109. Iamus is a computer cluster located at the Universidad de Málaga in Spain, where it was developed in 2010. On October 15, 2010, Iamus composed “Opus One,” the first fragment of professional contemporary classical music to be composed by a computer in its own style, rather than by imitating a previous composer’s work. A year later, “Hello, World,” Iamus’s first complete composition, premiered, and in 2012 the London Symphony Orchestra recorded ten of Iamus’s pieces, creating “Iamus,” the first studio album composed using this computational system.

The Iamus computer cluster

It takes Iamus eight minutes to create a new composition and output the data in multiple formats. According to the Universidad de Málaga’s website, the algorithm Iamus uses is built on data structures that act as genomes in order to create possible compositions.
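Melomics has not published the algorithm in detail, but the genome idea can be sketched as a data structure that is decoded into notes. This toy example (my own illustration, not Iamus’s actual code) treats each gene as a pitch interval:

```javascript
// Toy illustration of a "genome" decoded into a melody - NOT Melomics'
// actual algorithm. Each gene is an interval in semitones; decoding
// walks from a starting pitch, so mutating one gene shifts everything
// after it, the way a change to a genome propagates.
function decodeGenome(genome, startPitch) {
    var pitch = startPitch;
    var melody = [pitch];
    for (var i = 0; i < genome.length; i++) {
        pitch += genome[i]; // apply the interval
        melody.push(pitch);
    }
    return melody;
}

var melody = decodeGenome([2, 2, -1, 4], 60); // MIDI 60 = middle C
```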

While listening to “Hello, World,” I was surprised both by how contemporary and dissonant the piece sounded and by how an entire, fairly coherent piece of chamber music could be composed by a computer. However, the constant tension in the piece, combined with the very human musicians and their interpretations, gives “Hello, World” an uncanny-valley feel: the piece is technically music, but something still seems slightly off. I’m curious why Melomics decided to go in this direction, rather than creating music that is composed of “new” sounds and is entirely unplayable by humans.

Claire Lee – Looking Outwards – 10

I decided to write about a short musical piece created by computer music artist and pioneer Laurie Spiegel called Strand of Life (‘Viroid’). I did listen to several other pieces in her album Unseen World, but Strand of Life was really fascinating to me because of both the conception and the result. This piece also interested me because it was so similar in concept to the sound art piece I wrote about in my Looking-Outwards-04, Pierry Jacquillard’s Prélude in ACGT, so I was interested to see each artist’s unique approach to the concept of generating music with patterns derived from genetic material.

Spiegel got the idea for this piece while she was sick with an infection; taking that as inspiration, she converted a viroid’s DNA into MIDI data and used it as the base for her musical track. I imagine the algorithm behind the DNA-to-MIDI conversion was relatively straightforward, but she incorporates other elements (vocals, instrumentals), also partly computer-generated, that make the piece much more complex than the sound art I wrote about previously.
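Spiegel’s exact conversion isn’t documented here, but a minimal version of the idea simply assigns each base a MIDI note number (the particular mapping below is an assumption for illustration, not her scheme):

```javascript
// Minimal sketch of a DNA-to-MIDI mapping - the note assignments are
// assumed for illustration, not Spiegel's actual conversion.
var baseToNote = { A: 60, C: 64, G: 67, T: 72 };

function dnaToMidi(sequence) {
    var notes = [];
    for (var i = 0; i < sequence.length; i++) {
        notes.push(baseToNote[sequence[i]]); // one note per base
    }
    return notes;
}

var notes = dnaToMidi("ACGT");
```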

Ankitha Vasudev – Project 10 – Interactive Sonic Sketch


sketch

// Ankitha Vasudev
// Section B
// ankithav@andrew.cmu.edu
// Project 10 - Sonic Sketch

//global variables
var rx = 80;      //x position of stereo
var ry = 150;     //y position of stereo
var stereoImg;    //stereo image
var radiosong;    //slow song
var catchysong;   //fast song
var static;       //static/interference sound
var phonering;    //phone ring sound
var switcheffect; //play/pause switch sound effect
var Amp = 0;

// preloading sounds and image
function preload() {

    // stereo image
    var ImageURL = "https://i.imgur.com/MX0qMoE.jpg"
    stereoImg = loadImage(ImageURL);

    // Loading five sounds
    radiosong = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/10/basic.wav");
    catchysong = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/10/bg.mp3");
    static = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/10/interference.wav");
    phonering = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/10/ringing.wav");
    switcheffect = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/10/switch.wav");
}

function setup() {
    createCanvas(450, 450);
    background(0);
}

function soundSetup() {
    //volume for individual sounds
    radiosong.setVolume(1);
    catchysong.setVolume(0.5);
    static.setVolume(0.5);
    phonering.setVolume(0.4);
    switcheffect.setVolume(0.6);
}

function draw() {
    noStroke();

    // grey background behind stereo 
    fill(220);
    rect(0, 0, width, 300);

    // brown table
    fill(130, 80, 50);
    rect(0, ry+150, width, 200);

    // phone on table
    push();
    translate(75, 410);
    rotate(5);
    fill(80);
    rect(0, 0, 50, 90);
    fill(240);
    rect(5, 5, 40, 80);
    fill(30);
    rect(5, 10, 40, 70);
    pop();

    // antennae behind stereo
    fill(75);
    rect(350, 40, 5, 150);

    // stereo img
    image(stereoImg, rx, ry);

    // pause button 
    fill(200, 60, 60);
    rect(width/2+10, 200, 15, 20);
    stroke(30);
    strokeWeight(2);
    line(width/2+16, 205, width/2+16, 215);
    line(width/2+20, 205, width/2+20, 215);

    // play button 
    noStroke();
    fill(200, 60, 60);
    rect(width/2-10, 200, 15, 20);
    fill(50);
    triangle(width/2-7, 205, width/2-7, 215, width/2+2, 210);
}

function mousePressed() {

    //Play music when play button is pressed - switch between two songs
    if (mouseX>=(width/2)-10 && mouseX<=(width/2)+5 && mouseY<=220 && mouseY>=200) {
        if (radiosong.isLooping()) {
            catchysong.loop();
            radiosong.pause();
        } else {
            switcheffect.play();
            radiosong.loop();
            catchysong.pause();
        }
    }
    

    //Click on pause switch to stop music
    if (mouseX>=width/2+10 && mouseX<=width/2+25 && mouseY<=220 && mouseY>=200) {
        if (catchysong.isLooping() || radiosong.isLooping()) {
            switcheffect.play();
            catchysong.pause();
            radiosong.pause();
        }
    }


    //Play static when antennae is clicked on
    if (mouseX>=350 && mouseX<=355 && mouseY<=190 && mouseY>=40) {
        if (static.isLooping()) {
            static.pause();
        } else {
            catchysong.pause();
            radiosong.pause();
            static.loop();
        } 
    } else {
        static.pause();
    }


    //Play phone ring when phone is clicked on
    if (mouseX>100 && mouseX<160 && mouseY>375 && mouseY<440) {
        if (phonering.isLooping()) {
            phonering.pause();
        } else {
            catchysong.pause();
            radiosong.pause();
            static.pause();
            phonering.loop();
        }  
    }
}

This project was tricky but fun to create. I decided to create a stereo with different sound effects that can play multiple songs when clicked on. Overall, there are five sounds in this project.

When the play button is pressed, a song plays, and clicking the button again switches to the next song. The pause button stops the music. I added a clicking sound effect every time one of the buttons is pressed to make it more realistic. When the antenna is clicked, a static noise plays and can be stopped by clicking anywhere else on the canvas. When the phone is clicked, a ringing noise plays and can be stopped by clicking the phone again.
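Stripped of the p5.sound calls, the play button’s switch-between-two-songs behavior is a two-state machine, which can be followed (and tested) on its own:

```javascript
// The song-switching logic from mousePressed(), reduced to a pure
// state machine so it can be tested without p5.sound.
// "A" and "B" stand in for radiosong and catchysong.
function nextSong(current) {
    if (current === "A") { return "B"; } // A was looping: switch to B
    return "A";                          // otherwise start (or return to) A
}

var song = null;
song = nextSong(song); // first press starts "A"
song = nextSong(song); // second press switches to "B"
```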

Emma NM-Project-10(Interactive Sound)

To hear the animal’s sound, click on the image. The four sounds are a dog barking, duck quacking, cat meowing, and cow mooing. To turn the sound off, click the image again.

sound

/* 
Emma Nicklas-Morris
Section B
enicklas
Project-10
Interactive Sound
*/


var cowSound;
var duckSound;
var catSound;
var dogSound;

var dogImg;
var cowImg;
var duckImg;
var catImg;
var adj = 10;


function preload() {
    // load my 4 animal sounds and images
    cowSound = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/10/cow.wav");
    duckSound = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/10/duck.wav");
    catSound = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/10/cat.wav");
    dogSound = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/10/dog.wav");

    catImg = loadImage("https://i.imgur.com/nliPYUx.jpg");
    duckImg = loadImage("https://i.imgur.com/mwzKbS9.jpg");
    cowImg = loadImage("https://i.imgur.com/u6LpEOD.jpg");
    dogImg = loadImage("https://i.imgur.com/0tT7kPQ.jpg");

}


function setup() {
    createCanvas(500, 400);
    useSound();

}


function soundSetup() {
    cowSound.setVolume(0.5);
    catSound.setVolume(2);

}


function draw() {

    background("lightblue");

    // display the 4 animal images
    image(catImg, width/2 - adj, 0, catImg.width/2, catImg.height/2);
    image(duckImg, 0, 0, duckImg.width/5, duckImg.height/5);
    image(dogImg, -3 * adj, height/2, dogImg.width/2, dogImg.height/2);
    image(cowImg, width/2 + adj + 6, height/2, cowImg.width/1.5 , cowImg.height/1.5);

}



function mousePressed() {
    // when you click on duck picture, play quack sound
    // to turn off, click again.
    if ((mouseX < duckImg.width/5) && (mouseY < duckImg.height/5)) {
        if (duckSound.isLooping()) {
            duckSound.stop();
        }
        else {
            duckSound.loop();
        }

    }

    // when you click on cat picture, play meow sound
    // to turn off, click again.
    else if ((mouseX > duckImg.width/5) && (mouseY < height/2)) {
        if (catSound.isLooping()) {
            catSound.stop();
        }
        else {
            catSound.loop();
        }

    }

    // when you click on dog picture, play barking sound
    // to turn off, click again.
    else if ((mouseX < width/2 + adj + 6) && (mouseY > duckImg.height/5)) {
        if (dogSound.isLooping()) {
            dogSound.stop();
        }
        else {
            dogSound.loop();
        }

    }

    // when you click on cow picture, play mooing sound
    // to turn off, click again.
    else if ((mouseX > width/2 + adj + 6) && (mouseY > height/2)) {
        if (cowSound.isLooping()) {
            cowSound.stop();
        }
        else {
            cowSound.loop();
        }

    }
}

I struggled a lot to get my sounds to work; most of the trouble was something I didn’t quite understand about how p5.js needs the sound library included in the HTML. I also learned that sounds take up a lot of browser cache, so to keep your page refreshing properly, you need to clear your cache often when working with sound. My idea is geared toward a preschool/kindergarten activity: it would let children learn which animals make which sounds.

Xu Xu – Looking Outwards – 10

For this week’s Looking Outwards, I decided to focus on an algorithmic sound art piece called “I am sitting in a machine” by Martin Backes. The work begins with a recording of an artificial human voice reciting a text, which is then run through an MP3 encoder over and over again. With each iteration of the loop, the artifacts of the encoding process reinforce themselves and gradually distort the artificial voice, revealing its data format. The piece is a computational homage to composer Alvin Lucier’s 1969 sound art piece “I am sitting in a room,” which is built on a similar idea: a recording is played back and re-recorded over and over, and because the room emphasizes certain frequencies, the words slowly become unintelligible, replaced by the pure resonant harmonies and tones of the room itself.

Alvin Lucier’s work explores the physical properties of sound, the resonance of spaces, and the transmission of sound through physical media, whereas Backes’ work is more about digitized information and its artifacts, hearing science, and telecommunications. He wanted to show how digitized information produces unexpected phenomena the same way physical environments do. He explains how he achieved this phenomenon through computational techniques: “I have rewritten the original lyrics from the perspective of a machine. As a next step, I used the artificial human voice of a text-to-speech function and recorded the text via a script. I then wrote another script and ran the recording into a MP3 encoder automatically, over and over again. By the help of this recursive algorithm, I produced 3000 successive iterations of the 128 kbps 44.1 kHz MP3 encoding.”
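MP3 encoding itself is far more complex, but the compounding of artifacts over iterations can be modeled by repeating any lossy step. In this toy sketch (my own illustration, not Backes’ actual process), each “encode” is a crude neighbor-averaging pass, and after many generations the signal collapses toward a near-constant blur:

```javascript
// Toy model of generation loss - NOT MP3 encoding. Each "encode" is a
// crude lossy step (neighbor averaging with edge samples repeated);
// iterating it makes the loss compound, the way Backes' 3,000 MP3
// passes let encoding artifacts accumulate.
function lossyPass(signal) {
    var out = [];
    for (var i = 0; i < signal.length; i++) {
        var prev = signal[Math.max(i - 1, 0)];
        var next = signal[Math.min(i + 1, signal.length - 1)];
        out.push((prev + signal[i] + next) / 3);
    }
    return out;
}

function generations(signal, n) {
    for (var i = 0; i < n; i++) {
        signal = lossyPass(signal);
    }
    return signal;
}

var original = [1, 0, 1, 0, 1, 0, 1, 0]; // sharp alternating signal
var degraded = generations(original, 50); // nearly flat after 50 passes
```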

I admire this project because it creates a connection between the computational and physical world, revealing that similar phenomena are able to occur in both situations. There is also a web version of this sound art online: I am sitting in a machine

Xiaoyu Kang – Looking Outwards – 10

The project that I looked at is named Data Peluda. It is a performance by Jorge Chikiar and Luis Conde at Roseti in Buenos Aires, on August 11, 2017. Jorge Chikiar is a composer and sound artist from Argentina. He has worked at many places, such as the Colon Theater, CETC, and Michell Maccarone’s art gallery. He has been experimenting with different ways to present music for many years, and many of his projects involve different kinds of computer technologies.

The performance itself used a combination of saxophone and computer technologies: the music the audience heard was the sound of the saxophone modified electronically by the computer. The processed sound turns out to be a blend of classical instrumental music and contemporary music. The most impressive part of this performance is that the music was produced live, meaning the saxophone’s sound was modified at the same time as it was played. I found this to be a great example of how computers are used in live music performance.
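The performance’s actual signal chain isn’t documented here, but a minimal example of the kind of digital processing involved in live electronics is a delay line that mixes each incoming sample with an echo of an earlier one (an illustrative sketch, not Chikiar’s setup):

```javascript
// Illustrative live-processing sketch (assumed, not Chikiar's actual
// setup): a delay effect mixes each sample with an echo of the sample
// delayTime steps earlier, scaled by mix.
function addEcho(input, delayTime, mix) {
    var out = [];
    for (var i = 0; i < input.length; i++) {
        var delayed = i >= delayTime ? input[i - delayTime] : 0;
        out.push(input[i] + mix * delayed);
    }
    return out;
}

var dry = [1, 0, 0, 0, 0, 0];           // a single impulse
var wet = addEcho(dry, 3, 0.5);         // its echo appears 3 samples later
```

In a real-time setting the same arithmetic runs sample-by-sample on the incoming microphone signal, which is why the processing can happen while the saxophone is being played.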

Ankitha Vasudev – Looking Outwards – 10

Orchestrion is a computerized band that was programmed and created by Eric Singer in 2010. Singer is a Brooklyn-based musician and technologist who founded SingerBots and LEMUR – a group of artists and technologists who create robotic musical instruments. Orchestrion consists of a set of automated musical instruments that are mechanically equipped to perform a composition.

Lido Orchestrion, 2010

I find this project interesting because the instruments in an orchestrion can play anything that is composed for them. A musician composes a song in basic production software, but instead of playing the notes back, the program activates the physical playing actions on the orchestrion. The video below shows the Lido Orchestrion, which was built for a nightclub in Paris and consists of 45 automated instruments.
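That compose-then-actuate pipeline can be sketched as a dispatcher that, instead of synthesizing a note, looks up which mechanical actuator plays it. Everything below (the actuator names, the command format) is hypothetical, not Singer’s software:

```javascript
// Hypothetical note dispatcher - not Singer's actual software. Instead
// of playing a note back as audio, look up which mechanical actuator
// covers that pitch and emit a command for it.
var noteToActuator = {
    60: "marimba-bar-C4",
    64: "marimba-bar-E4",
    67: "marimba-bar-G4"
};

function dispatch(note, velocity) {
    var actuator = noteToActuator[note];
    if (actuator === undefined) { return null; } // no instrument covers this pitch
    return { actuator: actuator, strikeForce: velocity / 127 };
}

var cmd = dispatch(60, 127); // full-velocity middle C
```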

Singer attended Carnegie Mellon as an undergrad, and in 2009 founded SingerBots, a company fully dedicated to building robotic musical instruments. Singer believes that musicality and infallibility are the two priorities for an orchestrion, so that it sounds good and does not make mistakes. I agree with his belief that robotic infallibility can create a lively performance, in contrast to what others believe.

A video describing Singer’s Orchestrions

Chelsea Fan-Project 10-Sonic-Sketch

SonicSketch

/* Chelsea Fan
Section 1B
chelseaf@andrew.cmu.edu
Project-10
*/
//important variables
var myWind;
var myOcean;
var myBirds;
var mySand;
var currentImage;

function preload() {
    //load ocean image 
    var myImage = "https://i.imgur.com/cvlqecN.png";
    currentImage = loadImage(myImage);
    currentImage.loadPixels();
    //loading sounds
    //sound of wind
    myWind = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/10/winds.wav");
    myWind.setVolume(0.1);
    //sound of ocean
    myOcean = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/10/oceans.wav");
    myOcean.setVolume(0.1);
    //sound of birds
    myBirds = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/10/birds.wav");
    myBirds.setVolume(0.1);
    //sound of sand
    mySand = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/10/sand.wav");
    mySand.setVolume(0.1);
}

function soundSetup() { // setup for audio generation
}

function setup() {
    createCanvas(480, 480);
}

function sandDraw() {
    noStroke();
    //sand background color
    fill(255, 204, 153);
    rect(0, height-height/4, width, height/4);
    //sand movement
    for (var i = 0; i < 1000; i++) {
        var sandX = random(0, width);
        var sandY = random(height-height/4, height);
        fill(255, 191, 128);
        ellipse(sandX, sandY, 5, 5);
    }
}

var x = 0;
var cloudmove = 1;

function skyDraw() {
    noStroke();
    //sky color
    fill(179, 236, 255);
    rect(0, 0, width, height/2);
    //cloud color
    fill(255);
    //cloud move
    x = x + cloudmove;
    if(x>=width+100){
        x = 0;
    }
    //cloud parts and drawing multiple clouds in sky section 
    for (var i = 0; i <= 4; i++) {
        push();
        translate(-200*i, 0);
        ellipse(x + 10, height / 6, 50, 50);
        ellipse(x + 50, height / 6 + 5, 50, 50);
        ellipse(x + 90, height / 6, 50, 40);
        ellipse(x + 30, height / 6 - 20, 40, 40);
        ellipse(x + 70, height / 6 - 20, 40, 35);
        pop();
    }
}
function birdDraw() {
    noFill();
    stroke(0);
    strokeWeight(3);
    //Bird positions (not randomized because I chose
    //coordinates for aesthetic reasons)
    var offsets = [[0, 0], [-110, 0], [-100, 80], [-30, 40], [70, 50],
                   [100, 100], [150, 25], [200, 75], [250, 13]];
    //each bird is the same pair of curves, translated to its offset
    for (var i = 0; i < offsets.length; i++) {
        push();
        translate(offsets[i][0], offsets[i][1]);
        curve(100, 150, 120, 120, 140, 120, 160, 140);
        curve(120, 140, 140, 120, 160, 120, 180, 150);
        pop();
    }
}
function draw() {
    //draw sand
    sandDraw();
    //draw ocean
    image(currentImage, 0, height/2);
    //draw sky
    skyDraw();
    //draw birds
    birdDraw();
}

function mousePressed() {
    //p5.js calls this once per click, so no mouseIsPressed check
    //(and no manual call from draw()) is needed
    //if mouse is in section of canvas where clouds are
    if (mouseY >= 0 && mouseY <= height/4) {
        //sound of wind
        myWind.play();
    }
    //if mouse is in section of canvas where birds are
    if (mouseY > height/4 && mouseY <= height/2) {
        //sound of birds
        myBirds.play();
    }
    //if mouse is in section of canvas where ocean is
    if (mouseY > height/2 && mouseY <= 3*height/4) {
        //sound of waves
        myOcean.play();
    }
    //if mouse is in section of canvas where sand is
    if (mouseY > 3*height/4 && mouseY <= height) {
        //sound of sand
        mySand.play();
    }
}

My code has four different sounds (sounds of wind, birds, waves, and sand). Each is enabled by clicking on the respective quarter of the canvas. For example, the wind sound is enabled by clicking the top layer where the clouds are located.
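The four click bands reduce to a single index computation on mouseY; as a pure function (my restatement of the idea, not the sketch’s literal code) over an assumed 480-pixel canvas:

```javascript
// The four click bands from the sketch as a pure function: maps a
// y position to the sound for that quarter of the canvas.
function soundForY(y, canvasHeight) {
    var sounds = ["wind", "birds", "ocean", "sand"];
    // clamp so y === canvasHeight still lands in the bottom band
    var band = Math.min(Math.floor(y / (canvasHeight / 4)), 3);
    return sounds[band];
}

var clicked = soundForY(100, 480); // top band: wind
```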

This took me a very long time because I couldn’t get the sounds to work, but the idea of an ocean landscape with different sounds came to me quickly.