Sarah Kang – Looking Outwards – 10

A new algorithm produces the “portamento” effect, from news.mit.edu

Trevor Henderson, an MIT student in computer science, has invented an algorithm that produces the “portamento” effect – the effect of gliding a note at one pitch into a note at a higher or lower pitch – between any two audio clips. The algorithm builds on optimal transport, a geometry-based framework for finding the most efficient ways to move data points between origin and destination configurations. Henderson applies optimal transport to interpolating audio signals, blending one sound into the other.
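The glide itself can be pictured as interpolating pitch over time. Here is a minimal sketch in plain JavaScript of a simple exponential (constant-in-pitch) glide between two notes – an illustration of the portamento idea only, not Henderson's optimal-transport interpolation:

```javascript
// Sketch of a portamento glide: interpolate pitch from one note to another.
// Interpolating in log-frequency makes the glide linear in perceived pitch.
function portamentoGlide(startFreq, endFreq, steps) {
  const freqs = [];
  for (let i = 0; i <= steps; i++) {
    const t = i / steps; // 0 .. 1
    freqs.push(startFreq * Math.pow(endFreq / startFreq, t));
  }
  return freqs;
}

// Glide from A4 (440 Hz) up to A5 (880 Hz) in 4 steps.
console.log(portamentoGlide(440, 880, 4).map(f => Math.round(f)));
// → [ 440, 523, 622, 740, 880 ]
```

Feeding a frequency sequence like this to an oscillator each frame produces the characteristic slide between the two pitches.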

This project really intrigued me because I had never really focused on the transformations of sounds and it was amazing how the two different sound categories morphed into each other. I feel like this would open up a lot of opportunities for computational music in the future.

Monica Chang – Looking Outwards – 11

Rosa Menkman

Her website: http://rosa-menkman.blogspot.com/

About page on Menkman’s website
Intro page, Rosa Menkman

Rosa Menkman is a Dutch curator, visual artist and researcher who specializes in noise artifacts in digital and analogue media – specifically glitches, encoding artifacts and feedback artifacts. With her artwork, she emphasizes that the process of imposing efficiency, order and functionality does not simply produce clean procedures and solutions, but relies on ambiguous compromises and leaves behind the forever unseen and forgotten.

‘Xilitla’ by Rosa Menkman

Menkman is considered one of the most iconic video glitch artists, as she often exploits software glitches to develop her stunning pieces. One of her algorithmic pieces, ‘Xilitla’, is a hallucinatory, futuristic 3D architectural environment formed from polygons and other unconventional objects. Using game-like controls, the viewer navigates this graphic landscape via the head-piece in the center. Menkman also considers this particular piece the one that best represents her wider body of work.

Mari Kubota- Looking Outwards- 10

The Classyfier, created by Benedict Hubener, Stephanie Lee and Kelvyn Marte at the CIID, is a table that detects the beverages people consume around it and chooses music that fits the situation accordingly.

A built-in microphone captures characteristic sounds and compares them to a catalogue of pre-trained examples. The Classyfier then assigns the sound to one of three classes: hot beverages, wine, or beer. Each class has its own playlist that one can navigate by knocking on the table.

The idea behind this project was to build a smart object that uses machine learning and naturally occurring sounds as input to enhance the ambiance of different situations. The main tools used were Wekinator, Processing and the OFX collection.
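The classification step can be sketched as a nearest-example comparison: extract a feature vector from incoming audio and pick the pre-trained class whose example is closest. A minimal sketch in plain JavaScript – the feature values and their meanings below are invented for illustration; the actual project trains its classifier with Wekinator:

```javascript
// Nearest-example classification sketch: compare a feature vector against
// one pre-trained example per class and return the closest class.
// Feature values are made up for illustration.
const classExamples = {
  "hot beverages": [0.2, 0.1, 0.7], // e.g. quiet stirring sounds
  "wine":          [0.6, 0.3, 0.2], // e.g. glass clinks
  "beer":          [0.9, 0.8, 0.1], // e.g. bottle clanks, louder ambience
};

// Euclidean distance between two feature vectors of equal length.
function distance(a, b) {
  return Math.sqrt(a.reduce((sum, ai, i) => sum + (ai - b[i]) ** 2, 0));
}

// Return the label of the class whose example is nearest to `features`.
function classify(features) {
  let best = null;
  let bestDist = Infinity;
  for (const [label, example] of Object.entries(classExamples)) {
    const d = distance(features, example);
    if (d < bestDist) { bestDist = d; best = label; }
  }
  return best;
}

console.log(classify([0.85, 0.75, 0.15])); // closest to the "beer" example
```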

William Su – Project 10 – Sonic Sketch

For this project I saw the opportunity to make a simple game. In this case, a very cruddy p5 version of CS:GO. Enjoy!

sketch

//William Su
//wsu1@andrew.cmu.edu
//Section E
//Project 10 

var bgurl = "https://i.imgur.com/CYEy7BV.jpg"; //background.
var muzzleFlash = "https://i.imgur.com/68jJZkV.png"; //muzzle flash from the tip of gun.
var mfTrue = false; //currently false if no click;
var T1Alive = true; //Whether enemies are currently alive or not.
var T2Alive = true;
var T3Alive = true;
var Tenemy = "https://i.imgur.com/Nh8X1RG.png"; //Enemy image
var bg; // Background img
var mf; // MuzzleFlash
var T1HitCount = 0; //Number of times each enemy has been hit.
var T2HitCount = 0;
var T3HitCount = 0;

function preload() {
    bg = loadImage(bgurl);
    mf = loadImage(muzzleFlash);
    T1 = loadImage(Tenemy);
    T2 = loadImage(Tenemy);
    T3 = loadImage(Tenemy);

    // Local File
    // AKshot = loadSound("assets/AK.mp3");
    // Death = loadSound("assets/Death.mp3");
    // Death2 = loadSound("assets/Death2.mp3");
    // Hit = loadSound("assets/Hit.mp3");
    AKshot = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/AK.mp3");
    Death = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/Death.mp3");
    Death2 = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/Death2.mp3");
    Hit = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/Hit.mp3");
}


function setup() {
    createCanvas(720, 400);
    bg.resize(720,400);
    frameRate(60);
    noCursor();
}


function draw() {
    if (mfTrue == true) { //If mouse pressed, activate muzzle flash.
        mfTrue = false; //Reset muzzle flash as false.
        image(bg, 0, 0);
        image(mf, 230, 30); //Draw muzzle flash and background.
    } else {
        image(bg, 0, 0);
    }

    //Draw the three enemies.
    if (T1Alive == true) {
        T1.resize(35, 70); //size the sprite before drawing it
        image(T1, 180, 180);
    }

    if (T2Alive == true) {
        T2.resize(25, 50);
        image(T2, 500, 180);
    }

    if (T3Alive == true) {
        T3.resize(13, 26);
        image(T3, 370, 192);
    }

    if (T1Alive == false && T2Alive == false && T3Alive == false) { //Reset if all enemies are dead.
        T1Alive = true;
        T2Alive = true;
        T3Alive = true;
    }

    crosshairs(); //Draw Crosshair
}

function crosshairs() { //Crosshair 
    stroke("white");
    strokeWeight(2);
    line(mouseX,mouseY,mouseX+20,mouseY);
    line(mouseX,mouseY,mouseX-20,mouseY);
    line(mouseX,mouseY,mouseX,mouseY+20);
    line(mouseX,mouseY,mouseX,mouseY-20);
}

function mousePressed() {
    //Play gunshot every click.
    AKshot.play();
    mfTrue = true;
    
    //Enemy 1 Hitbox
    if (mouseX > 180 && mouseX < 205 && mouseY > 180 && mouseY < 250 && T1Alive == true) {

        if (T1HitCount <= 4) { //Need 5 hits to kill.
            Hit.amp(2.5);
            Hit.play();
            T1HitCount += 1;
        } else { //Play death
            Death.amp(3);
            Death.play();
            Hit.amp(2);
            Hit.play();
            T1Alive = false;
            T1HitCount = 0;
        }
    }

    //Enemy 2 Hitbox
    if (mouseX > 500 && mouseX < 525 && mouseY > 180 && mouseY < 230 && T2Alive == true) {
        if (T2HitCount <= 4) { //Need 5 hits to kill.
            Hit.amp(2.5);
            Hit.play();
            T2HitCount += 1;
        } else { //Play death
            Death2.amp(3);
            Death2.play();
            Hit.amp(2);
            Hit.play();
            T2Alive = false;
            T2HitCount = 0;
        }
    }

    //Enemy 3 Hitbox
    if (mouseX > 370 && mouseX < 383 && mouseY > 192 && mouseY < 218 && T3Alive == true) {
        if (T3HitCount <= 4) { //Need 5 hits to kill.
            Hit.amp(2.5);
            Hit.play();
            T3HitCount += 1;
        } else { //Play death
            Death.amp(3);
            Death.play();
            Hit.amp(2);
            Hit.play();
            T3Alive = false;
            T3HitCount = 0;
        }
    }
}

Charmaine Qiu – Looking Outwards – 10

Taryn Southern is a singer active on YouTube who creates her own content. I find her most recent album particularly interesting, as it was entirely composed and produced with artificial intelligence. In an interview, Taryn explained that she used an AI platform called Amper Music to create the instrumentation for her songs. In her creative process, Taryn decides factors such as the BPM, rhythm, and key, and the AI generates possibilities for her. From there, she selects the pieces she enjoys and arranges them along with lyrics that she wrote herself. She states that the advantage of working with an AI is that it gives her a lot of control over what she desires, as opposed to a human partner who may not understand her intent. However, I personally find her songs generic and lacking the emotion that moves an audience. There are many things that artificial intelligence can achieve, but I believe that music that truly conveys human sentiment should originate fully from the minds of people.

“Break Free” – a song where the instrumentation was composed with AI

Carly Sacco-Project 10- Sonic Sketch

sketch

//Carly Sacco
//Section C
//csacco@andrew.cmu.edu
//Project 10: Interactive Sonic Sketch

var x, y, dx, dy;

function preload() {
	bubble = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/bubble-3.wav");
	boatHorn = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/boat.wav");
	bird = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/bird.wav");
	water = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/water-2.wav");
}
	
function setup() {
    createCanvas(640, 480);
    x = 200;
    y = 40;
    dx = 0;
    dy = 0;
	
	useSound();
}

function soundSetup() {
    osc = new p5.TriOsc();
    osc.freq(880.0); //pitch
    osc.amp(0.1); //volume
} 

function draw() {
	background(140, 216, 237);
	//ocean
	noStroke();
	fill(26, 141, 173);
	rect(0, height / 2, width, height);
	
	//fish head
	fill(50, 162, 168);
	noStroke();
	push();
	translate(width / 2, height / 2);
    rotate(PI / 4);
    rect(200, -100, 100, 100, 20);
	pop();
	
	fill(184, 213, 214);
	noStroke();
	push();
	translate(width / 2, height / 2);
    rotate(PI / 4);
    rect(215, -85, 70, 70, 10);
	pop();
	
	//fish eyes
	fill('white');
	ellipse(520, 355, 15, 25);
	ellipse(545, 355, 15, 25);
	
	fill('black');
	ellipse(520, 360, 10, 10);
	ellipse(545, 360, 10, 10);
	
	//fish mouth
	fill(227, 64, 151);
	noStroke();
	push();
	translate(width / 2, height / 2);
    rotate(PI / 4);
    rect(240, -60, 40, 40, 10);
	pop();
	
	fill(120, 40, 82);
	ellipse(533, 395, 30, 30);	
	
	//fins
	fill(209, 197, 67);
	quad(565, 365, 590, 325, 590, 430, 565, 400);
	quad(500, 365, 500, 400, 475, 430, 475, 325);
	
	//bubbles
	var bub = random(10, 40);
	fill(237, 240, 255);
	ellipse(575, 315, bub, bub);
	ellipse(550, 275, bub, bub);
	ellipse(580, 365, bub, bub);
	x += dx;
	y += dy;

	//boat
	fill('red');
	quad(40, 200, 250, 200, 230, 260, 60, 260);
	fill('white');
	rect(100, 150, 80, 50);
	
	fill('black');
	ellipse(110, 165, 15, 15);
	ellipse(140, 165, 15, 15);
	ellipse(170, 165, 15, 15);
	
	
	//birds
	noFill();
	stroke(141, 160, 166);
	strokeWeight(5);
	arc(475, 75, 75, 60, PI * 5 / 4, 2 * PI);
	arc(550, 75, 75, 60, PI, 2 * PI - PI / 5);
	
	arc(550, 100, 50, 35, PI * 5 / 4, 2 * PI);
	arc(600, 100, 50, 35, PI, 2 * PI - PI / 5);
	
	arc(430, 125, 50, 35, PI * 5 / 4, 2 * PI);
	arc(480, 125, 50, 35, PI, 2 * PI - PI / 5);
	
	//waterlines
	stroke(89, 197, 227);
	bezier(300, 400, 250, 300, 200, 500, 50, 400);
	bezier(50, 350, 75, 500, 250, 200, 300, 375);
}

function mousePressed() {
	//plays sound if clicked near boat
	if (mouseX < width / 2 && mouseY < height / 2) {
		boatHorn.play();
	}
	//plays bubbles if clicked near fish
	if (mouseX > width / 2 && mouseY > height / 2) {
		bubble.play();
	}
	//plays birds if clicked near birds
	if (mouseX > width / 2 && mouseY < height / 2) {
		bird.play();
	}
	//plays water if clicked near water
	if (mouseX < width / 2 && mouseY > height / 2) {
		water.play();
	}
}
	

For my project this week, I began by starting with my project 3, which was the fish and bubbles. I wanted to add more options than a bubble sound so I decided to add a boat, birds, and a water noise. I kept it simple, so that whenever you click in the area closest to each icon, the associated sound plays.

Sarah Kang – Project 10 – sonic sketch

sonic

//sarah kang
//section c
//sarahk1@andrew.cmu.edu
//project-10-sonic-sketch

function preload() {

    mySnd1 = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/drum.wav")
    mySnd1.setVolume(0.5);
    mySnd2 = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/sweep.wav")
    mySnd2.setVolume(0.5);
    mySnd3 = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/sound1.wav")
    mySnd3.setVolume(0.5);
    mySnd4 = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/sound2.wav")
    mySnd4.setVolume(0.5);
    mySnd5 = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/sound3.wav")
    mySnd5.setVolume(1);
    mySnd6 = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/sound4.wav")
    mySnd6.setVolume(0.5);
    mySnd7 = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/sound5.wav")
    mySnd7.setVolume(0.5);
    mySnd8 = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/sound6.wav")
    mySnd8.setVolume(0.5);
    mySnd9 = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/sound7.wav")
    mySnd9.setVolume(0.5);
}


function setup() {
    createCanvas(400, 400);

    useSound();
}


function soundSetup() { // setup for audio generation
}


function draw() {
    background(200, 220, 250);

    //white square rims
    for (var y = 10; y < height + 20; y += 130) {
        for (var x = 10; x < width + 20; x += 130) {
            fill(255);
            noStroke();
            rect(x, y, 120, 120);
        }
    }

    fill(255, 237, 219);
    rect(20, 20, 100, 100); //top left square

    fill(255, 251, 181);
    rect(150, 20, 100, 100); //top middle square

    fill(213, 238, 242);
    rect(280, 20, 100, 100); //top right square

    fill(230, 252, 241);
    rect(20, 150, 100, 100); //middle left square

    fill(243, 230, 252);
    rect(150, 150, 100, 100); //center square

    fill(252, 230, 234);
    rect(280, 150, 100, 100); //middle right square

    fill(232, 237, 255);
    rect(20, 280, 100, 100); //bottom left square

    fill(222, 248, 252);
    rect(150, 280, 100, 100); //bottom middle square

    fill(241, 252, 230);
    rect(280, 280, 100, 100); //bottom right square
}

function mousePressed() {

    if (mouseX > 20 && mouseX < 120 && mouseY > 20 && mouseY < 120) {
        mySnd1.play(); //top left square sound
    }
    if (mouseX > 150 && mouseX < 250 && mouseY > 20 && mouseY < 120) {
        mySnd2.play(); //top middle square sound
    }
    if (mouseX > 280 && mouseX < 380 && mouseY > 20 && mouseY < 120) {
        mySnd3.play(); //top right square sound
    }
    if (mouseX > 20 && mouseX < 120 && mouseY > 150 && mouseY < 250) {
        mySnd4.play(); //middle left square sound
    }
    if (mouseX > 150 && mouseX < 250 && mouseY > 150 && mouseY < 250) {
        mySnd5.play(); //center square sound
    }
    if (mouseX > 280 && mouseX < 380 && mouseY > 150 && mouseY < 250) {
        mySnd6.play(); //middle right square sound
    }
    if (mouseX > 20 && mouseX < 120 && mouseY > 280 && mouseY < 380) {
        mySnd7.play(); //bottom left square sound
    }
    if (mouseX > 150 && mouseX < 250 && mouseY > 280 && mouseY < 380) {
        mySnd8.play(); //bottom middle square sound
    }
    if (mouseX > 280 && mouseX < 380 && mouseY > 280 && mouseY < 380) {
        mySnd9.play(); //bottom right square sound
    }
}

I was inspired by the launchpad that Beca uses in the movie Pitch Perfect and wanted to use this format to experiment with the combination of different sounds.
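The nine hitbox tests in the sketch above could also be collapsed into a little arithmetic: each 100-pixel square starts at a multiple of 130 plus a 20-pixel margin, so the clicked cell can be computed directly. A hedged sketch in plain JavaScript, using the same layout constants as the sketch (the `gridCell` helper is my own suggestion, not part of the original code):

```javascript
// Map a mouse position to a cell index (0..8) in the 3x3 launchpad grid,
// or -1 if the click falls on the white rims between squares.
// Squares are 100 px wide and start at x = 20, 150, 280 (a 130 px pitch).
function gridCell(mx, my) {
  const col = Math.floor((mx - 20) / 130);
  const row = Math.floor((my - 20) / 130);
  if (col < 0 || col > 2 || row < 0 || row > 2) return -1;
  // Reject clicks in the 30 px rim after each 100 px square.
  if ((mx - 20) % 130 > 100 || (my - 20) % 130 > 100) return -1;
  return row * 3 + col;
}

console.log(gridCell(25, 25));   // → 0 (top left square)
console.log(gridCell(300, 300)); // → 8 (bottom right square)
console.log(gridCell(130, 130)); // → -1 (on the rim)
```

With the nine sounds stored in an array, `mousePressed` would then reduce to one lookup and one `play()` call.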

Sarah Choi – Project 10 – Interactive Sonic Sketch

project-10

//Sarah Choi 
//Section D
//sychoi@andrew.cmu.edu
//Project-10-Interactive-Sonic-Sketch

//spiral
var angle = 0;
var bright = 255;
var n = 0;
var m = 0;
var x = 8 * n;
var y = 8 * m;

function preload() {
    // call loadImage() and loadSound() for all media files here
    sound1 = loadSound("birds.wav");
    sound2 = loadSound("thunder.wav");
    //sound3 = loadSound("springday.wav");
    sound4 = loadSound("lightshower.wav");
    sound1.setVolume(0.5);
    sound2.setVolume(0.5);
    //sound3.setVolume(0.5);
    sound4.setVolume(0.5);
}

function setup() {
    // you can change the next 2 lines:
    createCanvas(640, 480);
    createDiv("Interactive Sonic Sketch");

    //======== call the following to use sound =========
    useSound();

    
    rectMode(CENTER);
}

function soundSetup() { // setup for audio generation
    // you can replace any of this with your own audio code:
    osc = new p5.Oscillator();
    osc.freq(880);
    osc.amp(0.1);
    osc.start();
}

//--------------------------------------------------------------------

function draw() {
    background(0);
    noStroke();
    if (mouseIsPressed) {
        bright = 255;
    }
    background(bright);
    bright = bright - 5;

    fill(255, 0, 0);
    //use a local name that does not shadow the global counter m
    var mx = max(min(mouseX, 200), 20);
    var size = mx * 200.0 / 250.0;
    rect(10 + mx * 150.0 / 350.0, 400.0,
         size, size);
    fill(0, 0, 255);
    size = 200 + size;
    circle(150, 50 + mx * 150 / 250.0,
         size / 2);

    push();
    fill(0, 255, 0);
    ellipseMode(CORNER);
    //likewise avoid shadowing the global counter n
    var nx = max(min(mouseX, 300), 200);
    var size2 = nx * 200.0 / 400.0;
    ellipse(400, size2, size2 * 2, size2 * 4);
    pop();
    
    if (mouseIsPressed) {
        fill(255, 255, 0);
        noStroke();
        translate(mouseX, mouseY);
        rotate(radians(10));
        triangle(100, 0, 150, 150, 175, 150);
        quad(175, 150, 150, 150, 200, 200, 250, 200);
        triangle(200, 200, 250, 200, 150, 450);
        angle = angle + 5;

        push();
        translate(mouseX, mouseY);
        rotate(radians(10));
        triangle(300, 0, 275, 150, 250, 150);
        quad(275, 150, 250, 150, 300, 200, 350, 200);
        triangle(300, 200, 350, 200, 250, 450);
        pop();
        angle = angle + 5;

        push();
        translate(mouseX, mouseY);
        rotate(radians(10));
        triangle(25, 0, 0, 50, -25, 50);
        quad(0, 50, -25, 50, 45, 100, 75, 100);
        triangle(45, 100, 75, 100, 25, 250);
        pop();
        angle = angle + 5;

        if (!sound2.isPlaying()) { //avoid restarting the clips every frame
            sound2.play();
        }
        if (!sound4.isPlaying()) {
            sound4.play();
        }
    }

    if (x <= width) {
        n += 1;
    } else {
        m += 1;
    }
    x = 8 * n; //recompute the bezier position from the counters each frame
    y = 8 * m;
    bezier(x + 2, y + 2, x, y + 6, x + 2, y + 8, x + 4, y + 6);

    if (!sound1.isPlaying()) { //avoid restarting the ambient clip every frame
        sound1.play();
    }
    //sound3.play();
}

I chose these sounds because I thought they gave a good representation of Pittsburgh, where a nice spring day can suddenly turn into thunderstorms and a light rain shower.

Shariq M. Shah – Project 10


shariqs-project10

// Project - 10
// Name: Shariq M. Shah
// Andrew ID: shariqs
// Section: C


var travis;
var heavenly;
var kick;
var explosion;

//loading different sounds
function preload() {
    travis = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/travisscott.wav");
    travis.setVolume(0.2);

    heavenly = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/heavenly.wav")
    heavenly.setVolume(0.4);

    kick = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/kick.wav")
    kick.setVolume(0.2);

    explosion = loadSound("https://courses.ideate.cmu.edu/15-104/f2019/wp-content/uploads/2019/11/explosion.wav")
    explosion.setVolume(0.2);
}


function setup() {
   createCanvas(400,300);
   useSound();
}

function soundSetup() {
    osc = new p5.TriOsc();
    osc.freq(100.0);
    osc.amp(0.1);
    osc.start();
}

function draw() {

  background(0);

  for (var i = 0; i < 100; i += 1) {
      //defining a rotating series of lines that converge in patterns
      //using frameCount to have rotations and colors change over time
      strokeWeight(0.4);
      translate(width / 2, height / 2 + 100);
      rotate(radians(180 + 0.1 * frameCount));
      //various configurations and colors of lines that change according to stepping i variable
      //mouseY used to alter color and configurations depending on mouse location
      stroke(mouseY, 0, 250);
      line(i + i * width/10,  -height * 0.1 * mouseY, width/2, height);

      stroke(mouseY, 0, 250);
      line(i + i * -width/10,  -height * 0.1 * mouseY, width/2, height);
    }
      //amplitude and frequency change based on location of mouse x and y
      var freq = map(mouseX, 0, width, 40, 100);
      osc.freq(freq);

      var amp = map(mouseY, 0, height, 0.2, 0.05);
      osc.amp(amp);
}

//depending on where user presses mouse, a different brooding sound is played according to the relative amplitude and frequency at the location
function mousePressed() {

    if (mouseX > 10 && mouseY < height / 2) {
      travis.play(0, 1, 2);
    }

    if (mouseX > width / 2 && mouseY < 200) {
      heavenly.play(0, 1, 2);
    }

    if (mouseY > 50 && mouseY < 100) {
      kick.play(0, 1, 2);
    }

    if (mouseX > 300 && mouseY > 200) {
      explosion.play(0, 1, 2);
    }


}

In this project, I experimented with a variety of sounds and used the mouseX and mouseY location to change the auditory experience when the mouse is clicked. I did this using if statements that choose which sound to play based on where the mouse is clicked. This became more interesting once the form of the rotating lines responded to the fluctuating soundscape, both in color and in geometry. Overall, this was a great project for experimenting with creating different soundscapes in a program.

Shannon Ha – Looking Outwards – 10

Photo taken from https://itp.nyu.edu/classes/cc-s18/gesture-controlled-musical-gloves/

The mi.mu gloves are a wearable musical instrument for expressive creation, composition, and performance. They are the creation of music composer and songwriter Imogen Heap, who wrote the musical score for Harry Potter and the Cursed Child. Her aim in creating the gloves is to push innovation and share resources. She believes they will allow musicians and performers to better engage fans with more dynamic and visual performances, simplify the hardware that controls music (laptops, keyboards, controllers), and connect movement with sound.

Flex sensors embedded in the gloves measure the bend of the fingers, while an IMU measures the orientation of the wrist. All of the sensor data is communicated over WiFi, and a vibration motor provides haptic feedback. The gloves come with software that lets the user map movements to different sounds; movements can be coordinated with MIDI and OSC.
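The sensor-to-MIDI step can be pictured as scaling a raw bend reading into a 7-bit control-change value. A minimal sketch in plain JavaScript, assuming a sensor that reports 0–1023 (the real mi.mu calibration is configured per glove in its software, so the range here is an assumption):

```javascript
// Map a raw flex-sensor reading (assumed range 0..1023) to a 7-bit MIDI CC
// value (0..127), clamping out-of-range readings. The 0..1023 range is an
// assumption for illustration, not the actual mi.mu sensor specification.
function bendToMidiCC(raw, rawMin = 0, rawMax = 1023) {
  const clamped = Math.min(Math.max(raw, rawMin), rawMax);
  return Math.round(((clamped - rawMin) / (rawMax - rawMin)) * 127);
}

console.log(bendToMidiCC(0));    // → 0   (finger straight)
console.log(bendToMidiCC(1023)); // → 127 (finger fully bent)
console.log(bendToMidiCC(512));  // → 64
```

A value computed this way could then be sent as a MIDI CC message (or an OSC message) to whatever synth parameter the performer has mapped to that finger.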

I believe this piece of technology really pushes the boundaries of computational music as it allows musicians to have complete agency over electronically generated sounds through curated body movement without having to control sounds through a stationary piece of hardware. Performers, in particular, could benefit heavily from these gloves as their artistry moves beyond music and into how body language is incorporated with sound.  As a designer, I personally admire how she was able to use advanced technology to create these novel experiences not only for the performer but also for the audience. There are instances where the use of technology can hinder artistry especially when it is overused, but I think these gloves allow musicians to connect more with the music and how it’s being presented to the audience.