LookingOutwards-10

The Stanford Mobile Phone Orchestra (MoPhO), founded in 2007 and directed by Ge Wang, is an innovative ensemble that explores social music-making using mobile devices.

Images of MoPhO

This project takes advantage of existing technology and turns everyday phones into personal musical instruments. People on the internet can also add their own sounds to existing music.

I admire how this project affords new possibilities for musical expression. In the past, only those who received special training could play certain instruments, but with MoPhO anyone interested in music can easily express themselves with a device they already own. For example, one of the instruments it can mimic is an ancient flute: you simply blow into the microphone, the phone senses the strength of your breath, and a tone is synthesized according to where your fingers press on the screen interface. This changes both the way humans think about making music and the approach they use to make it.
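That breath-plus-fingers interaction can be sketched in p5.js with the p5.sound library. The example below is only my own rough illustration of the mapping described above, not MoPhO's actual instrument code: the microphone level stands in for breath strength, and the horizontal position of a press selects the pitch.

// A minimal sketch of the "blow + finger position" idea (assumes p5.js + p5.sound).
var mic;   // reads breath strength from the microphone
var osc;   // the synthesized flute-like tone

function setup() {
    createCanvas(300, 300);
    mic = new p5.AudioIn();
    osc = new p5.Oscillator('sine');
}

function mousePressed() {
    userStartAudio();   // browsers require a user gesture before audio can start
    mic.start();
    osc.start();
}

function draw() {
    background(220);
    // harder blowing -> louder tone
    var breath = mic.getLevel();                  // 0..1
    osc.amp(constrain(breath * 4, 0, 1), 0.05);
    // finger (mouse) position selects a pitch, like covering holes on a flute
    var note = floor(map(mouseX, 0, width, 60, 72));   // MIDI C4..C5
    osc.freq(midiToFreq(note), 0.05);
}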

Looking Outwards 10

Since my LO-4 assignment focused on a form of sound art, I chose the option to investigate a piece of music you could hear in a concert hall, which in this case is where the performance actually took place. I was drawn to this project because it wasn’t just auditory but also a beautiful visual experience. The Stanford Laptop Orchestra used SLOrk musical lanterns that communicate with ChucK over WiFi to translate the movements of the lanterns into sound and light.

The performance was called “Aura” because multiple performers, each with a different light color and sound, harmonized with one another in much the same way that people do in society. I thought it was a very clever and interesting way to create music through movement.
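The real lanterns are programmed in ChucK and send their sensor data over WiFi, but the basic motion-to-sound-and-light mapping can be imitated in a few lines of p5.js. The sketch below is a hypothetical stand-in of my own, not SLOrk code: device tilt sets an oscillator's pitch and the canvas color.

// A rough, hypothetical sketch of the motion-to-sound-and-light idea (p5.js + p5.sound).
var osc;

function setup() {
    createCanvas(300, 300);
    osc = new p5.Oscillator('triangle');
}

function mousePressed() {
    userStartAudio();     // audio may only start after a user gesture
    osc.start();
    osc.amp(0.3, 0.2);
}

function draw() {
    // rotationX and rotationY come from the phone's orientation sensors
    var tilt = constrain(rotationX, -90, 90);
    osc.freq(map(tilt, -90, 90, 200, 800), 0.1);           // tilt -> pitch
    background(map(rotationY, -90, 90, 0, 255), 80, 160);  // tilt -> "lantern" color
}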

LO-10 Computer Sound

Star Wars Blaster Sound Effect

Ben Burtt, who did the sound design for the original Star Wars movies, also created the classic blaster sound effect. Made with a combination of analog and digital techniques, it has been remade and remixed countless times over the years as the franchise is adapted and updated.

As demonstrated in the video, the sound was originally recorded by striking steel cable under tension (or, in this case, a Slinky) with a wrench. The classic PEW PEW was then saved, isolated from background noise, layered with other sounds, and edited into the movie. In the same way that Star Wars has defined what a good sci-fantasy movie SHOULD be, its iconic blaster sound has cemented in our cultural consciousness what a laser gun SHOULD sound like. That iconic sound would not be possible without a combination of analog and digital processes.
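For fun, here is a purely digital toy version of a "pew" in p5.js with p5.sound. It is only my own sketch of a fast downward pitch sweep shaped by a short amplitude envelope, nothing like Burtt's actual analog recording process.

// A toy laser "pew": a quick pitch drop plus a fast volume decay (p5.js + p5.sound).
var osc;
var env;

function setup() {
    createCanvas(200, 200);
    osc = new p5.Oscillator('sawtooth');
    env = new p5.Envelope();
    env.setADSR(0.001, 0.12, 0.0, 0.05);   // instant attack, quick decay to silence
    env.setRange(0.6, 0.0);
}

function mousePressed() {                   // click = fire
    userStartAudio();
    osc.start();
    osc.amp(0);                             // stay silent until the envelope fires
    osc.freq(1800);                         // start high...
    osc.freq(300, 0.15);                    // ...and sweep down over 150 ms
    env.play(osc);                          // shape the volume like a short zap
}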

-Robert

Project-10 Sound Story

This is a story about a duck, a duckling, a cloud, and a lightning bolt.

sketch
//Robert Rice
//rdrice
//Section C


// sketch.js template for sound and DOM
//
// This is the 15104 Version 1 template for sound and Dom.
// This template prompts the user to click on the web page
// when it is first loaded.
// The function useSound() must be called in setup() if you
// use sound functions.
// The function soundSetup() is called when it is safe
// to call sound functions, so put sound initialization there.
// (But loadSound() should still be called in preload().)

var mama = {filename:'https://i.imgur.com/z44s88k.png', //https://images.dailykos.com/images/214263/story_image/Duck-37.png?1456291242
            x:0,
            y:0,
            playFunc: playOsc,
            stopFunc: stopOsc,
            drawFunc: drawImg}
var duckling = {filename:'https://i.imgur.com/X5iYcio.png', //https://purepng.com/public/uploads/large/91508076238ploll99zx4ifi35p6b1qrontiecfaivclrqbiz0gfg0rru6qtj7qmlw2qmvrthjbk3sj2wgiwa12pz4n00nufufllybyth2akpcx.png
            x:0,
            y:0,
            playFunc: playOsc,
            stopFunc: stopOsc,
            drawFunc: drawImg}
var cloud = {filename:'https://i.imgur.com/igVfind.png', //https://clipground.com/images/clipart-cloud-png-10.png
            x:-50,
            y:100,
            playFunc: playOsc,
            stopFunc: stopOsc,
            drawFunc: drawImg}
var lightning = {filename:'https://i.imgur.com/9RODxMu.png', //https://asr4u.files.wordpress.com/2013/06/lightning-bolt-hi1.png
            x:150,
            y:150,
            playFunc: playOsc,
            stopFunc: stopOsc,
            drawFunc: drawImg}
var tScale = 1; //used later for scaling stuff down. 1 == 100%


function preload() {
    // call loadImage() and loadSound() for all media files here

    mama.image = loadImage(mama.filename);
    duckling.image = loadImage(duckling.filename);
    cloud.image = loadImage(cloud.filename);
    lightning.image = loadImage(lightning.filename);
    //loadSound();
}


function setup() {
    // you can change the next 2 lines:
    createCanvas(300, 300);
    createDiv("p5.dom.js library is loaded.");
    frameRate(30);
    imageMode(CENTER);
    //======== call the following to use sound =========
    useSound();
}


function soundSetup() { // setup for audio generation
    // you can replace any of this with your own audio code:
    mama.osc = new p5.Oscillator();
    mama.trem = new p5.Oscillator();    //mama duck's voice
    mama.trem.freq(10);
    mama.osc.setType('sawtooth');
    mama.osc.freq(midiToFreq(60));
    mama.osc.amp(mama.trem);

    duckling.osc = new p5.Oscillator();
    duckling.trem = new p5.Oscillator();    //baby duck's voice
    duckling.trem.freq(30);
    duckling.osc.setType('sawtooth');
    duckling.osc.freq(midiToFreq(70));
    duckling.osc.amp(duckling.trem);    //use the duckling's own tremolo, not mama's

    cloud.osc = new p5.Oscillator();
    cloud.trem = new p5.Oscillator();   //makes cloud go brrrrrrr
    cloud.trem.freq(10);
    cloud.osc.setType('sawtooth');
    cloud.osc.freq(midiToFreq(31));
    cloud.osc.amp(cloud.trem);

    lightning.osc = new p5.Oscillator();    //lightning sound
    lightning.trem = new p5.Oscillator();  //makes it go pew pew
    lightning.trem.freq(10000);
    lightning.osc.setType('square');
    lightning.osc.amp(lightning.trem);
    lightning.osc.freq(midiToFreq(90));
}


function draw() {
    // you can replace any of this with your own code:
    background(200);    
    
    if (frameCount >= 0 & frameCount <= 150) {     //act I the status quo
        mama.x = 50;
        mama.y = 250;
        mama.drawFunc(100, 100);

        duckling.x = 100;
        duckling.y = 275;
        duckling.drawFunc(40, 50);

        if (frameCount == 30) {mama.playFunc();
            mama.drawFunc(200, 200);}
        if (frameCount == 50) {mama.stopFunc();}

        if (frameCount == 60) {duckling.playFunc();
            duckling.drawFunc(80, 100);}
        if (frameCount == 70) {duckling.stopFunc();}

        if (frameCount == 90) {mama.playFunc();
            mama.drawFunc(200, 200);}
        if (frameCount == 150) {mama.stopFunc();}

        if (frameCount == 120) {duckling.playFunc();}
        if (frameCount > 120 & frameCount < 150) {duckling.drawFunc(200, 200);}
        if (frameCount == 150) {duckling.stopFunc();}
    }

    if (frameCount >= 150 & frameCount <= 300) {   //act II a cloud arrives
        mama.drawFunc(100, 100);
        duckling.drawFunc(40, 50);

        var cDX = 2; //the speed at which the cloud will move across the screen


        cloud.drawFunc(100, 50);

        cloud.x += cDX;
        if (cloud.x > 150) {cloud.x = 150;} //will move across the screen, before settling in the middle

        if (frameCount == 250) {cloud.playFunc();}
        if (frameCount > 250 & frameCount < 300) {cloud.drawFunc(300, 150);}
        if (frameCount == 300) {cloud.stopFunc();}
    }

    if (frameCount >= 300 & frameCount <= 450) {   //act III the cloud brings forth lightning
        mama.drawFunc(100, 100);
        duckling.drawFunc(40, 50);
        cloud.drawFunc(300, 150);

        if (frameCount == 325) {lightning.playFunc(); lightning.drawFunc(100, 100);}
        if (frameCount == 330) {lightning.stopFunc(); lightning.drawFunc(50, 50);}

        if (frameCount == 355) {lightning.playFunc(); lightning.drawFunc(100, 100);}
        if (frameCount == 360) {lightning.stopFunc(); lightning.drawFunc(50, 50);}

        if (frameCount == 385) {lightning.playFunc(); lightning.drawFunc(100, 100);}
        if (frameCount == 390) {lightning.stopFunc(); lightning.drawFunc(50, 50);}

        if (frameCount == 415) {lightning.playFunc(); lightning.drawFunc(100, 100);}
        if (frameCount == 420) {lightning.stopFunc(); lightning.drawFunc(50, 50);}
    }

    if (frameCount >= 450 & frameCount <= 600) {   //act IV mama duck defends her child
        mama.drawFunc(100, 100);
        duckling.drawFunc(40, 50);
        cloud.drawFunc(200, 100);

        if (frameCount == 510) {mama.playFunc();}
        if (frameCount > 510 & frameCount < 600){
            mama.drawFunc(300, 300);
            mama.x += random(-10, 10);
            mama.y += random(-10, 10);
        }
        if (frameCount == 600) {mama.stopFunc(); mama.x = 50; mama.y = 250;}
    }

    if (frameCount >= 600 & frameCount <= 750) {   //act V the attackers rejected
        if (frameCount == 600) {
            lightning.x = 250;
        }

        mama.drawFunc(100, 100);
        duckling.drawFunc(40, 50);

        push();
        scale(tScale, tScale);
        cloud.drawFunc(200, 100);
        lightning.drawFunc(75, 75);
        pop();

        tScale = tScale * 0.95;
    }

    if (frameCount >= 750 & frameCount <= 900) {   //act VI return to status quo
        mama.x = 50;
        mama.y = 250;
        mama.drawFunc(100, 100);

        duckling.x = 100;
        duckling.y = 275;
        duckling.drawFunc(40, 50);

        if (frameCount == 780) {mama.playFunc();
            mama.drawFunc(200, 200);}
        if (frameCount == 800) {mama.stopFunc();}

        if (frameCount == 810) {duckling.playFunc();
            duckling.drawFunc(80, 100);}
        if (frameCount == 820) {duckling.stopFunc();}

        if (frameCount == 840) {mama.playFunc();
            mama.drawFunc(200, 200);}
        if (frameCount == 900) {mama.stopFunc();}

        if (frameCount == 870) {duckling.playFunc();}
        if (frameCount > 870 & frameCount < 900) {duckling.drawFunc(200, 200);}
        if (frameCount == 900) {duckling.stopFunc();}
    }
}

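// Note: playOsc, stopOsc, and drawImg are stored on each character object as
// playFunc, stopFunc, and drawFunc, so when they are called like mama.playFunc(),
// "this" inside them refers to that particular character.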
function playOsc() {
    this.trem.start();
    this.osc.start();//plays the sound
}

function stopOsc() {
    this.osc.stop();
    this.trem.stop();//stops the sound
}

function drawImg(w, h) {    //draws the picture at the specified scale
    image(this.image, this.x, this.y, w, h);
}

Looking Outwards 10: Computer Music

Charli XCX in performance

Charli XCX, also known as Charlotte Emma Aitchison, is a professional singer, songwriter, music video director, and record producer. She was born in Cambridge, and her music draws on gothic pop, synth-pop, dance-pop, electropop, pop-punk, and alternative pop. During her early career, her music mixed dark and witch-house styles. Most of her songs contain a technical or computational aspect, and her work remains very consistent. I admire how consistent Charli XCX has been with her musical styles. Her work presents a clear idea of how passionate and interested she is in computational music. Even in the music industry, technology plays a very powerful role. I am curious about what is to come as technology continues to advance, and how that advancement will shape its role in many fields.

Looking Outwards-10

The project I am discussing this week is called “Weather Thingy–Real Time Climate Sound Controller.” Weather Thingy, created by Adrien Kaeser, is a sound controller that uses real-time climate events to control the settings of musical instruments. The device has two main parts: a weather station mounted on a tripod microphone stand and a controller connected to that station. The machine has three climate sensors, which include a rain gauge, a wind vane, and an anemometer, and its interface displays the data coming from the different sensors. This project is super interesting to me because it takes a real-life phenomenon, the weather, and translates it into sound. I like how the music changes based on the data the device collects, and it was really nice to see how the creator took a musical approach to this topic.
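To get a feel for that kind of sensor-to-sound mapping, here is a small p5.js sketch of my own. It is only a hypothetical stand-in, not Kaeser's device (which is custom hardware sending MIDI to real instruments): the "wind" and "rain" readings are simulated with noise() and simply steer an oscillator's pitch and volume.

// A rough, hypothetical "weather controls the sound" sketch (p5.js + p5.sound).
var osc;

function setup() {
    createCanvas(300, 200);
    osc = new p5.Oscillator('triangle');
}

function mousePressed() {
    userStartAudio();
    osc.start();
}

function draw() {
    background(230);
    fill(0);
    // pretend sensor readings (0..1), smoothed over time with noise()
    var wind = noise(frameCount * 0.01);          // stand-in for an anemometer
    var rain = noise(1000 + frameCount * 0.02);   // stand-in for a rain gauge
    osc.freq(map(wind, 0, 1, 200, 600), 0.1);     // windier -> higher pitch
    osc.amp(map(rain, 0, 1, 0.05, 0.4), 0.1);     // rainier -> louder
    text('wind: ' + nf(wind, 1, 2) + '   rain: ' + nf(rain, 1, 2), 10, 20);
}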

Project 10 Sonic Story

Story: it rains (rain sound plays), the sprout grows (grow sound plays), and the flower blooms (bloom sound plays). Then the cloud clears up and a bird goes by (bird sound plays). At the end, the sun gets bigger (ending sound plays).

sketch

//Jae Son
//Section C
//story: it rains, and the sprout grows and bloom the flower. 
// Then, the cloud clears up and the bird goes by. 
// At the end, the sun gets bigger

var rain;
var grow;
var bloom;
var bird;
var sun;
var sprout;
var flower;
var birdimg;
var sunimg;

function preload() {
  //sounds
  rain = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/rain.wav");
  grow = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/grow.wav");
  bloom = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/bloom.mp3");
  bird = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/bird.wav");
  sun = loadSound("https://courses.ideate.cmu.edu/15-104/f2020/wp-content/uploads/2020/11/sun.wav");
  //images
  sprout = loadImage("https://i.imgur.com/jhYJcR1.png");
  flower = loadImage("https://i.imgur.com/o6nzV38.png");
  birdimg = loadImage("https://i.imgur.com/amZXis1.png");
  sunimg = loadImage("https://i.imgur.com/2W1rzB0.png");
}

function setup() {
  createCanvas(600, 400);
  useSound();
  frameRate(20); 
}

function soundSetup() { // setup for audio generation
  rain.setVolume(0.6);
}

function draw() {
  //blue background
  background(189,209,255); 
  noStroke();
  imageMode(CENTER);
  
  //animation
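  // frameCount acts as the story clock: each range of frame counts below is one scene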
    if (frameCount < 5) {
        cloud(200,100);
        image(sprout,width/2,height-20,67,104);
    } else if (frameCount >= 5 & frameCount <10) {
        raindrop(250,150,0);
        cloud(200,100);
        image(sprout,width/2,height-20,67,104);
    } else if (frameCount >= 10 & frameCount <15) {
        raindrop(250,150,100);
        cloud(200,100);
        image(sprout,width/2,height-20,67,104);
    } else if (frameCount >= 15 & frameCount <50) {
        raindrop(250,150+frameCount*3,255);
        cloud(200,100);
        image(sprout,width/2,height-20,67,104);
    } else if (frameCount >= 50 & frameCount < 100) {
        cloud(200,100);
        image(sprout,width/2,height-50-frameCount/3,67,104);
    } else if (frameCount >= 100 & frameCount < 110){
        cloud(200,100);
        image(flower,width/2,height-80,67,104);
    } else if (frameCount >= 110 & frameCount < 270){
        image(sunimg,width/2,100,90,90);
        cloud(400-frameCount*2,100);
        image(flower,width/2,height-80,67,104);
        image(birdimg,-200+frameCount*3,200,84,57);
    } else if (frameCount >=270 & frameCount <300){
        image(sunimg,width/2,100,70+frameCount/5,70+frameCount/5)
        image(flower,width/2,height-80,67,104);
    } else {
        image(sunimg,width/2,100,130,130);
        image(flower,width/2,height-80,67,104);
    }
    
  //brown ground
    fill(165,85,85);
    rect(0,height-40,600,40);
    
  //sound play
    if (frameCount == 2) {
      rain.play();
    } else if (frameCount == 50) {
      grow.play();
    } else if (frameCount == 102) {
      bloom.play();
    } else if (frameCount == 110) {
      bird.play();
    } else if (frameCount == 250) {
      bird.stop();
    } else if (frameCount == 270) {
      sun.play();
    }
    
}

function cloud(x,y) { //cloud shape draw
  push();
  translate(x,y);
  noStroke();
  fill(235,242,255);
  ellipse(0,0,100);
  ellipse(87,0,115);
  ellipse(160,0,95);
  pop();
}

function raindrop(x,y,t) { //rain drops shape draw
  push();
  rectMode(CENTER);
  translate(x,y);
  noStroke();
  fill(100,178,255,t);
  rect(0,10,10,50);
  rect(40,0,10,50);
  rect(80,15,10,50);
  pop();
}


Looking Outwards 10: Computer Music

Jae Son, Section C

For this LO-10, I looked at Laetitia Sonami’s Magnetic Memories. She created a new instrument, the “Spring Spyre,” using Rebecca Fiebrink’s neural networks. According to the brochure for the Stanford CCRMA Stage, where she performed, “the audio signals from three pickups attached to springs are fed to the neural networks, which are trained to control the live audio synthesis in MAXMSP.” So, with the performer’s real-time gestures, a somewhat chaotic set of sounds is produced. I admire how a random, real-time physical performance produces sound that seems chaotic yet stays within the programmed pattern. I like how installation, performance art, and computer art come together in this piece.

LO – 10

I came across the work of Robert Henke and found his installation, music, and software development work interesting, especially with more knowledge of computer-generated sound from this week’s lectures. One track I wanted to highlight is “Gobi: The Long Edit,” released this spring, a remastered edit of a previous track. What I find very impressive about this track is how organic it feels and how complex the layering of frequencies is. It’s clear that Henke used a variety of modulators to adjust the frequency and amplitude of the waveforms. The result is an experience comparable to the sounds of creatures, vibrations, and the ambience of a rainforest. Creating something that feels organic and “real” with synthetic, digital means must be a difficult process. It makes me wonder how authentic soundtracks in movies and TV are, and to what extent they are manipulated to resemble real life. Even though I find this track comparable to nature, there is no doubt that Henke incorporates his personal artistic vision into this sound art. This is especially apparent in other tracks like the ones on his album “Archaeopteryx.” Those pieces feel more distinctly “electronic,” with clear examples of techniques like reverb, delay, and different-shaped waves.
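That idea of modulators shaping frequency and amplitude is the same trick the sonic-story code above uses in miniature (one oscillator feeding another's amp()). Below is a minimal p5.js illustration of my own showing both kinds of modulation; it only demonstrates the general technique and has nothing to do with Henke's actual studio setup.

// One oscillator modulating another's frequency (vibrato) and amplitude (tremolo).
// Assumes p5.js + p5.sound; an illustration of the technique, not Henke's tools.
var carrier;   // the tone you actually hear
var fmOsc;     // slow oscillator wired into the carrier's frequency
var amOsc;     // slow oscillator wired into the carrier's amplitude
var started = false;

function setup() {
    createCanvas(200, 200);
    carrier = new p5.Oscillator('sine');
    carrier.freq(220);

    fmOsc = new p5.Oscillator('sine');
    fmOsc.freq(6);        // 6 Hz wobble
    fmOsc.amp(20);        // about +/- 20 Hz of pitch deviation
    fmOsc.disconnect();   // keep the modulator out of the speakers

    amOsc = new p5.Oscillator('sine');
    amOsc.freq(2);        // slow 2 Hz swell
    amOsc.disconnect();
}

function mousePressed() {
    if (started) return;    // wire everything up only once
    started = true;
    userStartAudio();
    fmOsc.start();
    amOsc.start();
    carrier.start();
    carrier.freq(fmOsc);    // frequency modulation: 220 Hz plus the fm signal
    carrier.amp(amOsc);     // amplitude modulation: volume follows the am signal
}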

Project- 09- Portrait

sketch

//Shruti Prasanth
//Project 9- Portrait
//Section C

function preload() {
    var portrait="https://i.imgur.com/CDcRoOA.jpeg"; // image by Aliena85
    photo=loadImage(portrait);    //image variable
}

function setup() {
  createCanvas(300,400);
  background(0);
  photo.resize(300,400);    //resizing the image to fit the canvas
  photo.loadPixels();
  frameRate(1000);    // ask for as many frames per second as possible (the browser caps this at the display refresh rate)
}

function draw() {
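  // each call to draw() paints one tiny square sampled from a random pixel of the photo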
  var x=random(width);
  var y=random(height);
  var pixelx=constrain(floor(x),0,width);
  var pixely=constrain(floor(y),0,height);
  var pixelcolor=photo.get(pixelx,pixely);    //calling each pixel color
  noStroke();
  fill(pixelcolor);    //colors of the squares that appear

  square(x,y,random(0,5));    // squares that form the image


}

My inspiration for this portrait was a painting of a girl submitted by Aliena85 on imgur.com. I thought the strokes that created the background and the sunlight filtering through the blinds would be interesting if they appeared in a stippled, colored-pixel form. It feels magical as the image starts to form, because the specks look like colored dust and give the portrait of the girl a glowy quality.