Crystal-Xue-Project-09

sketch-231.js

//Crystal Xue
//15104-section B
//luyaox@andrew.cmu.edu
//Project-09

var underlyingImage;
var xarray = [];
var yarray = [];

function preload() {
    var myImageURL = "https://i.imgur.com/Z0zPb5S.jpg?2";
    underlyingImage = loadImage(myImageURL);
}

function setup() {
    createCanvas(500, 500);
    background(0);
    underlyingImage.loadPixels();
    frameRate(20);
}

function draw() {
    var px = random(width);
    var py = random(height);
    var ix = constrain(floor(px), 0, width-1);
    var iy = constrain(floor(py), 0, height-1);
    var theColorAtLocationXY = underlyingImage.get(ix, iy);

    stroke(theColorAtLocationXY);
    strokeWeight(random(1,5));
    var size1 = random(5,15);
    //brush strokes in a diagonal direction from bottom left to top right
    line(px, py, px - size1, py + size1);

    var theColorAtTheMouse = underlyingImage.get(mouseX, mouseY);
    var size2 = random(1,8);
    for (var i = 0; i < xarray.length; i++) {
        stroke(theColorAtTheMouse);
        strokeWeight(random(1,5));
        //an array of brush strokes in a diagonal direction from top left to
        //bottom right, controlled by the mouse
        line(xarray[i], yarray[i], xarray[i] - size2, yarray[i] - size2);
        size2 = size2 + 1;
        if (i > 10) {
            xarray.shift();
            yarray.shift();
        }
    }
}

function mouseMoved(){
    xarray.push(mouseX);
    yarray.push(mouseY);
}

Screenshots: phase 1, phase 2, phase 3, and the original picture.

This is a weaving portrait of my friend Fallon. The color pixels are concentrated at the crossings between strokes in the two directions.

Siwei Xie – Looking Outwards – 10

Microscale is a generative, web-based album. I admire it because, although the creator has written generative/algorithmic music before and almost all of his previous work contains procedurally generated material, microscale is his first fully generative album, conceived from the start as a "generative" idea. The creator's artistic sensibilities come through because the album was created not so much by thinking as by emotion, so it is not purely artificial intelligence or computer music.

The music on microscale is generated in real-time from random Wikipedia articles. Each article becomes a step sequencer, where the letters are the sequencer steps and the track titles are regular expressions that switch the steps of the sequencers on and off.
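Tsurko's sequencer code isn't published here, but the mapping described, letters as sequencer steps and a track-title regular expression switching steps on or off, can be sketched roughly as follows. The article text, the regex, and the note trigger below are assumptions for illustration, not the artist's actual implementation.

// Hypothetical sketch: an article's letters become sequencer steps, and a
// track-title regular expression decides which steps are switched on.
var articleText = "generative music from a random wikipedia article"; // stand-in for a fetched article
var trackTitle = /[aeiou]/;                                           // stand-in for a track-title regex

// A step is "on" when its letter matches the regex.
var steps = articleText.split("").map(function(ch) {
    return trackTitle.test(ch);
});

// Advance one step per call, e.g. from draw() or a timer.
var currentStep = 0;
function tick() {
    if (steps[currentStep]) {
        console.log("trigger note at step", currentStep); // replace with an oscillator/synth call
    }
    currentStep = (currentStep + 1) % steps.length;       // loop like a step sequencer
}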

The concept of the album is to show that by transforming one medium (text) into another (music), the meaning can be transformed: the article has its own meaning, but the music has a completely different one. And it is not just a one-to-one transformation; there are six articles (six meanings) which, although unrelated to each other, create a whole piece of music with one singular meaning.

Ales Tsurko, Microscale, 2017

Link to original source.

Emma NM-LO-10

Sonic Playground in Atlanta

Sonic Playground (2018) – Yuri Suzuki Design

Sonic Playground was an outdoor sound installation in Atlanta, Georgia featuring colorful sculptures that modify and transmit sound in an unusual but playful way. I admire how the installation engages the community in an art experience and gives people the opportunity to explore how sound is constructed, altered, and experienced. I like that it is for all people, regardless of age; anyone can enjoy it. The installation itself is not computational, but the designers used Rhinoceros 3D to create a raytracing tool that allows the user to choose certain aspects of the sound's path. Users could "select a sound source and send sound in a certain direction or towards a certain geometry, in this case the shape of the acoustic mirrors or the bells at the start and end of the pipes to see how the sound is reflected and what is the interaction with the object."
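The designers haven't released that tool's code, but the core step of any such raytracer is the mirror-reflection rule r = d - 2(d.n)n. A minimal 2D sketch of that step, with made-up ray and surface values, might look like this:

// Hypothetical sketch of one reflection step in a 2D sound raytracer:
// bounce an incoming direction d off a surface with unit normal n.
function reflect(d, n) {
    var dot = d.x * n.x + d.y * n.y;       // d . n
    return { x: d.x - 2 * dot * n.x,       // r = d - 2(d.n)n
             y: d.y - 2 * dot * n.y };
}

// Example: a sound ray traveling down-right hits a horizontal acoustic mirror.
var incoming = { x: 0.707, y: 0.707 };     // direction of the incoming ray
var normal = { x: 0, y: -1 };              // unit normal of the mirror surface
console.log(reflect(incoming, normal));    // -> { x: 0.707, y: -0.707 }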

The artist's creativity comes through in the paths and shapes he chose for the final sculptures, which in turn shape the sound that comes out. He decided which sounds were most interesting and the path sound travels to produce them.

Sonic Playground Installation
Raytracing using Rhinoceros 3D

Nadia Susanto – Looking Outwards – 10

The Computer Orchestra is an interactive orchestra made up of multiple computers. It was created by fragment.in, and the goal is to let users conduct their own orchestra and music. The conductor's hand movements are recognized by an Xbox Kinect motion controller connected to a central computer, which sends instructions to the many musician screens. The screen-musicians then send sound back to the conductor and produce visual feedback.
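fragment.in hasn't published the system's code, but the architecture described, a central computer turning the conductor's gestures into instructions and broadcasting them to the musician screens, could be sketched roughly like this. The message fields and connection setup are assumptions for illustration only.

// Hypothetical sketch of the central computer's role: turn a conductor
// gesture into an instruction and broadcast it to every musician screen.
var musicians = [];                        // open connections, one per musician screen

function onGesture(hand) {                 // hand position reported by the Kinect tracker
    var instruction = {
        section: hand.x < 0 ? "strings" : "vocals", // left vs. right hand position picks a section
        volume: hand.y                              // hand height sets the level
    };
    musicians.forEach(function(connection) {
        connection.send(JSON.stringify(instruction)); // each screen plays its uploaded sound accordingly
    });
}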

What I love most about the Computer Orchestra is that it crowdsources sounds: people can upload them, and a musician can then access and play them. It is incredible that one person can control the music through simple hand motions and gestures. The simple interface of the centralized computer also makes it extremely easy for the conductor to choose where he wants vocals, violin, and so on.

Video demonstrating a user setting up the Computer Orchestra and then conducting it with his hand motions.
Another demonstration of a conductor using the Computer Orchestra for a TED Talk

To learn more about the Computer Orchestra, click the link below:

https://www.fragment.in/project/computer-orchestra/

Chelsea Fan-Looking Outward 10

Kraftwerk, an electronic band, gave an electronic music performance of "The Robots" in 2009. Kraftwerk was established by classical musicians who wanted to mix sound, feedback, and rhythm to create music.

The video depicts electronic music with robots on stage moving in set patterns to the music. I admire that it has a "concert" feel despite not having a singer; the performance includes music, lights, a stage, and people. I do wish, though, that the robots moved to the beat of the music, or at least at a faster pace: their slow movements don't match the upbeat, fast-paced music.

I don't know anything about the algorithms behind how the work was generated. I also don't want to suppose anything, because I really have no clue, and it would be wrong to generalize and guess with no knowledge.

Kraftwerk, The Robots Electronic Music Performance (2009). Originally from 1978. From the album “The Man Machine” by Ralf Hütter, Florian Schneider, and Karl Bartos.

Sewon Park – LO – 10

"Weather Thingy" playing in the outside studio

A computational music project that I found inspiring was the "Weather Thingy" by Filip Visnjic. The project consists of two main parts: a weather station and a controller. The station gauges wind and rain levels with its sensors, and the controller's receptors translate that weather data into audio effects, interpreted through built-in instruments. The controller also has screens where the artist can amplify or constrain sounds.
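The post doesn't include the controller's code, but the translation step it describes, sensor readings becoming audio-effect parameters, can be sketched like this. The sensor ranges and the particular effects below are assumptions for illustration.

// Hypothetical sketch of the controller's mapping step: turn raw weather
// readings into audio-effect parameters (ranges and effects are assumed).
function mapRange(v, inMin, inMax, outMin, outMax) {
    var t = (v - inMin) / (inMax - inMin);            // position within the sensor range
    t = Math.min(Math.max(t, 0), 1);                  // clamp to 0..1
    return outMin + t * (outMax - outMin);
}

function weatherToEffects(windSpeed, rainLevel) {
    return {
        // stronger wind -> faster arpeggio, expressed as a MIDI CC value (0-127)
        arpeggioRate: Math.round(mapRange(windSpeed, 0, 30, 0, 127)),
        // heavier rain -> more reverb mixed into the built-in instrument
        reverbMix: mapRange(rainLevel, 0, 10, 0, 1)
    };
}

console.log(weatherToEffects(12, 3)); // -> { arpeggioRate: 51, reverbMix: 0.3 }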

This project is inspiring in that it uses phenomena from nature to create music. Ironically, Filip uses computer software to interpret natural conditions such as rain, wind, and thunder. The project is incredible in that it gives musical artists various novel sound effects to work with. Filip also gave the machine the ability to save certain sounds, to give musicians inspiration later.

The "Weather Thingy" was built with C++, Arduino, and the MIDI protocol.

Minjae Jeong-project09

sketch

//Minjae Jeong
//Section B
//minjaej@andrew.cmu.edu
//Project-9


var underlyingImage;

function preload() {
    var myImageURL = "https://i.imgur.com/4WraCGa.jpg";
    underlyingImage = loadImage(myImageURL);
    underlyingImage.resize(480,480);
}

function setup() {
    createCanvas(480, 480);
    background(0);
    underlyingImage.loadPixels();
    frameRate(10);
}

function draw() {
    var px = random(width);
    var py = random(height);
    var ix = constrain(floor(px), 0, width-1);
    var iy = constrain(floor(py), 0, height-1);
    var theColorAtLocationXY = underlyingImage.get(ix, iy);

    noStroke();
    fill(theColorAtLocationXY);
    var size = random(10, 30); //random size from 10 to 30
    ellipse(px, py, size, size); //draw circles!

    var theColorAtTheMouse = underlyingImage.get(mouseX, mouseY);
    stroke(theColorAtTheMouse);
    textSize(random(10, 30)); //set a random text size between 10 and 30
    text("SQUARE", mouseX, mouseY); //draw "squares"
}

For this project, I used a photo of my roommate. One issue I could not solve was adjusting the size of the picture to the canvas size; the original photo is 3024 x 4032. Because of this, I do not think the canvas is showing the whole image properly, which is why there are so many browns on the canvas.

Fallon Creech-Project-09-Portrait

sketch

//Fallon Creech
//Section B
//fcreech@andrew.cmu.edu
//Project-09

var underlyingImage;

function preload() {
    var myImageURL = "https://i.imgur.com/nmxNHkl.jpg?1";
    underlyingImage = loadImage(myImageURL);
}

function setup() {
    createCanvas(500, 500);
    background(0);
    underlyingImage.loadPixels();
    frameRate(10);
}

function draw() {
    var px = random(width);
    var py = random(height);
    var ix = constrain(floor(px), 0, width-1);
    var iy = constrain(floor(py), 0, height-1);
    var theColorAtLocationXY = underlyingImage.get(ix, iy);

    noStroke();
    fill(theColorAtLocationXY);
    textStyle(NORMAL);
    textSize(random(15, 25));
    text("sketch", px, py);

    var theColorAtTheMouse = underlyingImage.get(mouseX, mouseY);
    stroke(theColorAtTheMouse);
    textStyle(BOLDITALIC);
    textSize(random(25, 35));
    text("doodle", pmouseX, pmouseY);
}

For this project, I used a picture of a friend sketching in her sketchbook. I decided to differentiate the animation's interactions by using different text styles and words: the animation randomly generates the word "sketch," while the word "doodle" appears at the mouse's location, giving the animation some visual complexity.

Screenshots: after 30 seconds, after 2 minutes, and the original image.

Project-09-Portrait

After looking through the examples and code offered on the website, I decided I wanted to experiment with a number of designs. I did this by dividing up my screen and designating a type of design or pattern for each section. I was originally going to divide it equally into three parts, but after some experimenting I decided to make it more interesting than that. I first decided to bring attention to my eyes by rendering the section over them in HSB color. Then I made two such sections, one vertical and one horizontal, overlapping on one of my eyes and forming a cross. This divided up the screen, and I filled the sections the cross created with different patterns.

sketch

var myImage;
var topOfEyes = 100;//general height of where the top of my eyes are
var bottomOfEyes = 160;//general height of where the bottom of my eyes are
var leftOfEye = 260;//general left of where the left of my right eye is
var rightOfEye = 320;//general right of where the right of my right eye is

function preload() {
    var imgURL = "https://i.imgur.com/P9ng7Hd.jpg";
    myImage = loadImage(imgURL);//loads image
}

function setup() {
    createCanvas(480, 480);
    background(0);
    myImage.resize(620, 480);//resizes image once so my face fits onscreen
    myImage.loadPixels();//loads pixels from image
    frameRate(1000);//requests a high frame rate (browsers cap around 60 fps)
}

function draw() {
    var pixlX = random(width);//random pixel from width values
    var pixlY = random(height);//random pixel from height values
    var constrX = constrain(floor(pixlX), 0, width - 1);//constrains x value 
    //and keeps it at whole number
    var constrY = constrain(floor(pixlY), 0, height - 1);//constrains y value 
    //and keeps it at whole number
    var colorFromXY = myImage.get(constrX, constrY);//samples the image color at that point
    noStroke();

    push();
    colorMode(HSB,100);//changes color value to HSB
    fill(colorFromXY);//takes color from photo "below" it
    //ribbons of dots, drawn in HSB color, across the eye regions
    ellipse(pixlX, random(topOfEyes,bottomOfEyes), 3, 3);//horizontal band across the eyes
    ellipse(random(leftOfEye,rightOfEye),pixlY, 3, 3);//vertical band over the right eye
    pop();

    fill(colorFromXY);//takes color from photo "below" it
    ellipse(pixlX,pixlY, 5, 5);//puts circles of portrait across screen
    //gradation of rectangles that slowly increase in size from x = 320 to the right edge
    rect(random(rightOfEye,rightOfEye + 32),pixlY, .5,.5);
    rect(random(rightOfEye + 32,rightOfEye + (32 * 2)),pixlY, 1,1);
    rect(random(rightOfEye + (32 * 2),rightOfEye + (32 * 3)),pixlY, 1.5,1.5);
    rect(random(rightOfEye + (32 * 3),rightOfEye + (32 * 4)),pixlY, 2,2);
    rect(random(rightOfEye + (32 * 4),rightOfEye + (32 * 5)),pixlY, 2.5,2.5);
}

Portrait In Process
Final Product

Julia Nishizaki – Looking Outwards 09

For this week, I chose to look at Margot Gersing's post on Zeitguised Studios, and specifically their 2014 project titled "Birds." Using 3D computer graphics, the now Berlin-based studio is known for creating compelling narratives, quirky characters, and fun, playful projects.

“Birds” by Zeitguised Studios

On their website, Zeitguised describes their project, “Birds,” as a “lighthearted essay on contextualized characters.” Throughout the video, this work portrays representations of birds made only out of objects we associate with birds, like eggs, worms, or bird houses.

An image from "Birds," a project that represents birds without actually showing a bird

I decided to write about this project, as I’m not very familiar with 3D animation or how it can be utilized, and I was drawn to the very creative nature of the work’s concept, the beautiful graphics, the bright colors, and the fun animations.

In her post, Margot reflects on the playful choice to represent a bird out of everything except the bird itself. I thought this was an interesting point, and I'm curious about the deeper meanings in this piece: what a bird actually is, and how our relationship to birds shifts that definition.