Erin Fuller Project-04-String-Art


//Erin Fuller
//Section A
//efuller@andrew.cmu.edu
//Project 04

// please view on safari, does not work on chrome

var x; //mouseX global
var y; //mouseY global

var control; // laser position based on for loop increments

function setup() {
    createCanvas(400, 300);
}

function draw() {
    background(0);

    var x = mouseX * (255 / width); // changes g value relative to width
    var y = mouseY * (255 / height); // changes b value relative to height

    stroke(255, x, y); //g and b values changed 
    strokeWeight(.005); // very thin lines to look like lasers

    for (var i = 0; i < 100; i++) {
        var control = (i * mouseX) / 75; //control increases based on mouseX position
        
        line(0, 0, control, height);//top left "laser" pointing down
        line(0, 0, width, control);//top left "laser" pointing right
   
        line(width, height, control, 0);//bottom right "laser" fanning left and up
        line(0, control, width, height);//bottom right "laser" fanning left and down
    }
}

I wanted my “String Art” to be reminiscent of a laser light show like those you see at concerts. As you move your mouse, you change how wide the “lasers” fan out and what color they are.
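The color shift comes from scaling the mouse position into the 0–255 channel range, as in the `mouseX * (255 / width)` lines above. A minimal standalone sketch of that scaling (the helper name is mine, not from the sketch):

```javascript
// Scale a coordinate in [0, max] to a color channel value in [0, 255],
// the same math the sketch applies to mouseX / width and mouseY / height.
function toChannel(coord, max) {
    return coord * (255 / max);
}

// A mouse halfway across a 400px canvas maps to roughly half of 255,
// and the far edge maps to the full 255.
var g = toChannel(200, 400);
var b = toChannel(300, 300);
```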

Judy Li-Project-04-String-Art

judyli: String Project 04

/*
Judy Li
Section A
judyli@andrew.cmu.edu
Project-04
*/

function setup() {
    createCanvas(400, 300);
    background(218,175,32);
}

function draw() {
	var x = 0;
	var x1 = 0;
	var x2 = 0;
	var x3 = 0;
	var x4 = 0;
	var x5 = 0;
	var y2 = 0;
	var y3 = 0;
	var y4 = 0;
	var y5 = 0;

	for (var i = 0; i < 50; i++) {
		x += 10;
		stroke(255, 0, 0);
		line(300, 300, x, 0);
	}
	for (var e = 0; e < 50; e++) {
		x1 += 10;
		stroke(0,255,255);
		line(300, 0, x1, 300);
	}
	for (var c = 0; c < 1; c += 0.1) {
		x2 = lerp(300, 400, c);
		y2 = lerp(150, 0, c);
		stroke(0, 0, 255);
		line(400, 150, x2, y2);
	}
	for (var d = 0; d < 1; d += 0.1) {
		x3 = lerp(300, 400, d);
		y3 = lerp(300, 300, d);
		stroke(255, 0, 255);
		line(400, 150, x3, y3);
	}
    for (var h = 0; h < 30; h++) {
        x4 += 10;
        y4 += 10;
        stroke(255, 255, 255);
        line(0, y4, x4, 300);
    }
    for (var s = 0; s < 30; s++) {
        x5 += 10;
        y5 += 10;
        stroke(100, 100, 100);
        line(300 - x5, 0, 0, y5);
    }
	stroke(0);
	noFill();
	for (var f = 0; f < 6; f++) {
		// ten concentric rings per target, diameters 20 down to 2, scaled by f
		for (var dia = 20; dia >= 2; dia -= 2) {
			ellipse(450 - (f * 75), 150, dia * (f / 1.5), dia * (f / 1.5));
		}
	}
	strokeWeight(0.5);
}

This project was a super fun one because it was easier for me to take control of the direction of my lines/curves. The only thing I had a little trouble with was the ‘lerp’ command: I had to experiment with the x and y values to get a sense of what I had to tweak to produce a specific string pattern.
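For reference, `lerp(a, b, t)` just blends linearly between two values: it returns `a` at `t = 0`, `b` at `t = 1`, and evenly spaced points in between. A plain-JavaScript sketch of the same math p5.js performs:

```javascript
// Linear interpolation between a and b, controlled by t in [0, 1].
function lerp(a, b, t) {
    return a + (b - a) * t;
}

// Stepping t by 0.1, as the sketch's loops do, walks a string endpoint
// from (300, 150) toward (400, 0) in ten even steps.
var x2 = lerp(300, 400, 0.5); // 350, halfway across
var y2 = lerp(150, 0, 0.5);   // 75, halfway up
```

Because each loop recomputes both coordinates from the same `t`, the endpoints stay on a straight path while the anchored end of every line is fixed, which is what produces the fanned "string" pattern.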

Looking Outwards – 04

Using textiles as electroacoustic transducers Author: Filip Visnjic

I thought this work was really interesting because I’ve seen clothing that incorporates lighting, but not sound. This is a project by Esteban and Judit of EJTECH. They wanted to create pieces like this to enhance and explore the possibilities of multi-sensory experiences through textiles. This soft sound acts as a provocative new instrument for human expression. The main piece is a metal surface, embedded in the fabric, that emits audio and sonic vibrations. The intent was for the project to serve as an innovation in materials, and because of this there are many different possibilities for applying it.

Soft Sounds – Registry Phase 1

Project Page – Prototypes and Different Iterations

Curran Zhang-LookingOutwards-4

This article talks about the merging of music and technology. With the idea that robots can do jobs that humans do, many people have begun to wonder whether robots and AI would have the ability to create music without the need for humans. Francois Pachet, head of SONY’s computer science lab in Paris, believed that they were very close to programming computers to create melodies by mashing up music produced by Legrand and McCartney. According to scientist, composer, and author David Cope, music contains instructions that can be synthesized into different yet similar outputs. Cope also designed EMMY, an emulator that creates music similar to a Bach chorale, a Mozart sonata, a Chopin mazurka, and a Joplin rag.

This ongoing process captured my attention because people within this field have a strong desire to link machines to music. In this era, multidisciplinary design and work are applied in every field. Yet art, especially music, and machines are one of the harder combinations because of their different workflows. By merging the two fields, both would gain layers of new discovery and understanding. Music that makes people cry, feel happy, and become emotional would no longer be achieved only by people, but also by machines.

 

Links

https://www.theatlantic.com/entertainment/archive/2014/08/computers-that-compose/374916/

Erin Fuller-LookingOutwards-04

The project “Green Music”, by John Lifton, a London-based artist, was part of the documentary “The Secret Life of Plants” (1979). Lifton produced music based on bio-electronic sensing of plants, recording the “stress” of their physical environment, such as light, temperature, the presence of guests, etc. In this project, the computers constantly receive information from the sensors attached to the plants and convert the data into music. This work makes both the plants and the humans creators, in that they act on each other to produce the sound; although there is no tangible interaction with the artwork, the guests’ presence alone can be enough for the plants to react and create different music. I think that’s beautiful.

    Documentary Clip of “Green Music”, 1979

I think this project is admirable simply because it was created so long ago. It is easy to think that computational design is something of this decade, or even just this millennium, but this project has shown me that people have been working on and progressing the field for much longer than I previously thought.

Yingyang Zhou-LookingOutwards-4

Ryoji Ikeda
born in 1966 in Gifu, Japan
lives and works in Paris, France and Kyoto, Japan

Japan’s leading electronic composer and visual artist Ryoji Ikeda focuses on the essential characteristics of sound itself and that of visuals as light by means of both mathematical precision and mathematical aesthetics. Ikeda has gained a reputation as one of the few international artists working convincingly across both visual and sonic media. He elaborately orchestrates sound, visuals, materials, physical phenomena and mathematical notions into immersive live performances and installations.
Alongside his purely musical activity, Ikeda has been working on long-term projects through live performances, installations, books and CDs such as ‘datamatics’ (2006-), ‘test pattern’ (2008-), ‘spectra’ (2001-), ‘cyclo.’ (a collaborative project with Carsten Nicolai), ‘superposition’ (2012-), ‘supersymmetry’ (2014-) and ‘micro | macro’ (2015-).
Ryoji Ikeda website

‘superposition’

superposition is a project about the way we understand the reality of nature on an atomic scale and is inspired by the mathematical notions of quantum mechanics. Performers will appear in Ikeda’s work for the first time, performing as operator/conductor/observer/examiners. All the components on stage will be in a state of superposition; sound, visuals, physical phenomena, mathematical concepts, human behaviour and randomness – these will be constantly orchestrated and de-orchestrated simultaneously in a single performance piece.

I like Ryoji Ikeda’s projects because they originate from mathematical elements and, by extending them into the realm of philosophy, show what audiovisual work can do to inspire people.

Other works of Ikeda’s, like supersymmetry, present an artistic vision of the reality of nature through an immersive and sensory experience.
That project is a series of works conceived as installation versions of the performance work “superposition” (2012-) and as a platform to update the process and outcome of a residency during 2014-15 at CERN in Geneva, the largest centre in the world for particle physics.

Jonathan Liang – Looking Outwards – 04

the sound of music

Meandering River is a collaboration between Funkhaus Berlin and onformative that sought to represent the fast-moving world through gradual, rhythmic movements rather than a snapshot. onformative used a custom-written algorithm that reinterprets fluctuating river patterns based on the sounds generated from the river. They take this data to generate colorful river landscapes (changing in real time) and project them onto screens. This type of data visualization is common among onformative’s works, which can be found here:

https://onformative.com/work

Another onformative project that best exemplifies sound art is Porsche Blackbox. In this project, onformative takes the sounds from a black box in a Porsche and uses that data to visualize what driving the vehicle was like at the time. Their work has really inspired me to explore what artists can do with data like sound, or even other senses that are not sight-related.

More on the Porsche Blackbox project is in this link below:

https://onformative.com/work/porsche-blackbox

Project 3 rrandell

sketch

/* Rani Randell
Section A
rrandell@andrew.cmu.edu
Project 03 */

var backcolor;
var x = 0;
var y = 0;

function setup() {
    createCanvas(400, 400);
    
}

function draw() {

	var R = mouseX;
    var G = mouseY;
    var B = mouseX;
    backcolor = color(R, G, B);
	background(backcolor); //make the background change color with the mouse movement
	
	noStroke();
	fill(161, 0, 0);
	rect(mouseX, 0, 10, 400);
	
	fill(161, 161, 0);
	rect(mouseX + 40, 0, 10, 400);
	
	fill(7, 136, 70);
	rect(mouseX + 80, 0, 10, 400);
	
	fill(162, 82, 3);
	rect(mouseX + 120, 0, 10, 400);
	
	fill(0, 33, 203);
	rect(mouseX - 40, 0, 10, 400);
	
	fill(153, 0, 77);
	rect(0, mouseY, 400, 10);
	
	fill(101, 130, 0);
	rect(0, mouseY + 40, 400, 10);
	
	fill(65, 102, 0);
	rect(0, mouseY + 80, 400, 10);
	
	fill(68, 10, 127);
	rect(0, mouseY + 160, 400, 10);
}


For this project I really wanted to experiment with color when the mouse is moved around. I made a small optical illusion with both line and color. I was really inspired by Mondrian’s clean lines and geometry for this project.

Jamie Dorst Looking Outward 04

For this week’s Looking Outward post, I chose to write about The Classyfier. The Classyfier is a table that recognizes sounds from beverages (the clink of a wine glass, a can opening, a spoon stirring tea, etc.) and automatically plays appropriate music.

The Classyfier – a sound-detecting music automator.

I admire this project because I think it’s a really creative use of AI. Even though I don’t think this is the most urgent or influential project, I think it’s cool that we can use this technology to create more projects in the future; a smart music-playing table might not change the world, but we can use this technology to make something that will. The article linked above says that they used Wekinator, Processing, and the OFX Collection. I also found it cool that they used Processing because it’s similar to p5.js. It’s inspiring that projects like this are being made in a language very similar to the one we’re learning, meaning that we are that much closer to creating this type of thing ourselves.

Video demonstrating The Classyfier in action

Ean Grady-Looking Outwards-03

http://www.michael-hansmeyer.com/digital-grotesque-II

Digital Grotesque II (2017) is a full-scale 3D-printed grotto designed by the well-known computational architect Michael Hansmeyer. It is interesting to consider the medium that computational fabrications such as this exist in, because they are intricate and natural-looking but also man-made. I admire the extreme amount of detail in the caves that the computer generated; it shows the full possibilities of computational architecture, and that we no longer have to look at a building and consider the manpower or human skill it takes to physically sculpt/design building art. The article states the idea that “while we can fabricate anything, design arguably appears confined to our instruments of design: we can only design what we can directly represent,” which is interesting because although fabrication offers a wealth of new possibilities, it is limited by its nature of control and execution; the article therefore poses the idea that we need new tools of design.

Video below shows the interior of the computationally fabricated grotto.