04-Blog Post

Sonumbra was created as an interactive space in 2006 by Loop.pH and responds to the movements of its audience. The project is not only beautiful to look at but also admirable in how carefully the whole aesthetic of the space is handled, taking into account the many variables of a setting, including sound, placement, scale, and lighting. I suppose the algorithms take in the movements of the visitors and pair them with the fiber lights that make up the structure, as well as with computer-generated sound. The creators did an excellent job finding beauty in a structure that could otherwise have looked rather chaotic or ugly.

Sonumbra de Vincy, Responsive Light Emitting Environment 2008

Link here
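The write-up above only guesses at how the movement-to-light mapping might work, so here is a minimal p5.js sketch of that idea, using the mouse as a stand-in for a tracked visitor. The fiber layout, distances, and brightness mapping are my own assumptions, not Loop.pH's actual system.

// Hypothetical sketch: fibers brighten near the "visitor" (the mouse).
// This only illustrates the idea of pairing movement with light,
// not Loop.pH's algorithm.
var numFibers = 72;

function setup() {
    createCanvas(400, 300);
}

function draw() {
    background(0);
    for (var i = 0; i < numFibers; i++) {
        var angle = radians(i * 360 / numFibers);
        var tipX = width / 2 + 140 * cos(angle);
        var tipY = height / 2 + 140 * sin(angle);
        // fibers closer to the visitor glow more brightly
        var d = dist(mouseX, mouseY, tipX, tipY);
        var brightness = map(d, 0, 200, 255, 30, true);
        stroke(brightness);
        line(width / 2, height / 2, tipX, tipY);
    }
}

Moving the mouse around the canvas makes the nearby "fibers" glow, which is roughly the visitor-pairing behavior described above.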

Project-04: String Art

sketch
var idx1;           // x increment for the first endpoint of each pink line
var idx2;           // x increment for the second endpoint
var idy1;           // y increment for the first endpoint
var idy2;           // y increment for the second endpoint
var iNumLines = 60; // number of lines in the pink fan
var r;              // radius of the middle circle
var r2;             // scale factor for the two small "holes"

var ox;             // centers used to place the circular shapes
var oy;
var ox2;
var oy2;



function setup() {
    createCanvas(400, 300);
    background(0);
    idx1=100/iNumLines;
    idx2=200/iNumLines;
    idy1=100/iNumLines;
    idy2=200/iNumLines;


    r=70;
    r2=0.3;

    ox=width/2;
    oy=height/2;
    ox2=width;
    oy2=height;

}

function draw() {
    //lightning: fan of deep pink lines
    var ix1=150;
    var ix2=300;
    var iy1=130;
    var iy2=80;
    strokeWeight(1);
    for (var i = 0; i <= iNumLines; i += 1){
        stroke(255, 20, 147); //deep pink
        // each endpoint steps by its increment every iteration
        line(ix1 += idx1, iy1 -= idy1, ix2 -= idx2, iy2 += idy2);
    }

    //vertical hole
    //loop over 2pi and the origin circulates.
    for (var t=0;t<=360;t+=4){
        strokeWeight(2.5);
        stroke(255,250,205); //lemon
        line(-30+ox+ox2*r2*Math.cos(radians(t)), 170+oy2*r2*Math.sin(radians(t))-5,
            -30+ox+ox2*r2*Math.cos(radians(t)),150+oy2*r2*Math.sin(radians(t))+5);
    }

    //horizontal hole
    //loop over 2pi and the origin circulates.
    for (var t=0;t<=360;t+=4){
        stroke(0,206,209); // turquoise
        line(30+ox+5+ox2*r2*Math.cos(radians(t))-5, 170+oy2*r2*Math.sin(radians(t)),
            30+ox+5+ox2*r2*Math.cos(radians(t))+5,150+oy2*r2*Math.sin(radians(t)));
    }


   
   //middle circle
    strokeWeight(1);
    //loop over 2pi
    for (var theta=0;theta<=360;theta+=10){
        //draw each line twice, shifting the center 10 pixels to the right
        for (var change = 0; change <= 10; change += 10){
            stroke(255);
            line(ox+change+r*Math.cos(radians(theta)),oy+r*Math.sin(radians(theta)),
                ox+change-r*Math.cos(radians(theta)),oy-r*Math.sin(radians(theta)));
        }
    }
    noLoop(); // the drawing is static, so render it only once
    
}

Looking Outwards 04: Sound Art

Sugarcube is a generative art project by Amanda Ghassaei that uses MIDI and MaxMSP. The device takes real-life movement and generates sound from it. I suppose the project uses onboard hardware to track the movement of the MIDI controller and then generates a pattern of sound algorithmically. The creator combines her artistic sensibility with the generative nature of an existing MIDI pad to make a compelling generative piece. The project also uses the same hardware as a normal MIDI pad, which makes it very practical. It is also impressive that the pad connects to MaxMSP, letting you control sound on your computer.
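The post only speculates about how movement becomes sound, so here is a minimal sketch of that kind of mapping in p5.js, using mouse position as a stand-in for the cube's motion data and the p5.sound library for output. The pentatonic scale and the mapping are my own assumptions, not how Sugarcube actually works.

// Hypothetical movement-to-note mapping (requires the p5.sound library).
// Mouse position stands in for the tilt data a real MIDI cube would report.
// Note: most browsers require a click on the page before audio will play.
var osc;
var scaleNotes = [60, 62, 64, 67, 69, 72]; // an assumed C pentatonic set of MIDI notes

function setup() {
    createCanvas(400, 100);
    osc = new p5.Oscillator('sine');
    osc.start();
    osc.amp(0.2);
}

function draw() {
    background(30);
    // map horizontal "tilt" to a note in the scale
    var idx = int(map(mouseX, 0, width, 0, scaleNotes.length - 1, true));
    osc.freq(midiToFreq(scaleNotes[idx]));
    fill(255);
    noStroke();
    text('MIDI note: ' + scaleNotes[idx], 10, 55);
}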

https://vimeo.com/91259876


Project 4: String Art

Good luck four-leaf clover with abstract land and sky

sketch
// Ana Furtado 
// Section E
// Project 4 -- String Art

//Q1
var dx1;
var dy1;
var dx2;
var dy2;
var numLines = 40;
//Q2
var bx1;
var by1;
var bx2;
var by2;
//Q3
var ax1;
var ay1;
var ax2;
var ay2;
//Q4
var cx1;
var cy1;
var cx2;
var cy2;
//Right
var ex1;
var ey1;
var ex2;
var ey2;
//Left
var fx1;
var fy1;
var fx2;
var fy2;

function setup() {
    createCanvas(400, 300);
    background(255); // white background with abstract land and sky
    strokeWeight(2);
    

    //Sky
    //Q1 lines
    stroke(179,243,255); //light blue
    line(0, 50, 350, 0);
    line(0, 250, 50, 0);
    dx1 = (350-0)/numLines;
    dy1 = (0-50)/numLines;
    dx2 = (50-0)/numLines;
    dy2 = (0-250)/numLines;

    //Q2 lines
    stroke(179,243,255); //light blue
    line(350, 0, 400, 250);
    line(50, 0, 400, 50);
    bx1 = (400-350)/numLines;
    by1 = (250-0)/numLines;
    bx2 = (400-50)/numLines;
    by2 = (50-0)/numLines;
    
    //Land
    //Q3 lines
    stroke(192,255,135); //light green
    line(350, 300, 400, 50);
    line(50, 300, 400, 250);
    ax1 = (400-350)/numLines;
    ay1 = (50-300)/numLines;
    ax2 = (400-50)/numLines;
    ay2 = (250-300)/numLines;

    //Q4 lines
    stroke(192,255,135); //light green
    line(350, 300, 0, 250); 
    line(50, 300, 0, 50);
    cx1 = (0-350)/numLines;
    cy1 = (250-300)/numLines;
    cx2 = (0-50)/numLines;
    cy2 = (50-300)/numLines;


    //Center (4 leaf clover)
    //Right
    strokeWeight(1.25);
    stroke(92,255,92); //green
    line(210, 75, 190, 225); 
    line(275, 140, 125, 160); 
    ex1 = (190-210)/numLines;
    ey1 = (225-75)/numLines;
    ex2 = (125-275)/numLines;
    ey2 = (160-140)/numLines;

    //Left
    stroke(92,255,92); //green
    line(210, 225, 190, 75); 
    line(275, 160, 125, 140); 
    fx1 = (190-210)/numLines;
    fy1 = (75-225)/numLines;
    fx2 = (125-275)/numLines;
    fy2 = (140-160)/numLines;

}

function draw() {
    //Sky
    //Q1 -- upper left sky
    strokeWeight(1);
    var x1 = 0;
    var y1 = 50;
    var x2 = 0;
    var y2 = 250;
    for (var i = 0; i <= numLines; i += 1) {
        stroke(179,243,255); //light blue
        line(x1, y1, x2, y2);
        x1 += dx1;
        y1 += dy1;
        x2 += dx2;
        y2 += dy2;
    }

    //Q2 -- upper right sky
    var x1 = 350;
    var y1 = 0;
    var x2 = 50;
    var y2 = 0;
    for (var i = 0; i <= numLines; i += 1) {
        stroke(179,243,255); //light blue
        line(x1, y1, x2, y2);
        x1 += bx1;
        y1 += by1;
        x2 += bx2;
        y2 += by2;
    }

    //Land
    // Q3 -- bottom right land
    var x1 = 350;
    var y1 = 300;
    var x2 = 50;
    var y2 = 300;
    for (var i = 0; i <= numLines; i += 1) {
        stroke(192,255,135); //light green
        line(x1, y1, x2, y2);
        x1 += ax1;
        y1 += ay1;
        x2 += ax2;
        y2 += ay2;
    }

    //Q4 -- bottom left land
    var x1 = 350;
    var y1 = 300;
    var x2 = 50;
    var y2 = 300;
    for (var i = 0; i <= numLines; i += 1) {
        stroke(192,255,135); //light green
        line(x1, y1, x2, y2);
        x1 += cx1;
        y1 += cy1;
        x2 += cx2;
        y2 += cy2;
    }

    //Green Center (4 leaf clover)
    //Right
    strokeWeight(1.25);
    var x1 = 210;
    var y1 = 75;
    var x2 = 275;
    var y2 = 140;
    for (var i = 0; i <= numLines; i += 1) {
        stroke(92,255,92); //green
        line(x1, y1, x2, y2);
        x1 += ex1;
        y1 += ey1;
        x2 += ex2;
        y2 += ey2;
    }

    //Left
    var x1 = 210;
    var y1 = 225;
    var x2 = 275;
    var y2 = 160;
    for (var i = 0; i <= numLines; i += 1) {
        stroke(92,255,92); //green
        line(x1, y1, x2, y2);
        x1 += fx1;
        y1 += fy1;
        x2 += fx2;
        y2 += fy2;
    }

    //Stem
    strokeWeight(3);
    stroke(92,255,92); //green
    line(200, 150, 200, 270);

    noLoop();
    

}

The most challenging part of this project was keeping track of which variables were being used and whether the coordinates were right.
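One way to cut down on that bookkeeping is to store each fan of lines as a single object holding its start points and per-line increments, so one loop and one set of names handles every quadrant. This is just a sketch of an alternative structure, not part of the submitted project; only the Q1 sky fan is filled in here.

// Hypothetical refactor: each string-art "fan" keeps its own state,
// so there is only one set of variable names to keep track of.
var numLines = 40;
var fans = [];

function setup() {
    createCanvas(400, 300);
    background(255);
    // x1, y1, x2, y2 are the starting endpoints; the d* values are how far
    // each endpoint moves per line (end minus start, divided by numLines)
    fans.push({x1: 0, y1: 50, x2: 0, y2: 250,
               dx1: 350 / numLines, dy1: -50 / numLines,
               dx2: 50 / numLines, dy2: -250 / numLines,
               col: color(179, 243, 255)}); // Q1 sky
    // ...the other quadrants and the clover would be pushed the same way
}

function draw() {
    for (var f = 0; f < fans.length; f++) {
        var fan = fans[f];
        stroke(fan.col);
        for (var i = 0; i <= numLines; i++) {
            line(fan.x1 + i * fan.dx1, fan.y1 + i * fan.dy1,
                 fan.x2 + i * fan.dx2, fan.y2 + i * fan.dy2);
        }
    }
    noLoop();
}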

Looking Outwards – 04

I researched Supersynthesis, a 2022 project by Amay Kataria presented at Mu Gallery in Chicago. I like how the project emulates a wave of light and sound and how it lets viewers become users by digitally controlling the exhibit: the user can change how the exhibit shows light and plays sound in real time. This allows for community building and potentially for performances by artists or musicians controlling the exhibit. According to the artist, the project is reminiscent of Olafur Eliasson's The Weather Project (2003), which is also about interacting with the community. The shape of the structure and the interactivity of the project are manifestations of the artist's interests. However, I wish the project had physical changes that responded to the user beyond light movement and sound change; perhaps the bars could raise and lower as they light up and emit sound, or perhaps the piece could change colors.

Light & Sound Synthesis: In Conversation with Amay Kataria – CreativeApplications.Net

Blog 04: “Sound Machines”

By Ilia Urgen
Section B

“A visual instrument to compose and control electronic music in a comprehensive and responsible way.” – MediaArtTube, January 28, 2012. I love how this modern audiovisual concept is based on a timeless design used throughout the greater portion of the 20th Century – the record player.

I am truly fascinated and intrigued by this stunning piece of technology. According to the creator, Sound Machines consists of three units, each resembling a vinyl record player. Each unit can hold three tracks, just like a traditional record player.

MediaArtTube, however, gives this classical design a 21st-century makeover. In a Sound Machine there is no direct contact between the needle and the groove of the disc; instead, the signal picked up by the laser-light “needle” is synced to a sequencer, which produces the sound output.
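To get a feel for that needle-to-sequencer idea, here is a tiny p5.js sketch of an optical step sequencer: a rotating read head sweeps over a disc of on/off segments and triggers a tone whenever it lands on an active one. The segment pattern, the tempo, and the pitch mapping are purely my own assumptions and rely on the p5.sound library; this is not how Sound Machines is actually built.

// Hypothetical optical step sequencer (requires the p5.sound library).
// A rotating read head triggers a tone when it passes an "on" segment.
// Note: most browsers require a click on the page before audio will play.
var pattern = [1, 0, 1, 1, 0, 1, 0, 0]; // assumed on/off marks on the disc
var step = 0;
var osc, env;

function setup() {
    createCanvas(300, 300);
    osc = new p5.Oscillator('triangle');
    osc.start();
    osc.amp(0); // the envelope controls loudness
    env = new p5.Envelope();
    env.setADSR(0.01, 0.1, 0, 0.1);
    env.setRange(0.4, 0);
    frameRate(4); // advance one segment per frame, like a slow sequencer clock
}

function draw() {
    background(0);
    // draw the disc of segments
    for (var i = 0; i < pattern.length; i++) {
        var a = radians(i * 360 / pattern.length);
        noStroke();
        fill(pattern[i] === 1 ? color(255, 80, 80) : 80);
        circle(150 + 100 * cos(a), 150 + 100 * sin(a), 20);
    }
    // draw the "laser needle" read head at the current step
    var h = radians(step * 360 / pattern.length);
    stroke(255);
    line(150, 150, 150 + 100 * cos(h), 150 + 100 * sin(h));
    // trigger a tone when the head sits on an active segment
    if (pattern[step] === 1) {
        osc.freq(220 + 55 * step);
        env.play(osc);
    }
    step = (step + 1) % pattern.length;
}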

Sound Machines are definitely a cool way to mix various digitally-transmitted tracks together, and I hope that we continue to see a greater implementation of this technology in everyday life.

(YouTube link: https://www.youtube.com/watch?v=_gk9n-2lBb8)

One of the three disc units of a Sound Machine. Each color represents a different sound file electronically stored on the disc.

Project 04: Illuminati out of the Pyramid!

sketch
// Ilia Urgen
// Section B

var numLines = 60;

function setup() {
    createCanvas (400,300); 
    
    var color_1 = color (135,206,245); // sky blue
    var color_2 = color (253,217,181); // peach

    // ombre 
    for (var y = 0; y < height; y++ ) {
        var n = map (y, 0, height, 0, 1);
        var color_3 = lerpColor (color_1, color_2, n);
        stroke (color_3);
        line (0, y, width, y);
    }

    stroke (0);

    // canvas border lines
    line (1,1,1,299);
    line (1,1,399,1);
    line (1,299,399,299);
    line (399,1,399,299);
} 

function draw() {
    
    for (var i = 0; i <= numLines; i += 1) {

        var x1 = 0;
        var y1 = i * width / numLines;
        var x2 = i * height / numLines;
        var y2 = 300;

        // lower eye curve
        strokeWeight(0.5);
        line (x1, y1, x2, y2);
        
        // upper eye curve and pyramid side
        line (x2, 0, 400, y1);

        // lower eye lines
        line (y1, x2 * 1.05, x1, y1 - 20);

        // pyramid face
        line (y2, x2, x1 - 2.1, y2 + 200);

        // eye line
        strokeWeight(2);
        line (0,0,382,303);
        
    }

    //illuminati eye
    push();

    translate (185, 152);
    rotate(radians(38));
    fill (0);
    ellipse (0,0,180,80);
    fill (255);
    ellipse (0,0,80);
    fill (0);
    ellipse (0,0,20);

    pop();

    noLoop();
}

LO 4

By MycoLyco

The cellular activity of the fungi used in this artist's videos produces a bioelectrical current, which is translated into sound. The hardware used is the Eurorack module SCION from Instruō. The noises produced are interesting as far as generated music goes; they sound intentional, almost as though the artist had tried to create the most stereotypically alien noises by hand, to the point where I was suspicious of the claim that these sounds were really generated by the mushrooms. The presentation of the mushrooms in the videos, with their fluorescent lighting, makes me think that their being excessively alien is the point.
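As a rough illustration of what translating a bioelectrical current into sound could mean, here is a small p5.js sketch that treats a drifting Perlin-noise value as a stand-in for the fungal voltage and maps it to an oscillator's pitch. The signal, the frequency range, and the mapping are all my own assumptions, have nothing to do with how the SCION module actually works, and rely on the p5.sound library.

// Hypothetical voltage-to-pitch mapping (requires the p5.sound library).
// Perlin noise stands in for the slowly fluctuating bioelectrical signal.
// Note: most browsers require a click on the page before audio will play.
var osc;
var t = 0;

function setup() {
    createCanvas(400, 150);
    background(20);
    osc = new p5.Oscillator('sine');
    osc.start();
    osc.amp(0.2);
}

function draw() {
    // pretend this is the sampled "voltage" from the mushroom, in 0..1
    var voltage = noise(t);
    t += 0.01;
    // map the signal to an eerie pitch range
    var freq = map(voltage, 0, 1, 80, 800);
    osc.freq(freq, 0.1); // short ramp so the pitch glides rather than jumps
    // draw a simple scrolling trace of the signal
    var x = frameCount % width;
    stroke(20);
    line(x, 0, x, height); // erase the old column
    stroke(120, 255, 160);
    line(x, height, x, height - voltage * height);
}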

Blog 4

Don Ritter's piece Intersection creates an experience that truly tests and separates the sense of hearing from the other senses. The participant walks into a pitch-dark room that contains nothing but sound: the sound of cars passing by, as if standing in the middle of a busy intersection. Because the experience removes sight, and essentially the other senses as well, it really forces the participant to listen. I assume the purpose is to evoke anxiety and fear, the sense that a car is approaching without knowing from where. Like many of his other projects, such as These Essences, it aims to unsettle viewers with extremely vivid sounds, in some cases paired with very unusual, textural images. The combination of the two, or sometimes the sound alone, is enough to penetrate the visitor's mind very effectively. The sounds were likely engineered alongside the visuals that accompany them, so that the sounds alone can evoke the feeling of the visuals even more powerfully than the images can. I think it is a combination of AI that understands sound and code that amplifies certain patterns of sound we are sensitive to.

https://aesthetic-machinery.com/compilation.html