Looking Outwards – 04

“5 Horizons” Installation by Ryoichi Kurokawa

The artwork “5 Horizons” is an audiovisual installation created by Japanese artist Ryoichi Kurokawa. Created in 2010, it is made up of five screens and speakers, and the experience is an eight-minute loop. Kurokawa builds his pieces from recordings and digitally made structures, synthesizing the image to the sound and creating a very cool, immersive spatial experience for viewers. I admire the layering of this installation, not only between the sound and imagery but also in how the imagery morphs (from dynamic waves to nature scenes and so on)! It is very well executed for all its moving parts. With further research, I found that Kurokawa is interested in synesthesia, which you can see come through in his work as he blends together the visual and auditory senses. He is also inspired by nature, and it is very interesting to see how he blurs the boundary between the natural and digital worlds in this installation.

Title : 5 Horizons
Artist : Ryoichi Kurokawa

Links : http://www.ryoichikurokawa.com/project/r5h.html
https://mutek.org/en/artists/ryoichi-kurokawa

Looking Outward-04 Sound Art

Link: https://www.youtube.com/watch?v=_gk9n-2lBb8&ab_channel=MediaArtTube

I looked at the project The Product – Soundmachines, which consists of three rotating disks mimicking the form of a vinyl record player. Unlike a traditional record player, each disk carries visual information displayed through color and black-and-white patterns, and a receiver reads these patterns and returns a sound for each color. The project includes three of these disks, each holding different visual information.

What I really admire about the project is its abstraction of sound. In much music software, sound is displayed as a sonogram, which is often hard for people to comprehend. Here, abstracting the audio qualities into easy-to-understand visual patterns makes it very simple to grasp qualities that would otherwise be illegible.

The algorithm, as the description says, “translates concentric visual patterns into control signals.” I assume the receiver reads the color values on the disk as it rotates; in the algorithm, each color would represent a different signal input into music software, which then outputs a sound.
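To make my guess concrete, below is a minimal p5.js sketch of how such a reading could work. The segmented ring, the color-to-value table, and the pulsing circle standing in for a sound output are all my own assumptions, not the actual Soundmachines system.

// Hypothetical sketch: one "disk" is divided into colored segments. As it
// spins, a fixed read head samples whichever segment is underneath it and
// converts the color into a control value (here just a number driving the
// size of a circle, standing in for a sound trigger). The segment colors
// and the color-to-value mapping are assumptions.
var segments = ["red", "black", "white", "black", "red", "white"];
var controlValues = { red: 100, black: 40, white: 70 }; // assumed mapping
var angle = 0;

function setup() {
  createCanvas(400, 200);
}

function draw() {
  background(230);

  // draw the spinning segmented disk on the left
  push();
  translate(100, 100);
  rotate(angle);
  var step = TWO_PI / segments.length;
  for (var i = 0; i < segments.length; i++) {
    fill(segments[i]);
    arc(0, 0, 120, 120, i * step, (i + 1) * step, PIE);
  }
  pop();

  // the read head sits at a fixed world angle of 0 (pointing right);
  // work out which segment is currently underneath it
  var headAngle = (TWO_PI - (angle % TWO_PI)) % TWO_PI;
  var index = floor(headAngle / step) % segments.length;
  var value = controlValues[segments[index]];

  // use the value as a stand-in control signal: the size of a circle
  fill(segments[index]);
  circle(300, 100, value);

  angle += 0.02;
}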

The artist’s sensibility for both visual and audible qualities is clearly expressed in the project, where simple geometric shapes are used to create a symphony of sounds. The artist is also most likely a follower of digital music, since the output comes from a music sequencer.

lab 212 – passifolia

video by Lab 212 demonstrating the installation in action

Passifolia is an audiovisual installation by Lab 212, exhibited in Paris in 2020. Visitors enter a dark and hazy room shaped by 16 vertical light beams. When one steps into a light beam, a camera sensor detects the change in pixels, the beam widens, and directional speakers play a pre-composed melody of bird songs and ambient nature sounds before the beam narrows again. I appreciate how this invites the visitor to explore the space and relish the sounds of nature, literally spotlighted. While the sound itself is not computationally generated, the layers of signal processing are numerous (e.g. sending a MIDI note to Ableton Live, to the light projector, and to the directional speakers).

visitors exploring the installation’s light beams
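As a rough sketch of the sensing step only, the p5.js code below compares successive webcam frames and fires a trigger when enough pixels change, roughly how a visitor stepping into a beam might be detected. The webcam, the thresholds, and the triggerBeam() stub are my assumptions; the real installation presumably sends a MIDI note to Ableton Live instead of drawing a yellow overlay.

// Rough sketch: frame-differencing on a webcam feed as a stand-in for the
// installation's camera sensor. Threshold values are guesses.
var video;
var prevFrame = null;
var threshold = 40;        // per-channel difference that counts as "changed"
var triggerRatio = 0.05;   // fraction of changed pixels needed to trigger

function setup() {
  createCanvas(320, 240);
  video = createCapture(VIDEO);
  video.size(320, 240);
  video.hide();
}

function draw() {
  image(video, 0, 0);
  video.loadPixels();

  if (prevFrame) {
    var changed = 0;
    for (var i = 0; i < video.pixels.length; i += 4) {
      // compare the red channel of this frame against the previous one
      if (abs(video.pixels[i] - prevFrame[i]) > threshold) {
        changed++;
      }
    }
    if (changed > triggerRatio * (video.pixels.length / 4)) {
      triggerBeam();
    }
  }
  prevFrame = video.pixels.slice(); // remember this frame for next time
}

function triggerBeam() {
  // stand-in for widening the beam and playing the nature-sound melody
  noStroke();
  fill(255, 255, 0, 80);
  rect(0, 0, width, height);
}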

getting stringy

sketch
// Jaden Luscher
// jluscher
// section A
// project 4: string art

// increments along lines
var dx1;
var dy1;
var dx2;
var dy2;
var dx3;
var dy3;
var dx4;
var dy4;

// line density
var numLines = 50;

// determines location / quadrant of strings1 function
// when mycount = 0, the top left quadrant is drawn
var mycount = 0;

function setup() {
    createCanvas(400, 600);
    background(200,200,0);
}

function draw() {
  dx1 = (50-100)/numLines;
  dy1 = (150-300)/numLines; // split up L1

  dx2 = (200-200)/numLines;
  dy2 = (250-50)/numLines; // split up M1

  dx3 = (350-300)/numLines;
  dy3 = (150-300)/numLines; // split up R1

  dx4 = (350-50)/numLines; // split up horizontal line
  dy4 = (600-0)/numLines/2;  // split up long vertical (height)

  // brown lines
  push();
  strokeWeight(0.2);
  stroke("brown");
  bgStrings(350, 300, 200, 0);
  bgStrings(350, 300, 400, 0);
  bgStrings(50, 300, 200, 0);
  bgStrings(50, 300, 0, 0);
  pop();

  stroke(255);
  strokeWeight(0.7);
  outline(); // draw base lines (L1, M1, R1, L2, M2, R2)

  push();
  strings1(50, 150, 200, 250);  // call function to connect L1 to M1
  mycount = 1;
  strings1(350, 150, 200, 250);  // connect R1 to M1

  translate(width, height); // flip canvas to mirror strings
  rotate(PI);
  mycount = 0;  // reset to 0
  strings1(50, 150, 200, 250);  // L1 to M1
  mycount = 1;  // strings1 uses the "else" branch to draw the top right quadrant
  strings1(350, 150, 200, 250);  // R1 to M1
  pop();

  // lines at center of canvas (resemble parallelogram)
  strings2(200, 250, 50, 300);
  strings2(200, 350, 50, 300);

  noLoop();
}

function outline() {
  // top lines
  line(50, 150, 100, 300);  // L1
  line(200, 250, 200, 50);   // M1
  line(350, 150, 300, 300);   // R1

  // bottom lines
  line(50, 450, 100, 300);  // L2
  line(200, 550, 200, 350);  // M2
  line(350, 450, 300, 300);   // R2

  push();
  strokeWeight(2);
  line(width/2, 0, width/2, 250);
  line(width/2, height, width/2, 350);
  pop();

  // other lines
  line(0, height/2, width, height/2);   // horizontal line
}

function bgStrings(a, b, x, y) {
  for (var i = 0; i <= numLines*2; i += 1) {
    line(a, b, x, y);
    y += dy4;
  }
}

function strings1(x1, y1, x2, y2) {
  for (var i = 0; i <= numLines; i += 1) {
    line(x1, y1, x2, y2);
    if(mycount == 0) {  // top left quadrant
      x1 -= dx1;
      y1 -= dy1;
      x2 -= dx2;
      y2 -= dy2;
    } else {          // top right quadrant
      x1 -= dx3;
      y1 -= dy3;
      x2 -= dx2;
      y2 -= dy2;
    }
  }
}

function strings2(a, b, x, y) {
  for (var i = 0; i <= numLines; i += 1) {
    line(a, b, x, y);
    x += dx4;
  }
}

Looking Outwards 04: Sound Art

SuperSynthesis: Amay Kataria (March 11, 2022 to April 1, 2022)

The light and sound work SuperSynthesis by Amay Kataria was extremely inspirational and intriguing. As a designer I know the importance of light and how it is interpreted by our target audience; however, I had never thought about exploring its synergy with sound. I really admire how individual users can influence the light and sound generated through the parametric structure, and how it creates an environment of meditation and immersiveness. I also like this particular work because I grew up in a traditional Indian household, where sounds in a spiritual context are meditative experiences, just as they were for the artist. I also admire how the parametric structure was derived from waves into a ‘non-archetypical form’, which in my opinion further boosted the immersiveness of the whole project. Since light and sound are treated as the most sensory phenomena, combining the two really made sense. I have a basic understanding of how the lights were created and connected to the architecture of the structure, but I am not sure how the algorithms work with each individual user controlling one aspect of it. I feel that the artist’s vision manifested really well and turned out just as he had envisioned: a ‘supersynthesis’ between light and sound that creates an immersive experience.

Link
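I can only guess at the mechanics, but a toy version of the “one user controls one parameter of a shared wave” idea might look like the p5.js sketch below. The number of lights, the wave formula, and the slider standing in for a remote user are all assumptions on my part, not Kataria’s actual system.

// Toy sketch: a row of "lights" whose brightness follows a travelling sine
// wave. A single slider stands in for one remote user controlling one
// parameter of the shared structure (the wave frequency).
var freqSlider;
var numLights = 24;   // assumed number of lights

function setup() {
  createCanvas(480, 120);
  freqSlider = createSlider(1, 8, 3, 0.1); // "user-controlled" frequency
  noStroke();
}

function draw() {
  background(0);
  var freq = freqSlider.value();
  for (var i = 0; i < numLights; i++) {
    // brightness of light i comes from a sine wave moving over time
    var phase = (i / numLights) * TWO_PI * freq + millis() / 500;
    var brightness = map(sin(phase), -1, 1, 20, 255);
    fill(brightness);
    circle(20 + i * (width - 40) / (numLights - 1), height / 2, 14);
  }
}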

lookingoutwards-04-sectionA

Ralf Baecker’s “Floating Codes” is an immersive and abstract piece of sound and light installation art. It is made up of many hexagonal shapes constructed from flat metal sticks laid on the ground, with sticks of different heights jutting upward at the vertices of the hexagons. The jutting sticks have lights and speakers on them that flash and make noise through a process modeled on a neural network, using fundamentals of machine learning and artificial intelligence. “Depending on the topology of the neurons in the grid different networks constitute interlocking circles, feedback loops, memory-like elements, random pattern generators, and other significant behavioral elements.” Thus, completely different manifestations of noises and lights flash in response to each other’s behavior, creating a chaotic environment in which each neuron/perceptron makes the others react. “The open and unsupervised system has no objective, its only goal is to maintain and conserve the propagating information in the network.” I love this installation because it utilizes machine learning to create this emulation of propagation and quasi-symbiosis that almost makes it feel like each neuron is part of a greater whole, like a community or colony of organisms. It’s robotic (both the noises and lights) yet lively (in a cricket-like way), riding the fine line/liminal space between artificial and natural.
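As a toy illustration of that “grid of neurons reacting to each other” idea, the p5.js sketch below runs a small grid where each cell’s activation is fed by its neighbors plus a little randomness, and cells above a threshold “flash.” The grid size, weights, and threshold are invented for illustration; this is not Baecker’s actual network.

// Toy sketch: a grid of "neurons" whose activations propagate to neighbors.
// Cells above a threshold flash brightly, standing in for lights/speakers.
var cols = 12;
var rows = 8;
var activation = [];

function setup() {
  createCanvas(480, 320);
  for (var x = 0; x < cols; x++) {
    activation[x] = [];
    for (var y = 0; y < rows; y++) {
      activation[x][y] = random(1); // random starting state
    }
  }
  noStroke();
}

function draw() {
  background(20);
  var next = [];
  for (var x = 0; x < cols; x++) {
    next[x] = [];
    for (var y = 0; y < rows; y++) {
      // average the four neighbors (wrapping at the edges)...
      var sum =
        activation[(x + 1) % cols][y] +
        activation[(x - 1 + cols) % cols][y] +
        activation[x][(y + 1) % rows] +
        activation[x][(y - 1 + rows) % rows];
      // ...and add a small random kick so activity keeps propagating
      next[x][y] = constrain(0.25 * sum + random(-0.05, 0.05), 0, 1);

      // cells above the threshold "flash" as larger, brighter dots
      var flashing = next[x][y] > 0.7;
      fill(flashing ? 255 : next[x][y] * 120);
      circle(x * 40 + 20, y * 40 + 20, flashing ? 18 : 10);
    }
  }
  activation = next;
}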

String Art?

sketch
//Emily Brunner, ebrunner
//Section C

function setup() {
    createCanvas(400, 300);
    background(200);

}

function draw() {
    background(200);

    // DARK RED LINES: fans of lines radiating from eight points spread along
    // the horizontal center line out to evenly spaced points on the top and
    // bottom edges of the canvas
    var offsets = [-width, width, width / 2, -width / 2,
                   width / 3, -width / 3, width / 4, -width / 4];
    stroke("darkred");
    for (var j = 0; j < offsets.length; j++) {
        var cx = width / 2 + offsets[j];
        for (var i = 0; i <= width; i += 10) {
            line(cx, height / 2, i, 0);
            line(cx, height / 2, i, height);
        }
    }

    // YELLOW LINES: two more fans, centered at width/5 and width/3 + width/2
    stroke(246, 190, 0);
    for (var i = 0; i <= width; i += 10) {
        line(width / 5, height / 2, i, 0);
        line(width / 5, height / 2, i, height);
        line(width / 3 + width / 2, height / 2, i, 0);
        line(width / 3 + width / 2, height / 2, i, height);
    }

    // ORANGE LINES: fans from the same two centers out to the left and right edges
    stroke("orange");
    for (var i = 0; i <= height; i += 10) {
        line(width / 5, height / 2, 0, i);
        line(width / 5, height / 2, width, i);
    }
    stroke("darkorange");
    for (var i = 0; i <= width; i += 10) {
        line(width / 3 + width / 2, height / 2, 0, i);
        line(width / 3 + width / 2, height / 2, width, i);
    }
}

I had trouble with the math of this project and with understanding how to get the lines to do what I wanted. I didn’t manage to draw a picture similar to what I had imagined or found on the internet, but I made something that looks visually appealing, so I call that a win.

LO4: FORMS!

FORMS – String Quartet, 2021 https://vimeo.com/553653358?embedded=true&source=vimeo_logo&owner=7721912

FORMS is a live event where musicians interpret and play scores that are randomly generated by a bot in real time. Each instrument corresponds to a certain color, displayed on screen, and the result is a comprehensive piece combining both randomized audio and visual elements. I think this piece piqued my interest because recently I’ve been toying with the idea that certain movements “lock” and “unlock” events that occur. There’s specifically a video piece that I’m working on at the moment in which a specific twist of the wrist prompts a four-minute-long rendition of “One Day More” from Les Mis performed by hand puppets. While this is certainly not random, I think the ideas are pretty analogous, especially in terms of the audio/visual components, and especially in the way that a random notation is able to elicit a programmed response. In terms of how this (FORMS, not Les Mis puppetry) functions algorithmically, it seems pretty clear that a series of shapes are randomly notated, each with a corresponding color, and likely shifted in position to notate rests, note changes, etc. Because there are so many (infinite) possible shapes, and notating a specific number would perhaps be unwise for practicality reasons, it might be possible that for loops were utilized? Don’t quote me on that, but I think otherwise the work would be quite tedious.

Shot of the quartet along with the randomized score notation.
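To make that guess concrete, here is a quick p5.js sketch of the kind of generation I am imagining, where for loops scatter random colored shapes across a scrolling score. The instrument colors, shape counts, and scroll speed are all guesses on my part, not how FORMS actually works.

// Guess at the score-generation idea: each instrument gets a color, and
// for loops scatter random shapes across a scrolling "score." Horizontal
// position stands in for time, vertical position for pitch.
var instrumentColors;
var score = []; // each entry: {x, y, size, col}

function setup() {
  createCanvas(480, 240);
  instrumentColors = [
    color(230, 80, 80),   // "violin 1"
    color(80, 160, 230),  // "violin 2"
    color(240, 200, 80),  // "viola"
    color(120, 220, 140)  // "cello"
  ];
  // generate the random notation, one batch of shapes per instrument
  for (var inst = 0; inst < instrumentColors.length; inst++) {
    for (var n = 0; n < 20; n++) {
      score.push({
        x: random(width * 2),        // when the note happens
        y: random(30, height - 30),  // how high or low it sits
        size: random(6, 24),         // how long or loud it reads
        col: instrumentColors[inst]
      });
    }
  }
  noStroke();
}

function draw() {
  background(15);
  // scroll the notation past the left edge, which acts as the "playhead"
  var scroll = (millis() / 20) % (width * 2);
  for (var i = 0; i < score.length; i++) {
    var s = score[i];
    fill(s.col);
    circle((s.x - scroll + width * 2) % (width * 2), s.y, s.size);
  }
}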

Project 04 String Art

For this project, I used the curves generated by the strings to make a sort of landscape. Because the shapes generated were pretty organic, I thought it would be fitting to make the subject matter organic as well.

sketch

// x and y increments for the four string curves (two endpoints each)
var dx1;
var dy1;
var dx2;
var dy2;
var dx3;
var dy3;
var dx4;
var dy4;
var dx5;
var dy5;
var dx6;
var dy6;
var dx7;
var dy7;
var dx8;
var dy8;
var numLines = 50;

function setup() {
    createCanvas(400, 300);
    background(237, 233, 223);
    // increments for each endpoint: (end - start) / numLines
    dx1 = (150-50)/numLines;
    dy1 = (300-300)/numLines;
    dx2 = (400-400)/numLines;
    dy2 = (100-400)/numLines;
    dx3 = (400-400)/numLines;
    dy3 = (100-400)/numLines;
    dx4 = (500-600)/numLines;
    dy4 = (200-500)/numLines;
    dx5 = (200-400)/numLines;
    dy5 = (0-500)/numLines;
    dx6 = (200-500)/numLines;
    dy6 = (200-500)/numLines;
    dx7 = (200-400)/numLines;
    dy7 = (20-500)/numLines;
    dx8 = (400-500)/numLines;
    dy8 = (200-300)/numLines;
}

function draw() {
    // starting endpoints of the four string curves
    var x1 = 400;
    var y1 = 0;
    var x2 = 0;
    var y2 = 300;
    var x3 = 0;
    var y3 = 300;
    var x4 = 500;
    var y4 = 300;
    var x5 = -175;
    var y5 = 150;
    var x6 = 500;
    var y6 = 700;
    var x7 = 3000;
    var y7 = 300;
    var x8 = 0;
    var y8 = 400;
    for (var i = 0; i <= numLines; i += 1) {
        strokeWeight(1);
        stroke(229,193,208);
        line(x1, y1, x2, y2);
        x1 += dx1;
        y1 += dy1;
        x2 += dx2;
        y2 += dy2;
        stroke(249,199,169);
        line(x3, y3, x4, y4);
        x3 += dx3;
        y3 += dy3;
        x4 += dx4;
        y4 += dy4;
        stroke(241,144,167);
        line(x5, y5, x6, y6);
        x5 += dx5;
        y5 += dy5;
        x6 += dx6;
        y6 += dy6;
        stroke(221,115,81);
        line(x7, y7, x8, y8);
        x7 += dx7;
        y7 += dy7;
        x8 += dx8;
        y8 += dy8;
    }
    noLoop();
    fill(221,81,108);
    strokeWeight(0);
    triangle(300, 150, 325, 220, 275, 220); //tree to the right
    rect(290, 220, 20, 30);
    triangle(0, 100, -50, 290, 50, 290); //big tree in foreground
    rect(0, 290, 20, 300);
    triangle(170, 150, 165, 165, 175, 165); //tiny tree far away
    rect(168, 165, 4, 8);
    fill(255,192,0);
    ellipse(100, 40, 50, 50);
}

Looking Outwards – 04

The art I’ll be reviewing today is Ryoji Ikeda’s “Datamatics”. This piece explores how we might perceive the many kinds of data that, though present in our world, humans otherwise cannot comprehend. The piece uses pure data as a source for generating both sounds and visuals that help present time and space. The graphics are intentionally minimal and give the natural feeling that you are looking at, or hearing, data. I’m not entirely sure how any of the data is technically generated, though I know the technologies used to collect it. For starters, audio data can be collected through a mic and registered as pitch, volume, harmonics, and so on; these systems can also use machine learning to detect reverberance, decay, echoes, space, timbre, and more. This data can then be used to create visuals or separate audio that is a pure translation of it, whether as a few clicks of data points, signals, analog waves, harmonies, or pitches. I really admire this art process as it links how we perceive the world around us to the actual data behind it, comparing the two perceptions and letting them exist together for a moment.

https://www.ryojiikeda.com/project/datamatics/
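Since I do not know how Ikeda actually processes his data, the p5.js sketch below is only a toy illustration of the general idea of rendering raw numbers directly, with no styling beyond the values themselves. The random data source and the bar mapping are made up and are not his actual process.

// Toy illustration of "pure data as source": a stream of raw numbers is
// drawn directly as thin bars. The data here is just random values.
var data = [];

function setup() {
  createCanvas(480, 160);
  // stand-in data stream: 480 raw values between 0 and 1
  for (var i = 0; i < 480; i++) {
    data.push(random(1));
  }
  stroke(255);
  strokeWeight(1);
}

function draw() {
  background(0);
  for (var x = 0; x < data.length; x++) {
    // each value becomes the height of a one-pixel-wide bar
    var barHeight = data[x] * height;
    line(x, height, x, height - barHeight);
  }
  noLoop(); // the data is static, so draw it once
}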