LO 4

By MycoLyco

The cellular activity of the fungi in MycoLyco's videos produces bioelectrical signals that are translated into sound. The hardware used is the SCION Eurorack module from Instruō. As generated music goes, the resulting sounds are surprisingly interesting: they sound intentional, almost as though the artist had set out to create the most stereotypically alien noises himself. At first this made me suspicious of the claim that the sounds were really generated by the mushrooms. But the presentation of the mushrooms in the videos, with their fluorescent lighting, makes me think that their being excessively alien is the point.
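Since the SCION module's actual firmware isn't documented in the videos, here is only a guess at the underlying idea: sample a fluctuating bioelectric voltage and quantize it onto a musical scale. The voltage range and the pentatonic mapping below are my own assumptions, not Instruō's.

```javascript
// Hypothetical sketch: map bioelectric voltage readings to musical pitches.
// The SCION's real algorithm is not public here; this simply scales a
// control voltage (assumed 0-5 V) onto two octaves of a pentatonic scale.
const PENTATONIC = [0, 2, 4, 7, 9]; // semitone offsets within an octave

function voltageToMidiNote(volts, minV = 0, maxV = 5) {
  // normalize the reading to 0..1, clamping out-of-range values
  const t = Math.min(1, Math.max(0, (volts - minV) / (maxV - minV)));
  const steps = PENTATONIC.length * 2;            // two octaves of choices
  const i = Math.min(steps - 1, Math.floor(t * steps));
  const octave = Math.floor(i / PENTATONIC.length);
  return 60 + octave * 12 + PENTATONIC[i % PENTATONIC.length]; // 60 = middle C
}

// a noisy "reading" sequence becomes a melody
const readings = [0.3, 1.1, 2.6, 4.9, 3.2];
console.log(readings.map(v => voltageToMidiNote(v)));
```

A mapping like this would also explain why the output sounds "intentional": the quantization forces even random fluctuations into consonant pitches.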

Blog 4

Don Ritter’s piece Intersection creates an experience that truly isolates the sense of hearing from the others. The participant walks into a pitch-dark room filled with nothing but sound: cars passing by, as if you were standing in the middle of a busy intersection. Deprived of sight, the participant has no choice but to really listen. I assume the purpose is to evoke anxiety and fear: you sense a car approaching but cannot tell from where. Like many of his other projects, such as These Essences, it aims to unsettle viewers with extremely vivid sounds, in some cases paired with very unusual, textural images. Whether combined with imagery or standing alone, the sounds become more than just sound, penetrating the visitor’s mind very effectively. They are likely engineered using the visuals that accompany them, so that the sounds alone can paint or evoke the feelings of the visuals even more powerfully than the images can. I think it is a combination of AI that understands sound and code that amplifies the sound patterns we are most sensitive to.

https://aesthetic-machinery.com/compilation.html

Looking Outwards 04: Sound Art – MilkDrop

One of the areas of sound and computation I find interesting is audio visualization. One of the plug-ins that makes audio visualization possible is a program called MilkDrop, created in 2001 by Ryan Geiss. It creates abstract art from songs through the use of 3D graphics hardware, image-rendering techniques, and multiple equations that create custom shapes and waves. The reason I find it interesting is that I have always been fascinated by synesthesia, where people can see sound. Although I can’t visualize sound, I associate certain songs and genres with certain color palettes. I am curious how this program works because how music makes a person feel is very subjective: there are genres I would associate with a certain color, while other people might disagree. Would the visualization be an interpretation of the creator’s thought process, or is it somehow customizable to each user? If it were customizable, how would the algorithm change? Would just the colors change, or the shapes as well? Since songs make everyone feel different emotions, interpretations change too.

http://www.hitsquad.com/smm/programs/MilkDrop/
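The general principle behind visualizers like MilkDrop can be sketched without knowing its actual preset equations: extract a feature from each frame of audio and feed it into drawing parameters. The RMS loudness measure and the hue/radius mapping below are my own simplification, not anything from Geiss's code.

```javascript
// Toy version of the idea behind audio visualizers: per-frame equations
// turn audio features into drawing parameters. Here, frame loudness
// (root-mean-square of the samples) drives a hue and a circle radius.
function frameParams(samples) {
  const rms = Math.sqrt(
    samples.reduce((sum, x) => sum + x * x, 0) / samples.length
  );
  return {
    hue: Math.round(360 * Math.min(1, rms)), // louder frames -> hue sweeps around the wheel
    radius: 50 + 200 * rms,                  // pulse a circle with loudness
  };
}

console.log(frameParams([0, 0.5, -0.5, 0.5]));
```

The "subjectivity" question then becomes concrete: personalizing the visuals could mean swapping only this mapping function while keeping the feature extraction the same.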

Looking Outwards 04: Sound Art

https://www.creativeapplications.net/maxmsp/forms-string-quartet/
Playmodes
FORMS – String Quartet
2021

The project “FORMS – String Quartet” uses an algorithm (a “realtime visual score generator” called “The Streaming Bot”) to generate visual representations of various forms, including lines and shapes, as graphic scores, which are then converted into actual sounds and music.

“FORMS – String Quartet”, which in its eventual form is a live performance, is derived from the earlier work “Screen Ensembles”. In “Screen Ensembles”, graphics that function as scores were created by the algorithm and later converted into different sounds, with each screen representing a different role such as “Rhythm”, “Harmony”, or “Texture”.

Screen Ensembles

One thing that I appreciate and admire a lot about this project is that it liberates music scores from the traditional, rigid, black-and-white style and transforms them into more flexible, diverse, and artistically attractive visual forms. They also try to generate music from random graphics and shapes, which I think is an innovative attempt.
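Playmodes hasn't published The Streaming Bot's exact mapping, but one plausible way to read a graphic score is to scan the marks left to right, turning vertical position into pitch and horizontal extent into timing. Everything below (the pixels-per-beat scale, the MIDI range) is illustrative, not their actual system.

```javascript
// Hypothetical graphic-score reader: each mark is a rectangle on the
// score image. Horizontal position/width become start time and duration;
// vertical position becomes pitch (higher on the page = higher note).
function scoreToNotes(marks, height = 100, lowMidi = 48, highMidi = 84) {
  return marks
    .slice()
    .sort((a, b) => a.x - b.x) // play in left-to-right reading order
    .map(m => ({
      startBeat: m.x / 10,     // assumed scale: 10 px per beat
      durBeats: m.w / 10,
      midi: Math.round(lowMidi + (1 - m.y / height) * (highMidi - lowMidi)),
    }));
}

// two marks given out of order; y = 0 is the top of the score
console.log(scoreToNotes([{ x: 20, y: 50, w: 10 }, { x: 0, y: 0, w: 20 }]));
```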

Looking Outwards-04

The project I chose is “Light & Sound Synthesis: In conversation with Amay Kataria”. This is an interactive art installation.
Through a custom program, the audience becomes part of the exhibition, and they can control light and sound with their own input.
The most interesting part of this project is that the creator uses light and sound to create a connection between space, art, and the human mind.
The program and digital interface store each visitor’s thoughts as they experience the piece, and the stored data in turn influences the surrounding environment.
I also admire that the author tried to think about how thoughts are preserved.

Here is the link: https://www.creativeapplications.net/environment/light-sound-synthesis-in-conversation-with-amay-kataria/

LookingOutwards-04

I was looking at Google’s new AR Synthesizer project. They’re experimenting with a form of AR music production by replicating famous synthesizers with CAD and AR. I think it’s an interesting step into the VR and AR world; with Facebook pushing the Metaverse so hard, those technologies are becoming increasingly important. Virtual raves and dance parties with VR DJing may not be far away, especially if monkeypox picks up and people are stuck inside again.

It’s also interesting in the context of recently broadened access to music production through technology. Desktop producing on a laptop is extremely viable, and with the advancements in software and computing, most people have access to the tools to make music. Obviously AR and VR aren’t very accessible right now, but maybe in 20 years everyone will be using AR to work with complex sound-engineering tools cheaply.

https://artsandculture.google.com/story/7AUBadCIL5Tnow

Project-04: String Art-Section B

All Seeing Eye.

sketch
/*
* Evan Stuhlfire
* estuhlfi@andrew.cmu.edu
* Section B
*
* Project-04: String Art
* This program uses geometric shapes to create
* string art.
*/

function setup() {
    createCanvas(400, 300);
    background(73, 80, 87); //davys grey from coolors.com

}

function draw() {   
    // draw the string lines for the frame in light blue
    stringLeftLower(204, 255, 255, 55);
    stringLeftUpper(204, 255, 255, 55);
    stringRightUpper(204, 255, 255, 55);
    stringRightLower(204, 255, 255, 55);

    // move origin of canvas to center
    translate(width/2, height/2);

    // draw the circle design with light blue
    stringCircle(204, 255, 255); 
    noLoop();
}

/* draw the string pattern at the lower left */
function stringLeftLower(r, g, b, t) {
    var numLines = 40;
    var x1 = 0; // start at top left
    var y1 = 0; // top left
    var y2 = height;
    var xInc = width/numLines;
    var x2 = xInc;
    var yInc = height/numLines;

    stroke(r, g, b, t);
    // iterate over each line to draw
    for (var index = 0; index < numLines; index++) {
        line(x1, y1, x2, y2);
        y1 += yInc;
        x2 += xInc;
    }
}

/* draw the string pattern at the upper left */
function stringLeftUpper(r, g, b, t) {
    // set vars to start at lower left and draw up
    var numLines = 40;
    var x1 = 0;
    var y1 = height; // lower left
    var y2 = 0;
    var xInc = width/numLines;
    var x2 = xInc;
    var yInc = height/numLines;

    stroke(r, g, b, t);
    // iterate over each line to draw
    for (var index = 0; index < numLines; index++) {
        line(x1, y1, x2, y2);
        y1 -= yInc; // move up the canvas
        x2 += xInc; // move across the canvas
    }
}

/* draw the string pattern at the upper right */
function stringRightUpper(r, g, b, t) {
    var numLines = 40;
    var xInc = width/numLines;
    var x1 = xInc;
    var x2 = width;
    var y1 = 0;
    var y2 = 0;
    var yInc = height/numLines;

    stroke(r, g, b, t);
    // iterate over each line to draw
    for (var index = 0; index < numLines; index++) {
        line(x1, y1, x2, y2);
        y2 += yInc; // move down the canvas
        x1 += xInc; // move right across the canvas
    }
}

/* draw the string pattern at the lower right */
function stringRightLower(r, g, b, t) {
    // set variable
    var numLines = 40;
    var x1 = width; // right side
    var x2 = 0;
    var xInc = width/numLines;
    var yInc = height/numLines;
    var y1 = height - yInc; // start near the bottom right
    var y2 = height;

    stroke(r, g, b, t); // set color and transparency
    // iterate over each line to draw
    for (var index = 0; index < numLines; index++) {
        line(x1, y1, x2, y2); 
        y1 -= yInc; // move up the canvas
        x2 += xInc; // move right across the canvas
    }
}

/* draw the center string circle */
function stringCircle(r, g, b) {
    // 36 spokes on the circle design
    var circlePoints = 36;
    var angle = 0;
    var rotDeg = 0;

    // iterate over each spoke
    for (var index = 0; index < circlePoints; index++) {
        // save settings
        push();

        // map the angle to the perimeter of the circle
        angle = map(index, 0, circlePoints, 0, TWO_PI);

        // convert angle to x y coordinates
        var radius = 90;
        var circleX = radius * cos(angle);
        var circleY = radius * sin(angle);

        // move origin to the starting point of the circle
        translate(circleX, circleY);

        // rotate each spoke to the origin
        rotate(radians(rotDeg));

        // variables for drawing string design
        var circleX2 = -radius * 2;
        var circleY2 = 0;
        var smallCircleDiam = 10;
        var offset = 15;

        // draw small circles at end of spokes
        stroke(r, g, b, 255);
        circle(0, 0, smallCircleDiam * .2);
        noFill();
        circle(0, 0, smallCircleDiam); // outline

        // set stroke color and decrease transparency to
        // see more detail.
        stroke(r, g, b, 125); 

        // draw three lines from each perimeter point to
        // create spokes
        line(0, 0, circleX2, circleY2);
        line(0, 0, circleX2, circleY2 + offset);
        line(0, 0, circleX2, circleY2 - offset);

        // extend lines off of spokes
        stroke(r, g, b, 50);
        line(0, 0, offset * 8, circleY2);
        line(0, 0, offset * 8, circleY2 + offset);
        line(0, 0, offset * 8, circleY2 - offset);

        // call function to draw the background circles with
        // transparency
        backgroundCircles(index, offset, r, g, b, 80);

        pop(); // restore settings 
        rotDeg += 10; // rotate 10 degrees (360/36)
    }
}

/* draw the background circles with design */
function backgroundCircles(index, offset, r, g, b, t) {
    // save settings
    push();
    stroke(r, g, b, t); // light blue with transparency
    // reset origin, space circles out
    translate(25, 0);

    // draw small inner circle on even spoke
    if (index % 2 == 0) {           
        circle(0, 0, 20);
        circle(110, 0, 70);
    } else {
        var diam = offset * 4; // set diameter
        // draw bigger circle on odd spoke
        circle(offset * 3, 0, diam);

        // string design of four circles inside each 
        // bigger circle
        var shiftValue = 10;
        circle(offset * 3, -shiftValue, diam/2);
        circle(offset * 3, shiftValue, diam/2);
        circle(offset * 3 + shiftValue, 0, diam/2);
        circle(offset * 3 - shiftValue, 0, diam/2);
    }
    pop(); // restore settings
}

Looking Outwards 04 – Sound Art

Material Sequence – Physical materiality of sound
Mo H Zareei
19/11/2021

material sequencer (aluminium)
material sequencer (copper)
material sequencer (steel)

The project I’ve chosen is the “Material Sequencer: Physical materiality of sound” by Mo H Zareei. The sequencer itself is quite simple, composed mainly of four parts: control switches, an onboard dial, an actuator/solenoid, and a material block. The control switches set the beat pattern within an 8-step rhythmic sequence in which the solenoid strikes the material block; the onboard dial controls the tempo of the sequence; and the material block determines the timbre of the sounds generated. Just by watching it operate, the function of each component is easily discernible, laying out the conversion from input to electrical to kinetic to sonic output in a clear and easily digestible manner, demystifying an otherwise ‘black box’ process. I admire the simplicity of the material artifact and the elegance with which it incorporates exploration of the sound profiles of various materials. Zareei positions this work as “a reductionist celebration of unadorned raw material through rigorous functionalism” that takes “the sequencing process outside the black box and into the acoustic realm, flaunting its materiality and physicality.”

material sequencer (wood)

From my understanding, each switch has an individual hard-coded behavior (i.e., only firing once every 5, 7, 10, etc. seconds), so toggling different switches at the same time allows for various combinations of beat patterns by mixing those behaviors. The creator’s artistic sensibilities are reflected in the final sonic output through the choice of materials, the switch combinations, the choice of solenoid, and interaction with the tempo dial, so that the output can be manipulated according to the user’s preferences.
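My reading of the sequencer can be modeled in a few lines: a pattern of eight on/off switches, a tempo that sets the step length, and a tick that reports whether the solenoid should strike. This is a sketch of the concept only, not Zareei's actual firmware, and the eighth-note step length is an assumption.

```javascript
// Minimal model of an 8-step trigger sequencer: pattern[i] says whether
// the solenoid strikes on step i; bpm sets how fast the steps advance.
function makeSequencer(pattern, bpm) {
  const stepMs = 60000 / bpm / 2; // assumed eighth-note steps
  let step = 0;
  return {
    tick() {
      const fire = pattern[step] === 1;  // should the solenoid strike now?
      step = (step + 1) % pattern.length; // wrap back to the first step
      return fire;
    },
    stepMs,
  };
}

const seq = makeSequencer([1, 0, 0, 1, 0, 0, 1, 0], 120);
const strikes = Array.from({ length: 8 }, () => seq.tick());
console.log(strikes); // one full pass through the pattern
```

Swapping the material block changes only the timbre of each strike; the pattern, tempo, and trigger logic stay exactly the same, which is what makes the piece so legible.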

Links:
https://www.creativeapplications.net/sound/material-sequencer-physical-materiality-of-sound/

LO 4 – Sound Art

I really like the albums that TheFatRat, a music producer, has made, because I enjoy his style and his unique use of special sound effects. (I’ll put a link to one of his songs, Unity, here.) Most digital music producers use a DAW (digital audio workstation) to make their music, and I’ll briefly explain how. First, they lay down chords on track lists (which look like a timeline) using pre-recorded chords from different instruments; TheFatRat mostly uses pre-recorded game sound effects, usually electronic sounds from programmable sound-generator chips. These musical phrases are arranged like building blocks so they fit together to form a rhythm. After arranging the notes, producers adjust the audio levels of each track (usually a different instrument per track) to make them sound right; this process is called the mix stage. In between stages, the producer can easily add effects or change chords (like layering on top of the bass note), keeping the music original and flexible to change.

One example of a DAW Program – Cubase
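The mix stage described above boils down to a level-weighted sum of the tracks' samples. Real DAWs like Cubase add panning, buses, and effects chains on top, so take this as a sketch of the core idea only.

```javascript
// Sketch of a DAW's mix stage: for one instant in time, each track
// contributes its sample scaled by its fader level, and the result is
// clipped to the valid -1..1 sample range.
function mixFrame(trackSamples, levels) {
  let out = 0;
  for (let i = 0; i < trackSamples.length; i++) {
    out += trackSamples[i] * levels[i]; // apply each track's fader
  }
  return Math.max(-1, Math.min(1, out)); // hard-clip the summed signal
}

// two tracks at this instant: drums loud, pad turned down
console.log(mixFrame([0.8, 0.6], [1.0, 0.25])); // ≈ 0.95
```

Rendering a whole song is just this computation repeated for every sample frame on the timeline, which is why changing one fader re-balances the entire mix.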

Looking Outwards-04-Section B

Experiments in Musical Intelligence, EMI or EMMY, is a program that analyzes musical compositions and generates entirely new compositions that emulate the sound, style, mood, and emotion of the original. Written by composer, author, and scientist David Cope, this project allows entirely new compositions to be algorithmically generated in the style of any composer. Compositions have been generated in the styles of Bach, Beethoven, Chopin, and many more, including Cope himself. In fact, Cope’s original inspiration for the software project was writer’s block: he was stuck and wanted to identify his own compositional style.

Although the software is data-driven and bases its compositions on works by the original composer, it never repeats or copies the original work; the compositions generated are unique. Cope’s software deconstructs the original works, then records their time signatures. The final step runs the data through a recombinant algorithm for which Cope holds a patent.
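Cope's patented recombinant algorithm is far more sophisticated, but the spirit of "new music recombined from deconstructed originals" can be sketched with a first-order Markov chain over notes: learn which note follows which in the source, then walk those transitions to emit a sequence that follows the source's style without literally replaying it. This is entirely my own toy stand-in, not EMMY.

```javascript
// Learn, from a note sequence, which notes were observed to follow which.
function buildTransitions(notes) {
  const next = {};
  for (let i = 0; i < notes.length - 1; i++) {
    (next[notes[i]] = next[notes[i]] || []).push(notes[i + 1]);
  }
  return next;
}

// Walk the learned transitions to emit a new sequence. pick() chooses
// among observed successors; deterministic here, random in practice.
function recombine(notes, length, pick = xs => xs[0]) {
  const next = buildTransitions(notes);
  const out = [notes[0]];
  while (out.length < length) {
    const options = next[out[out.length - 1]];
    if (!options) break;     // dead end: no observed successor
    out.push(pick(options));
  }
  return out;
}

console.log(recombine(["C", "E", "G", "E", "C"], 6));
```

Every adjacent pair in the output was seen somewhere in the source, yet the whole sequence need not appear there, which is a miniature of the Turing-Test effect described below.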

This project was truly revolutionary for its time. It inspires questions about creativity and the mind. Originally written in the LISP programming language in the 1980s, it has been modified to use AI techniques as they have advanced. Interestingly, the generative compositions have been used in a type of Turing Test. One particular test set out to see if audiences could identify which of three compositions was actually composed by Bach, which was an emulated composition written by a human, and which was generated by a computer. Audiences chose the EMMY generated composition as the actual Bach. Perhaps EMMY is the first piece of software to pass the Turing Test.

To learn more about David Cope and EMMY click here and here.