anabelle’s blog 04

Since middle school, I’ve been a huge Vocaloid fan. Vocaloid is singing-synthesizer software, originally created to help performers and producers add vocals to their music even without access to a real singer. Vocaloid expands beyond the software, however, and includes “characters” for each voicebank, with popular examples including Hatsune Miku and Kagamine Rin & Len. Initially released in 2004 through a project led by Kenmochi Hideki in Barcelona, the Vocaloid software continues to be updated and rereleased today, with numerous versions and iterations of the same characters given new singing abilities. If popular enough, Vocaloids are also given 3D holograms capable of holding concerts in real-life venues (with really great turnout). I think the core algorithm is fairly simple: a commissioned singer records the base notes for a character, which producers can then modify and edit. What I love about Vocaloid is how each character is given vocal “limitations” that produce its own unique sound. For example, Teto is best used for soft, low-energy ballads, while Kaito’s deeper voice sounds distorted in higher registers.

Here’s an example of a Vocaloid concert; the turnout is actually crazy for these things:

Link to Vocaloid website (anyone can buy the software): https://www.vocaloid.com/en/
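The repitching step that producers start from can be sketched very roughly (this is my own simplified illustration, not Yamaha’s actual synthesis, which is far more sophisticated): playing a recorded note back at a rate of 2^(semitones/12) shifts its pitch by that many semitones.

```javascript
// Hypothetical sketch of repitching a recorded note (my own
// illustration, not Vocaloid's actual algorithm). Playing a sample
// back at rate 2^(semitones/12) shifts its pitch by that interval.
function playbackRate(baseMidiNote, targetMidiNote) {
  return Math.pow(2, (targetMidiNote - baseMidiNote) / 12);
}

// A singer's recording of A4 (MIDI 69) repitched to other notes:
const rates = [69, 72, 76].map(n => playbackRate(69, n));
console.log(rates); // 1 for A4, ~1.189 for C5, ~1.498 for E5
```

Real voicebanks do far more than this, stitching recorded phonemes together and smoothing the transitions, which is part of where each character’s distinctive sound and “limitations” come from.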

Looking Outwards-04

Since starting to research different computational artists for these blog posts, I have been amazed by the flexibility of coding software and its applications in art. This week’s topic, Sound Art, led me to discover Renick Bell, an American musician, programmer, and teacher based in Tokyo. Bell is famous for his live-coded music performances, which have become a recent phenomenon in the underground electronic music scene known as ‘Algorave’.

Bell first used SuperCollider, a programming platform for audio synthesis and algorithmic composition, to create applications that produced generative music. He realized he didn’t need the graphical part of the interface, and began to focus more on the manipulation of symbols, creating abstractions. Bell then started live coding with his self-built system ‘Conductive’, typically playing at tempos of 130 to 160 bpm. I can see Bell’s artistic sensibilities manifest in his work because he creates a utopian and complex experience through his design, contextual symbols, and storytelling.
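The scheduling math underneath any tempo-driven live-coding setup is simple to caricature (my own toy illustration; Conductive’s real system is far richer):

```javascript
// A toy look at the scheduling math behind tempo-driven live coding
// (my own illustration, not Conductive's actual API).
function beatIntervalMs(bpm) {
  return 60000 / bpm; // milliseconds per beat
}

// Onset times (in ms) of the active steps of an on/off pattern.
function patternOnsets(pattern, bpm) {
  const interval = beatIntervalMs(bpm);
  return pattern
    .map((on, i) => (on ? i * interval : null))
    .filter(t => t !== null);
}

// Four evenly spaced hits at 150 bpm, right in Bell's range:
patternOnsets([1, 0, 1, 0, 1, 0, 1, 0], 150); // [0, 800, 1600, 2400]
```

A live coder edits patterns like this while they play, which is what makes the performance itself the instrument.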

I admire Renick Bell’s live-coded performance at Algorave Tokyo in 2016. I’m amazed at how fast he is able to code, and at how he orchestrates the code on its own into a literal concert (which I wouldn’t mind attending). Bell has truly created a trans-disciplinary tool for innovative collaborative live coding.

Looking Outwards 04

Graham Murtha

Section A

In the project entitled “Material Sequencer” by Mo H. Zareei, mechanical processes and rhythm are synthesized to create a sort of exposed metronome. Many people own metronomes, but few consider the complexity of the functions working inside the little box. This project provides a visual for users, who get to experience the mechanisms required to make consistent or dynamic rhythms. While watching demos of the machine in action, I noticed that the variability of the system took up the majority of the working parts. The sequencer has a DIP switch that toggles between 8 different rhythmic patterns, as well as a dial for tempo. The circuit board takes both of these inputs from the user and transfers them to a Teensy 3.2 board, which interprets the 8 patterns as on/off functions. In this vein, the machine conspicuously displays how large a role programming plays in a machine as simple as a metronome. The project also places a heavy emphasis on materiality, connecting raw materials that have been musically utilized for centuries with the new era of mechanism and programming.
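The on/off interpretation described above can be sketched in a few lines (purely my own illustration of the idea, not Zareei’s actual firmware; the patterns are invented):

```javascript
// Hypothetical illustration of the on/off idea: each rhythm is an
// 8-step pattern stored as one byte, and a switch position selects
// which byte plays (not Zareei's actual firmware).
const PATTERNS = [
  0b10101010, // steady alternating hits
  0b10001000, // downbeats only
  0b10110100, // a syncopated pattern
];

// Is step i (0-7) of the selected pattern an "on" beat?
function stepOn(patternIndex, i) {
  return ((PATTERNS[patternIndex] >> (7 - i)) & 1) === 1;
}

stepOn(1, 0); // true: first step of "downbeats only" fires
stepOn(1, 1); // false: second step is silent
```

Storing each pattern as a single byte is exactly the kind of compact on/off representation a small board like the Teensy handles easily.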

Rowing boats

My string art uses loops to generate wave patterns that form an ocean. In the process, I tried to control the lines carefully by adjusting the values of their start and end points.

//Jason Jiang
//Section E

//Setting variables 
var dx1;
var dy1;
var dx2;
var dy2;
var numLines = 10;

//Creating Sky
function setup() {
    createCanvas(300, 400);
    background(135, 206, 235, 50);
}
    


function draw() {
    
    //Waves
    stroke(0, 100, 184);
    var x1 = -30;
    var y1 = 300;
    var x2 = 600;
    var y2 = 500;
    strokeWeight(0);
    line(-30, 300, -50, 400);
    line(600, 300, 400, 500);
    strokeWeight(1);
    dx1 = (-50-(-30))/numLines;
    dy1 = (400-300)/numLines;
    dx2 = (600-300)/numLines;
    dy2 = -(500-300)/numLines;
    for (var i = 0; i <= numLines; i += 1) {
        strokeWeight(i*0.1)
        line(x1, y1, x2, y2);
        x1 += dx1;
        y1 += dy1;
        x2 += dx2;
        y2 += dy2;
    }
    

    x1 = 0;
    y1 = 300;
    x2 = 500;
    y2 = 500;
    strokeWeight(0);
    line(0, 300, -50, 400);
    line(400, 150, 500, 500);
    strokeWeight(1);
    dx1 = (-50-0)/numLines;
    dy1 = (400-300)/numLines;
    dx2 = (500-400)/numLines;
    dy2 = -(500-150)/numLines;
    for (var i = 0; i <= numLines; i += 1) {
        strokeWeight(i*0.1)
        line(x1, y1, x2, y2);
        x1 += dx1;
        y1 += dy1;
        x2 += dx2;
        y2 += dy2;
    }


    x1 = -100;
    y1 = 300;
    x2 = 400;
    y2 = 300;
    strokeWeight(0);
    line(-100, 300, -250, 500);
    line(300, 200, 400, 300);
    strokeWeight(1)
    dx1 = (-150+50)/numLines;
    dy1 = (500-300)/numLines;
    dx2 = -(500-400)/numLines;
    dy2 = -(300-200)/numLines;
    for (var i = 0; i <= numLines; i += 1) {
        strokeWeight((numLines-i)*0.1)
        line(x1, y1, x2, y2);
        x1 += dx1;
        y1 += dy1;
        x2 += dx2;
        y2 += dy2;
    
    }


    x1 = -100;
    y1 = 300;
    x2 = 400;
    y2 = 320;
    strokeWeight(0);
    line(-100, 320, -350, 450);
    line(200, 350, 150, 320);
    strokeWeight(1)
    dx1 = (-250-150)/numLines;
    dy1 = (450-250)/numLines;
    dx2 = -(350-300)/numLines;
    dy2 = -(400-350)/numLines;
    for (var i = 0; i <= numLines; i += 1) {
        strokeWeight((numLines-i)*0.1)
        line(x1, y1, x2, y2);
        x1 += dx1;
        y1 += dy1;
        x2 += dx2;
        y2 += dy2;
    
    }
    

    x1 = -150;
    y1 = 200;
    x2 = 350;
    y2 = 350;
    strokeWeight(0);
    line(-150, 200, -250, 400);
    line(300, 300, 350, 350);
    strokeWeight(1)
    dx1 = (-250+100)/numLines;
    dy1 = (300-100)/numLines;
    dx2 = -(500-400)/numLines;
    dy2 = -(300-200)/numLines;
    for (var i = 0; i <= numLines; i += 1) {
        strokeWeight((numLines-i)*0.05)
        line(x1, y1, x2, y2);
        x1 += dx1;
        y1 += dy1;
        x2 += dx2;
        y2 += dy2;
    
    }
    

    //Boat
    //Lower Part
    x1 = 85;
    y1 = 260;
    x2 = 130;
    y2 = 270;
    strokeWeight(0);
    line(85, 260, 90, 272);
    line(130, 270, 120, 280);
    strokeWeight(1)
    dx1 = 2*(90-85)/numLines;
    dy1 = 2*(272-260)/numLines;
    dx2 = 2*(120-130)/numLines;
    dy2 = 2*(280-270)/numLines;
    for (var i = 0; i <= 0.5*numLines; i += 1) {
        strokeWeight((numLines-i)*0.15)
        stroke(i);
        line(x1, y1, x2, y2);
        x1 += dx1;
        y1 += dy1;
        x2 += dx2;
        y2 += dy2;
    
    }
    //Upper Part
    x1 = 92;
    y1 = 259;
    x2 = 120;
    y2 = 265;
    strokeWeight(0);
    line(92, 261, 110, 230);
    line(120, 267, 110, 230);
    strokeWeight(1)
    dx1 = (110-92)/numLines;
    dy1 = (230-261)/numLines;
    dx2 = (110-120)/numLines;
    dy2 = (230-267)/numLines;
    for (var i = 0; i <= numLines; i += 1) {
        strokeWeight((numLines-i)*0.15)
        stroke(i+100);
        line(x1, y1, x2, y2);
        x1 += dx1;
        y1 += dy1;
        x2 += dx2;
        y2 += dy2;
    
    }

    //Sun
    x1 = 200;
    y1 = 59;
     //Outer Ring
     for (var i = 0; i <= 10*numLines; i += 1) {
        stroke(253, 184, 19, 50)
        strokeWeight(1)
        push()
        translate(x1, y1);
        rotate(radians(i*180/numLines));
        line(0, 0, 0, 100);
        pop()
 
    }
    //Inner Ring
    for (var i = 0; i <= 10*numLines; i += 1) {
        stroke(200, 92, 60)
        strokeWeight(1)
        push()
        translate(x1, y1);
        rotate(radians(i*0.5*180/numLines));
        line(0, 0, 0, 50);
        pop()
    }

    //Draw the scene only once
    noLoop();

}

Light and sound

This project is all about sound visualization. Amay Kataria visualizes sound with 24 light beams arranged like a synthesizer. Each time there is a sonic input, the corresponding light turns on; as different notes are performed, the lights flicker and continuously change their pattern. Light and sound are tied together in that, fundamentally, both are waves that mark their presence in the outer world. The piece also connects to the inner world, people’s minds, since user inputs control all of the patterns.

Performers controlling the installation

The artist showcases his artistic sensibility by developing a system that mimics the human brain: it receives inputs from multiple sources, stores them in a database, and interprets and expresses them through light and sound. Personally, I like how the system takes all users’ input into account at the same time, which emphasizes each individual as part of the community. It connects individuals anonymously in a way that highlights the hive-mind of humanity.

Looking Outwards – 04

Christian Marclay is an artist and composer with a strong interest in collage. In 2018, he used Snapchat videos and audio to compose and visualize a series of projects made up of collages of video cuts. In one project, a collection of phones hangs from the ceiling; as people speak to them, the software matches the frequency of their voice to Snapchat videos and displays them on the screens as a reply. In another, a piano’s keys are connected to sound clippings from Snapchat videos, and as a player presses the keys, the videos and their sound play on a large screen. All the projects in this collection are interactive; I think this makes them more interesting because Snapchat is an app most people are familiar with, and his project series allows them to use it in an entirely different way.
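The matching step in the phone piece might look something like this in miniature (purely my guess at the idea, not Marclay’s actual software; the clip names and frequencies are invented):

```javascript
// Hypothetical sketch of frequency matching: each clip is tagged
// with a dominant frequency, and the clip nearest the speaker's
// pitch is chosen as the "reply" (not Marclay's actual software).
const clips = [
  { name: "low hum", hz: 110 },
  { name: "speech", hz: 220 },
  { name: "whistle", hz: 880 },
];

function closestClip(voiceHz) {
  return clips.reduce((best, c) =>
    Math.abs(c.hz - voiceHz) < Math.abs(best.hz - voiceHz) ? c : best
  );
}

closestClip(250).name; // "speech"
```

Even this crude nearest-neighbor matching gives the feeling of a conversation, which is what makes the interaction compelling.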

Ryoji Ikeda – Test Pattern (2008)

Test Pattern (2008) is an audiovisual installation created by Japanese sound artist Ryoji Ikeda in collaboration with Tomonaga Tokuyama. The program behind Test Pattern picks up real-time audio signals and converts their waveforms into eight synchronized barcode patterns. The results are displayed on LCD screens accompanied by a 16-channel sound system projecting Ikeda’s futuristic, data-driven tunes. Flashing across the screen at speeds that peak at hundreds of frames per second, Ikeda’s installation challenges the boundaries of human perception. While Test Pattern’s soundscape was originally composed by Ikeda, the installation exists thanks to the work of Tokuyama, whose algorithm mapped sixteen channels of sound signals onto a grid matrix, turning them into blindingly hypnotic strobe flashes. In an article from Hero Magazine, Ikeda was described as a “cosmic polymath,” “sonic scientist,” and “matrix shaman.” I can’t help but agree. As an artist, I’ve always been mesmerized by the idea of subliminal audio and of creating work that exceeds human sensory limitations. Ryoji Ikeda’s Test Pattern is a direct testament to this possibility: a paradoxically serene audiovisual experience that brings its audience into fleeting moments of sensory transcendence.

https://www.ryojiikeda.com/project/testpattern/

Ryoji Ikeda, Test Pattern (2008)
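The waveform-to-barcode conversion can be caricatured in a few lines (my own one-bit reduction of the idea, not Tokuyama’s actual algorithm):

```javascript
// A toy version of the waveform-to-barcode idea (my own reduction,
// not Tokuyama's algorithm): quantize each audio sample to black (0)
// or white (1) by its sign, yielding a 1-bit stripe pattern.
function toBarcode(samples) {
  return samples.map(s => (s >= 0 ? 1 : 0));
}

// One cycle of a coarse sine wave becomes alternating stripes:
const wave = [0.5, 0.9, 0.5, -0.5, -0.9, -0.5];
toBarcode(wave); // [1, 1, 1, 0, 0, 0]
```

Run on live audio and redrawn hundreds of times per second, even a mapping this crude would already produce the strobing black-and-white stripes the installation is known for.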

‘FORMS – String Quartet’ by Playmodes

Alexia Forsyth

Section A

‘FORMS – String Quartet’ is an automated emulation of string musicians. Not only is the music beautiful, but the visual display is stunning and captures the essence of the piece. The artists use a color code that identifies the instruments on the score and randomizes them. The generator is known as “The Steaming Bot”. The graphics are then transformed into sound. “The Steaming Bot” uses compositions from John Cage, György Ligeti, Mestres Quadreny, and Karlheinz Stockhausen. Every part of the network ensemble plays instrumental roles such as rhythm, harmony, and texture. The artists explain, “Images become sound spectrums, making it possible to hear what you see” (Playmodes 8). Their artistic sensibility shows through the visualization of the musical chords and how closely the graphics relate to them.

Video: https://vimeo.com/553653358?embedded=true&source=video_title&owner=7721912
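The “images become sound spectrums” idea can be sketched minimally (my own illustration, not Playmodes’ actual engine): each pixel of an image column is treated as one frequency band whose amplitude is the pixel’s brightness.

```javascript
// Hypothetical sketch of image-to-spectrum sonification: each pixel
// row of a column is a frequency band, brightness (0-255) sets that
// band's amplitude (my own illustration, not Playmodes' engine).
function columnToSpectrum(columnBrightness, minHz, maxHz) {
  const n = columnBrightness.length;
  return columnBrightness.map((b, i) => ({
    // bands spread evenly from minHz (bottom) to maxHz (top)
    hz: minHz + ((maxHz - minHz) * i) / (n - 1),
    amplitude: b / 255, // 0..255 brightness to 0..1 amplitude
  }));
}

// A column that is bright only at the top yields one high partial:
columnToSpectrum([0, 0, 0, 255], 100, 1000);
```

Scanning across an image column by column and synthesizing each resulting spectrum in turn is what lets you literally hear the score as it scrolls past.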

Srishty’s Looking Outwards: Sound Art

Purform’s White Box, Audio Visual Performance

The project I found is a performance called “Purform: White Box.” The performance was programmed by Jean-Sébastien Rousseau and Peter Dines, with music by Alain Thibault and visuals by Yan Breuleux. The performance consists of three white rectangular screens, angled together to form a wall, displaying black parametric and geometric visuals that twist, break, and undergo many other transformations. When I first watched the video, the first thing that struck me was how different the audio sounded with and without earbuds. When I wore my AirPods, I could hear the movement of the sound spatially because of their spatial-audio feature. This made me think not only about the computational aspects of the audio within the project, but also about the technology we use to interpret it.

The music of the performance matched the visuals tightly, creating a surreal and daunting experience. The visual artist correlated sharp breaks with the musician’s staccato notes, and created vibrations and faster tempos based on the speed of the visuals. The darkness of the exhibit room allowed the transforming visuals to stand out as they contrasted against the white background.

The main software technique referenced by the title is white box testing, a technique in which the software’s internal structure, design, and coding are tested to verify the flow of inputs and outputs and to improve design and usability. White Box is a new piece of software built on an older way of generating A/V compositions in real time, and it is a new entry in a cycle that began with Black Box, which exhibits inputs, outputs, and transfer functions. Purform mixes two layers with their video tapes. Using Quartz Composer compositions, the programmers can easily change the relationship between the music and the video, as the piece is constructed from a database of clips controlled with Lemur.


Supersynthesis

Hannah Wyatt, Section A

Amay Kataria’s “Supersynthesis” strives for a new form of connection through the unity of light and sound. I admire the interactive element, which allows visitors to control the wave, sound, and light patterns first-hand. Participants make edits in real time, encouraging constant evolution of the piece and juxtaposing this innate action with advanced, intricate programming. The project assigns 24 pitches to 24 light sources, LEDs arranged across a physical wave, and Kataria intends to connect the audience with nature and with each other in this format. Because the piece combines the interactions of everyone with the website open, Kataria calls this “communal computing,” a new, revolutionary method of social communication.
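The pitch-to-light mapping described above might be sketched like this (my own illustration; the indexing and brightness scheme are assumptions, not Kataria’s actual values):

```javascript
// A minimal sketch of a 24-pitch-to-24-light mapping (my own
// illustration; indices and brightness scaling are assumptions,
// not Kataria's actual implementation).
const NUM_LIGHTS = 24;

// Map a pitch index (0-23, low to high) to a light, and give each
// light a loudness-driven brightness from 0 to 255.
function lightState(pitchIndex, loudness) {
  if (pitchIndex < 0 || pitchIndex >= NUM_LIGHTS) return null;
  return { light: pitchIndex, brightness: Math.round(255 * loudness) };
}

lightState(12, 0.5); // { light: 12, brightness: 128 }
```

Summing every connected user’s contribution before computing each light’s state would be one simple way to realize the “communal computing” Kataria describes.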