LO 10 – Computational Music

The computational music project I looked at was begun by American bandleader, engineer, and inventor Raymond Scott, and reimagined by Yuri Suzuki, a Japanese designer and inventor. The machine generates an instantaneous performance-composition using Google Magenta’s AI software, which runs a neural network trained on Bach chorales to model the harmonic relationships between sounds and generate new musical material. The machine itself visually displays the sequence and rhythm as it plays, adding a layer of visualization to the performance. I think it is fascinating to see music composed and displayed in such a way, with all of its components lighting up on a screen to curate the performance.
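As a rough illustration of the underlying technique (not Suzuki’s actual code), here is a minimal sketch using Magenta.js’s MusicRNN, a simpler melody model than the chorale-trained network the project likely uses. It assumes the @magenta/music library is loaded as the global mm; the checkpoint URL is Magenta’s published basic_rnn checkpoint.

// Hedged sketch: ask a Magenta neural network to continue a seed melody.
// Assumes @magenta/music is loaded as the global `mm`.
const model = new mm.MusicRNN(
  'https://storage.googleapis.com/magentadata/js/checkpoints/music_rnn/basic_rnn'
);

// A one-note seed melody: middle C for half a second
const seed = {
  notes: [{ pitch: 60, startTime: 0, endTime: 0.5 }],
  totalTime: 0.5,
};

model.initialize().then(async () => {
  // Quantize the seed to 4 steps per quarter note, then ask for 32 new steps
  const quantized = mm.sequences.quantizeNoteSequence(seed, 4);
  const continuation = await model.continueSequence(quantized, 32, 1.0);
  new mm.Player().start(continuation); // play the generated notes
});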

LO 10 – Computer Music

For this week’s Looking Outwards, I decided to take a look at the work of British singer-songwriter and audio engineer Imogen Heap. I specifically looked at her “MI.MU gloves” (2014) and the various ways that she uses this innovative computational musical instrument to compose and perform her music. MI.MU gloves are a wearable instrument that uses mapping technologies to translate hand movements into musical control. For instance, specific hand movements trigger changes in pitch, filters, and scales. The technical elements of the gloves include flex sensors, orientation sensors, a Wi-Fi device, and software that uses MIDI and OSC to map movements to sounds.
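To sketch the core idea of mapping movement data to sound, here is a minimal p5.js example (assuming the p5.sound library). The mouse position stands in for the gloves’ orientation and flex readings; the real gloves send MIDI/OSC messages to external software rather than synthesizing audio directly.

// Minimal sketch of movement-to-pitch mapping, with the mouse standing in
// for glove sensor data (the real gloves send MIDI/OSC instead).
let osc;

function setup() {
  createCanvas(400, 400);
  osc = new p5.Oscillator('sine');
  osc.start();
  osc.amp(0);
}

function draw() {
  background(220);
  // vertical position picks a pitch, like tilting a hand up or down
  let midiNote = floor(map(mouseY, height, 0, 48, 72)); // C3 up to C5
  osc.freq(midiToFreq(midiNote));
  // horizontal position controls volume, like opening or closing a fist
  osc.amp(map(mouseX, 0, width, 0, 0.5), 0.1);
}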

I find this project fascinating and admire Heap’s work because of how revolutionary it is in bridging the gap between the analog and the digital by creating a more natural relationship between the artist and the computer. The gloves completely transform musical performance and experience, allowing artists to incorporate sound and movement seamlessly. They are also somewhat accessible, as they are available for purchase online, and many musical artists have used the gloves in their music and performances.

MI.MU gloves
how MI.MU gloves were developed
Imogen Heap performance; at 9:05, she describes and gives a demo with the gloves

LO 10: Computer Music

American Folk Songs Album

Benoit Carré is a French musician who created SKYGGE, his avatar alias for AI-generated music. The album “American Folk Songs” was released in 2019 and made using the Flow Machines tools developed by Sony CSL. On this album, Carré “revisits American traditional folk songs with a prototype of an AI harmonization tool.” It takes a cappella recordings from many classic American folk singers and uses AI “to flesh out the melodies and the lyrics of the songs, enriching them with lush harmonies and sounds generated by AI that have never been heard before.” I really admire Carré’s work, as it is an intriguing marriage of old folk music and modern electronic music. One song that I really enjoyed from the album is “Black Is the Color,” featuring the voice of Pete Seeger, a legendary folk singer. I was first introduced to Seeger during high school, as he was a proud alumnus of the school. Hearing his voice with a new twist is therefore very cool and interesting to me.

Creator: Benoit Carré

Year: 2019

Link: https://open.spotify.com/album/6NbX54oOpEZhSOjfdSYepw?si=qh6e45bQTMSh1LC4IX-R6w

LO 10 – Dubler Microphone

I am so excited that we are discussing sound synthesis in this class, since in my free time I produce music for fun. Recently, I have been fascinated by sound synthesis and how computers can collaborate with a musician to make new sounds possible. While brainstorming what to write about for this LO, I thought of Andrew Huang, a Canadian YouTuber and music producer who is particularly well known for sampling everything and making unexpectedly delightful compositions out of those samples. His YouTube channel provides a wealth of entertaining information about production techniques and cool new gear he discovers, so it was hard for me to select a single thing to discuss in this blog. That being said, his most recent video, about the Dubler microphone from Vochlea, absolutely fascinated me.

This microphone ‘instantly’ turns audio input into MIDI information. The coolest thing about this is that you train the software to recognize the different sounds you sing into it, so you can beatbox into the mic and hear live playback of a full drum kit. This is absolutely INSANE because it can transform the way a musician performs and records music. For example, a beatboxer could change the sound of their instrument (their mouth) in a live performance by using this mic to switch between an ’80s-style drum kit and a Skrillex-style drum kit with the click of a button. Or you could record a virtual guitar solo with your voice if you don’t like playing a keyboard. I absolutely love how this invention uses technology to change the way people can manipulate sound, in a way that seemed like fantasy until only recently. I think it relies heavily on the software’s ability to differentiate between syllables and vowels. With recent developments in voice-controlled experiences like Alexa and Siri, it makes sense to me that technology is more capable than ever of precisely distinguishing and interpreting audio information.
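As a rough sketch of the audio-to-MIDI idea (not Vochlea’s actual algorithm, which also classifies the type of sound being made), here is a minimal p5.js example assuming the p5.sound library: it finds the loudest frequency in the mic input and converts it to the nearest MIDI note number.

// Hedged sketch of audio-to-MIDI conversion: find the loudest frequency
// bin in the mic signal and convert it to the nearest MIDI note number.
// A crude illustration, not Vochlea's actual method.
let mic, fft;

function setup() {
  createCanvas(200, 100);
  mic = new p5.AudioIn();
  mic.start();
  fft = new p5.FFT();
  fft.setInput(mic);
}

function draw() {
  background(0);
  let spectrum = fft.analyze(); // amplitudes of 1024 frequency bins
  // find the loudest bin
  let maxAmp = 0;
  let maxIndex = 0;
  for (let i = 0; i < spectrum.length; i++) {
    if (spectrum[i] > maxAmp) {
      maxAmp = spectrum[i];
      maxIndex = i;
    }
  }
  // convert the bin index to a frequency in Hz, then to a MIDI note
  let freq = maxIndex * (sampleRate() / 2) / spectrum.length;
  let midiNote = round(freqToMidi(freq)); // p5.sound helper
  fill(255);
  text('MIDI note: ' + midiNote, 10, 50); // would be sent to a synth or drum kit
}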

Some honorable mentions for this blog post that showcase other musicians using electronics to challenge traditional music practices (some of my favorite videos of all time):

  • Electronic Music for ORCHESTRA – Composer David Bruce interprets electronic composition for an orchestra, challenging the way sound for orchestra is typically conceptualized and performed. BEAUTIFUL!!!

Project 9

let img;


function preload() {
    img = loadImage("https://i.imgur.com/CPxq6MI.jpg");
}


function setup() {
    createCanvas(480, 480);
    background(110, 16, 0);
    img.resize(width, height);
    img.loadPixels();
    imageMode(CENTER);
    //image(img, 240, 240);
    frameRate(1000); //very high frame rate so the dots fill in quickly (capped by the display)
}

function draw() {
    let x = floor(random(img.width)); //x position of pixels
    let y = floor(random(img.height)); //y position of pixels
    let shapePixel = img.get(x, y); //get pixels from photo
    let textPixel = img.get(mouseX, mouseY); //get pixel from mouse location
    //randomly generated pixels that fill up the page
    let shapeR = random(15);
    noStroke();
    fill(shapePixel);
    circle(x, y, shapeR);


    //draw star shapes when mouse is pressed
    if (mouseIsPressed) { 
        fill(textPixel);
        textSize(random(5, 40));
        text('★', mouseX, mouseY); //star shape follows mouse
    }
}

//erases when double click
function doubleClicked() {
    fill(110, 16, 0);
    circle(mouseX, mouseY, 70);
}

This project is dedicated to celebrating my grandfather, who had just received a medal marking the 70th anniversary of his service in the war. As you press your mouse and move it across the screen, stars will follow, and if you double-click you can erase part of the drawing.

Looking Outwards: 09

I really like the artist Roman Bratschi featured in hollyl’s LO 5. The colorful rendering really caught my eye, and after reading Holly’s interpretation I appreciated the work a lot more. I am also really intrigued by the textures and materiality shown in this composition, as well as how the combination of color and pattern can make the imagery feel realistic and quite surreal at the same time. I think what’s really interesting about Bratschi’s work is that while the texture is rendered in such detail that it portrays how we see the material in real life, the wax-like quality almost looks like it is only possible in a digital reality.

Roman Bratschi on Behance

Project-09-Portrait

let foto;
let tak;

function preload() {
  foto = loadImage('https://i.imgur.com/3x32MS3.jpg?1'); //load image
  tak = "dickson"; //establish the text to draw
  print(tak);
}

function setup() {
  createCanvas(480, 480);
  imageMode(CENTER); 
  noStroke();
  background(240);
  foto.loadPixels();
}

function draw() {
  let x = floor(random(foto.width)); //randomizing where text pops up
  let y = floor(random(foto.height));
  let pic = foto.get(x, y); //[r, g, b, a] color at that point
  fill(pic[0], pic[1], pic[2], 128); //semi-transparent color matching the photo
  textSize(mouseX / 30); //adjustable text size
  text(tak, x, y);
}

function mousePressed() {
  filter(GRAY); //click to turn the drawing black and white
}

I thought it would be interesting to literally build myself out of my name. The piece represents my identity through both my name and my face. With each click, color is drawn away from the drawing, so it slowly comes to resemble me as just a face and less of a person.

LO-9

I found Isabel’s post on Ethmode’s solution for reducing material waste in the fashion industry super innovative. With rendering technology advancing rapidly in recent years, it seems relatively intuitive to bring fashion design into the virtual world, which goes to show how applicable rendering technology is to our modern world. There are so many ways rendering can change how we think about design. There are now companies that render construction sites and architectural designs so that clients can experience a place in virtual reality, an ideal way for clients to visit sites without traveling there, especially under COVID travel restrictions. The ways rendering can help us are essentially limitless.

LO 09

Iris Yip iyip
15-104 Section D

This week for Looking Outwards, I looked at Tian’s Looking Outwards post on Stamen Design’s interactive information visualizations. He specifically examined a map of bird populations in national parks. I wanted to write and learn about this topic this week because I felt that he really communicated the importance of interactive maps that allow people to look at an issue from multiple perspectives simultaneously. I think it’s a good way of communicating massive amounts of information with lots of different variables. This particular example definitely shows how interactive data can communicate through motion and interaction in ways that conventional data visualization might not.

Another project I was drawn to from the same studio is PENNY, which uses AI to estimate the wealth of an area from satellite imagery. While not guaranteed to be accurate, it looks for visual signs of a neighborhood’s socio-economic status in a directed way. Overall, I think the studio gives a lot of consideration to the unique ways digital information can be synthesized and used to make data collection and analysis easier.

LO – 09

The piece I looked at was Ryoji Ikeda’s 2013 work titled Test Pattern, originally posted by Maggie. The piece is a large-scale sound and light installation. It features huge flashing barcodes projected onto the floor (and, in some versions, the walls) of a large open space, accompanied by loud rhythmic sounds.

In the post, Maggie goes into detail about the physical setup of the space and the code behind it. To add to this, I will say that this piece initially caught my attention because of its scale. I am always drawn to huge installations like this, especially ones as sensorially intense as this one. I also love the abstract, open-to-interpretation quality of the piece. Ikeda is purposely vague about the meaning behind the work, stating, “I don’t really want to speak about any concepts. Because there are no concepts.”

Test Pattern 100m Version at Ruhrtriennale