The Computer Orchestra is an interactive orchestra made up of multiple computers. It was created by fragment.in, and its goal is to let users conduct their own orchestra and music. The conductor’s hand movements are recognized by an Xbox Kinect motion controller connected to a central computer, which sends instructions to the many musician screens. The screen-musicians then play their sounds back to the conductor and produce visual feedback.
What I love most about the Computer Orchestra is that it crowdsources its sounds: people can upload samples, and the musicians can then access and play them. It’s incredible that one person can control the music through simple hand motions and gestures. The simple interface of the central computer also makes it extremely easy for the conductor to choose where they want vocals, violin, and so on.
To learn more about the Computer Orchestra, click the link below:
For this Looking Outwards, I wanted to focus on a new computational instrument. The Midi Fighter 64 is a ‘finger drum’ instrument: the user programs a sound into each button and plays the instrument by pressing the buttons. The boards come in a range of button counts, from 4 x 4 (16 buttons) to 8 x 8 (64 buttons). Artists who use these instruments are called controllerism artists because the boards closely resemble video game controllers (though Midi Fighters are used only for music). Another notable similarity between these instruments and video games is that the buttons on the Midi Fighter are the same as retro Japanese arcade buttons.
The Midi Fighter’s sounds are programmed into the board using Ableton Live, a DAW (Digital Audio Workstation). The Midi Fighter was originally created by Ean Golden, who has been interested in controllerism music since the early 2000s. Golden published an article on the topic in 2007 called ‘Music Maneuvers’: https://archive.moldover.com/press/Moldover_Remix_Oct-2007_w.jpg . The instrument has since been popularized by artists such as Shawn Wasabi, a DJ who has pushed the instrument to its limits and played a role in developing it into a marketable product.
Kraftwerk, an electronic band, staged “The Robots,” an electronic music performance, in 2009. Kraftwerk was established by classically trained musicians who wanted to mix sound, feedback, and rhythm to create music.
The video depicts electronic music with robots on stage moving in set patterns to the music. I admire that it has a “concert” feel despite not having a singer: the performance includes music, lights, a stage, and people. However, I do wish that the robots moved to the beat of the music, or at least at a faster pace; their slow movements don’t match the upbeat, fast-paced music.
I don’t know anything about the algorithms behind how the work was generated, and I don’t want to speculate, because it would be wrong to generalize and guess based on no knowledge.
Laetitia Sonami is a sound artist and performer based in San Francisco. The work that I will be discussing is called “Lady’s Glove,” an instrument that makes and manipulates sound in live performance. Sensors within the glove measure motion, speed, and proximity, sending the data to Sonami’s computer, which generates the music. The glove will never make the same sound twice unless one replicates exactly the same motion, meaning even Sonami might not know what the music will sound like until it actually happens. In that sense, I admire her artistic sensibility and her knowledge of how to make the sound pleasing, especially in live performance settings. I am unfortunately unsure what algorithm she used to turn her motion into music. But I admire this project because it questions the definition of music and takes the concept of computational music to another level.
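Since Sonami’s actual algorithm is unknown, here is purely a guess at what a motion-to-sound mapping could look like, as a tiny JavaScript sketch. The sensor names, value ranges, and the mapping itself are my assumptions, not her real system.

```javascript
// Hypothetical glove-style sensor-to-sound mapping (NOT Sonami's actual system).
// Assumes each sensor reading is normalized to the range 0..1.
function mapGloveToSound(reading) {
    // Faster gestures play louder; clamp speed into a valid amplitude.
    var amplitude = Math.min(1, Math.max(0, reading.speed));
    // A closer hand means a higher pitch, mapped linearly across 220-880 Hz.
    var frequency = 220 + reading.proximity * (880 - 220);
    return { amplitude: amplitude, frequency: frequency };
}
```

Even a mapping this simple never repeats a sound unless the exact same gesture is repeated, which matches what makes the glove compelling.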
A computational music project that I found inspiring was the “Weather Thingy” by Filip Visnjic. The project is composed of two main parts: a weather station and a controller. The weather station gauges wind and rain levels with its sensors; the controller then has receptors that translate that weather data into audio effects, after interpreting it with built-in instruments. The controller also has screens where the artist can amplify or constrain sounds.
This project was inspiring in that it used input from nature to create music. Ironically, Filip uses computer software to interpret natural phenomena such as rain, wind, and thunder. The project is incredible in that it gives musical artists various novel sound effects to work with. Filip also gave the machine the ability to save certain sounds, to later give musicians inspiration.
The “Weather Thingy” is built with C++ on an Arduino and communicates with instruments using the MIDI protocol.
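As a rough illustration of the idea, the weather-to-audio step might look something like the JavaScript sketch below, where sensor readings are scaled into MIDI control-change values. The sensor ranges and CC numbers are my assumptions for illustration, not the actual implementation.

```javascript
// Hypothetical weather-data-to-MIDI mapping (an illustration, not the real device).
// Assumes wind tops out near 100 km/h and rain near 50 mm/hr.
function weatherToMidiCC(windKmh, rainMmPerHr) {
    // Clamp each input to its assumed sensor range, then normalize to 0..1.
    var wind = Math.min(Math.max(windKmh, 0), 100) / 100;
    var rain = Math.min(Math.max(rainMmPerHr, 0), 50) / 50;
    // Scale to the 0-127 range that MIDI control changes use.
    return [
        { cc: 1, value: Math.round(wind * 127) }, // e.g. modulation depth
        { cc: 7, value: Math.round(rain * 127) }  // e.g. channel volume
    ];
}
```

A moderate breeze with no rain would then nudge only the modulation control, while a storm would drive both effects toward their maximums.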
Sonic Arcade: Shaping Space with Design is a multi-component exhibition featuring interactive installations that experiment with the computation of sound design. The exhibition showcases several solo and collaborative works that, in one way or another, help the audience feel more integrated into the spatial environment. The works utilize electronic circuits, signals, radio waves, and resonant bodies to create these immersive experiences.
Though all these pieces are drastically different from each other, each uses sound as the primary medium of its installation. In the exhibition above, Studio PSK uses body-activated MIDI sensors to detect when sounds should be triggered or altered. With these sensors installed throughout all the structures, the entire exhibition becomes a musical instrument itself, ultimately allowing viewers to both watch and participate in the art.
Ge Wang is a professor at Stanford’s Center for Computer Research in Music and Acoustics. His research focuses on interactive design in music, in combination with programming languages. He created Smule and Magic Piano, both for the iPhone. I was interested in this piece, Twilight, because it is both a musical performance and an art performance, and it integrates what Ge Wang is focused on: interactive design in music. Using their laptops, the orchestra is able to translate the performers’ body movements into sound and pitch, which creates a visual performance as well. Interestingly, the way they used the laptops seemed biased toward gradual changes in sound and slow build-ups of music. The algorithm used to create the music and select the pitch seems to rely on the length of the string attached to the performers’ wrists.
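If the pitch really is driven by string length, the core of that mapping could be sketched in a few lines of JavaScript. This is my speculation about the piece, not Wang’s actual code; the note range and the linear mapping are assumptions.

```javascript
// Hypothetical string-length-to-pitch mapping (speculation, not the real Twilight code).
// Maps a wrist-string extension onto a two-octave range starting at MIDI note 48 (C3).
function stringLengthToPitch(lengthCm, minCm, maxCm) {
    // Normalize the extension to 0..1 and clamp it.
    var t = (lengthCm - minCm) / (maxCm - minCm);
    t = Math.min(Math.max(t, 0), 1);
    // Pick a MIDI-style note number, then convert to frequency
    // using the standard equal-temperament formula (A4 = MIDI 69 = 440 Hz).
    var midiNote = 48 + Math.round(t * 24);
    var freqHz = 440 * Math.pow(2, (midiNote - 69) / 12);
    return { midiNote: midiNote, freqHz: freqHz };
}
```

A mapping like this would also explain the slow build-ups I noticed: pitch can only change as fast as the performer’s arm can physically extend the string.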
Short Biography of Angela Washko: Angela Washko has a BFA in painting/drawing/sculpture and an MFA in Visual Art. She currently works as a Visiting Assistant Professor of Art at Carnegie Mellon. Broadly speaking, her work focuses on feminist issues, “creating new forums for discussions of feminism in spaces frequently hostile toward it.” For example, she has operated The Council on Gender Sensitivity and Behavioral Awareness in World of Warcraft since 2012.
“The Game” by Angela Washko is a dating simulator video game about pick-up artists, where the player is a woman being aggressively pursued by six men who attempt to pick her up. The dialogue is the strongest part of the game, but the rough graphics, almost horror-like close-ups, and intense music add to the disturbing quality of the experience.
The player can choose between several responses that range from positive, where you accept and play into the PUA’s techniques, to negative, where you rebuke the PUA’s advances.
I really like the game as it is now, but I think Washko could lean further into the ‘dating simulator’ aspect. Right now the game presents the choices in a fairly equal and straightforward manner, so most players may just reflexively choose the options where you refuse the PUAs. Adding an in-game scoreboard and consequences for each choice (for example, losing ‘social standing’ if you act ‘rudely’ and refuse a man) would add to the oppressive awkwardness of the exchanges and perhaps make the player more self-conscious about choosing the options where you don’t play along.
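The scoreboard mechanic I am proposing could be as simple as the JavaScript sketch below. The choice labels and point values here are entirely hypothetical, my own suggestion rather than anything in Washko’s game.

```javascript
// Minimal sketch of the proposed "social standing" scoreboard (hypothetical mechanic).
// Playing along raises standing; refusing lowers it, surfacing the
// uncomfortable social trade-off the mechanic is meant to dramatize.
function applyChoice(state, choice) {
    var delta = choice === "playAlong" ? 5
              : choice === "refusePolitely" ? -2
              : -10; // "refuseRudely"
    return { socialStanding: state.socialStanding + delta };
}
```

Showing the running score on screen after each exchange would make the cost of refusing visible, which is exactly the self-consciousness the mechanic is meant to induce.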
littleBits was created by Ayah Bdeir in 2010. Specifically, littleBits is a collection of modular electronics that snap together to make prototyping more efficient and easy. I admire this project because, having been able to work with littleBits in one of my studio classes, I greatly appreciated the ease of creating and mimicking low-fidelity electronic devices. Not having much knowledge of how certain devices work, I also enjoyed that I did not have to know much in order to work with littleBits, and I was able to learn about the general patterns of certain electronic devices along the way. It is a way of learning that is both hands-on and simplified enough not to overwhelm the user.
Bdeir graduated from the American University of Beirut in 2004 with degrees in Computer and Communications Engineering as well as Sociology. Her work has appeared at the New Museum and the Royal College of Art, and she has taught at both NYU and Parsons. She is both an interactive artist and an engineer; littleBits, which she founded, has joined MoMA’s permanent collection and has partnered with companies like Disney and Pearson, and with the New York Department of Education. Bdeir has led many initiatives to get young girls involved in STEM, partnering with the White House and companies such as Disney to do so. While she is originally from Beirut, she now resides in New York City.
//Taisei Manheim
//Section C
//tmanheim@andrew.cmu.edu
//Assignment-10
var trees = [];
var frames;
var backgroundImage; // declared here so the gradient image isn't an implicit global
function preload() {
    // background gradient
    backgroundImage = loadImage("https://i.imgur.com/L0VpcqE.jpg");
    // frames for person animation
    frames = [];
    frames[0] = loadImage("http://i.imgur.com/svA3cqA.png");
    frames[1] = loadImage("http://i.imgur.com/jV3FsVQ.png");
    frames[2] = loadImage("http://i.imgur.com/IgQDmRK.png");
    frames[3] = loadImage("http://i.imgur.com/kmVGuo9.png");
    frames[4] = loadImage("http://i.imgur.com/jcMNeGq.png");
    frames[5] = loadImage("http://i.imgur.com/ttJGwkt.png");
    frames[6] = loadImage("http://i.imgur.com/9tL5TRr.png");
    frames[7] = loadImage("http://i.imgur.com/IYn7mIB.png");
}
function setup() {
    createCanvas(480, 480);
    // create an initial collection of trees
    for (var i = 0; i < 10; i++) {
        var rx = random(width);
        trees[i] = makeTree(rx);
    }
    frameRate(10);
}
function draw() {
    image(backgroundImage, 0, 0, width * 2, height);
    mountain();
    mountain2();
    // ground
    fill(210, 218, 255);
    rect(-1, height - 50, width + 1, 50);
    updateAndDisplayTrees();
    removeTrees();
    addNewTrees();
    // person on ground
    push();
    scale(0.35, 0.35);
    image(frames[frameCount % 8], width * 2.75, height * 2.33);
    pop();
}
// upper mountain
function mountain() {
    var speed = 0.0005;
    var terrain = 0.01;
    stroke(70, 119, 187);
    for (var x = 0; x < width; x += 1) {
        var t = (x * terrain) + (millis() * speed);
        var y = map(noise(t), 0, 1, 0 + 100, height / 2 + 100);
        line(x, y, x, height);
    }
    // person on mountain: because y is declared with var, it is function-scoped
    // and still holds the height of the rightmost mountain column here
    push();
    scale(0.10, 0.10);
    image(frames[frameCount % 8], width * 9.85, y * 10 - 100);
    pop();
}
// lower mountain
function mountain2() {
    var speed = 0.0003;
    var terrain = 0.005;
    stroke(50, 99, 167);
    for (var x = 0; x < width; x += 1) {
        var t = (x * terrain) + (millis() * speed);
        var y = map(noise(t), 0, 1, height / 2 + 150, height / 4 + 150);
        line(x, y, x, height);
    }
    // person on mountain: y again retains the last column's height (var scoping)
    push();
    scale(0.25, 0.25);
    image(frames[frameCount % 8], width * 3.9, y * 4 - 110);
    pop();
}
function updateAndDisplayTrees() {
    // Update the trees' positions, and display them.
    for (var i = 0; i < trees.length; i++) {
        trees[i].move();
        trees[i].display();
    }
}
function removeTrees() {
    // Copy all the trees we want to keep into a new array.
    var treesToKeep = [];
    for (var i = 0; i < trees.length; i++) {
        if (trees[i].x + trees[i].treeWidth > 0) {
            treesToKeep.push(trees[i]);
        }
    }
    trees = treesToKeep; // remember the surviving trees
}
function addNewTrees() {
    // With a very tiny probability, add a new tree at the right edge.
    var newTreeLikelihood = 0.05;
    if (random(0, 1) < newTreeLikelihood) {
        trees.push(makeTree(width));
    }
}
// method to update position of tree every frame
function treeMove() {
    this.x += this.speed;
}
// draw the tree
function treeDisplay() {
    // tree leaves
    fill(22, 138, 130);
    noStroke();
    push();
    translate(this.x, height - 60);
    triangle(0, -this.treeHeight, -this.treeWidth / 2, 0, this.treeWidth / 2, 0);
    pop();
    // tree trunk
    fill(40, 59, 107);
    push();
    translate(this.x, height - 60);
    rect(-2.5, 0, 5, 10);
    pop();
}
function makeTree(birthLocationX) {
    var tr = {
        x: birthLocationX,
        treeWidth: random(20, 30),
        speed: -5.0,
        treeHeight: random(30, 60),
        move: treeMove,
        display: treeDisplay
    };
    return tr;
}
For this project I spent some time experimenting with different colors and mountain heights to get a look that I liked. I couldn’t get the sky gradient to look good when drawn in code, so I used an image to create the gradient instead. The trees appear at random heights and at random intervals. The hardest part was getting the racing people on the right to run along the mountains rather than at a constant y-value. I had the people decrease in size to give a sense of depth, but it was difficult to control their movements once they were scaled down. Overall, I am pretty happy with this project.