KadeStewart-LookingOutwards-11

Proteus cover

Proteus is an indie video game released in 2013, focused on exploration with an emphasis on “nonviolence”. The soundtrack is written to reflect the natural beauty within the game and is itself deterministic. However, the music that reaches the player’s ears is non-deterministic, because it is shaped by the player’s environment and their interactions with it. For example, when the player is in a dense environment, the sound is very dense; when the player is walking, the sounds shift the way they would if you were walking yourself.

The video game, as stated above, is intended to reflect a nonviolent existence. The soothing music plays into the message, and the exploratory theme is emphasized by the player’s active role in how the music sounds. I think that this is a very basic but incredibly powerful method of getting the designer’s message across.

Interactive Proteus Music

Jamie Dorst Looking Outward 11

For this week’s Looking Outwards, I am writing about a project by Pierry Jaquillard, Prélude in ACGT – Sonification of personal (DNA) data, in which he converted his own DNA into a musical piece. He wanted to see how nature’s core structure (DNA) could collide with the artificial and man-made (code). He created five interfaces for the project. Two of them act as a remote, letting you change parameters such as tempo, the musical arrangement, or even the type of conversion, and browse the chromosome library to choose which chromosome to play and where within it to start. The other three are used to visualize the sound, the conversion algorithm, and his raw DNA, all in an effort to understand the process.

Some images of the setup of his project

This project was really interesting to me because it’s something I probably never would have thought of, but it is actually a really creative way to compare natural and synthetic things. I’m also very surprised that the music doesn’t sound too eclectic or random; it sounds very much like a contemporary piece that could have been composed without the DNA.

Shirley Chen-Looking Outward-11 Computer Music

This project is a collection of computer-generated graphics by visual designer Cyrill Studer, based on a piece of music. The graphics generate and transform under the influence of the music in a subtle but clever and engaging way, and the collection becomes the music video for the song. I find it fascinating that he uses computing as a tool to visualize music. It allows viewers to experience the music not only by ear but also through visual effect. The representation is very direct and tied to common perception, which allows a precise depiction of the music. The visual concept of the entire music video is based on a single form: the ellipse. Through variation in the angle, distortion, arrangement, and number of the ellipses, he achieves a visual effect that closely represents the music in a commonly understandable visual language.

The graphics were generated in Processing, manually controlled and performed with a MIDI controller, and recorded through Syphon with the Syphon Recorder.
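The article doesn’t detail how the MIDI controller maps onto the graphics, but a minimal p5.js sketch of the single-form idea might look like the following, where one control value (a stand-in for a MIDI fader, here driven by the mouse) varies the number, arrangement, and distortion of the ellipses. The ranges below are invented for illustration and are not Studer’s values.

var control = 0.5; // 0..1, a stand-in for a MIDI fader value

function setup() {
	createCanvas(400, 400);
	noFill();
	stroke(255);
}

function draw() {
	background(0);
	control = map(mouseX, 0, width, 0, 1);
	var count = int(map(control, 0, 1, 1, 12)); // number of ellipses
	var squash = map(control, 0, 1, 1, 0.2);    // distortion of each ellipse
	translate(width / 2, height / 2);
	for (var i = 0; i < count; i++) {
		rotate(TWO_PI / count); // arrangement evenly around the center
		ellipse(0, 0, 200, 200 * squash);
	}
}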

Music Video – Baby Behold by CARVEL

Generated Graphics Based On Music

SOURCE:

https://www.creativeapplications.net/processing/carvel-baby-behold-music-video-by-cyrill-studer/

Looking Outwards 11 rrandell

https://www.creativeapplications.net/js/prelude-in-acgt-sonification-of-personal-dna-data/

This is a link to the artist’s work and a clip of his piece ‘Prelude in ACGT’; below is a photo of the physical manifestation of the work.

This Looking Outwards is about artist Pierry Jaquillard. I would consider his piece ‘Prelude in ACGT’ sound art rather than music, but there is certainly a musical aspect to his work. This piece combines sound and biology in a rather unique way. He examined his own DNA, explored it through code, and then made something musical from that exploration. To create sound out of DNA, he coded five interfaces that allow certain factors to change. One of the interfaces lets you access his chromosome library and choose a ‘piece’ of it to play. Three of the interfaces examine the DNA and visualize the sound in tandem with his raw DNA. Pierry uses a JavaScript MIDI library to generate MIDI signals; those signals are then sent into Ableton Live to generate electronic sounds, which are then exported, stored, and translated into sheet music. I am very inspired by his drive to create an intersection between these two fields of interest.
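Jaquillard’s actual conversion rules aren’t published in the article, but a minimal JavaScript sketch of the basic idea, assuming each base letter is simply assigned a fixed MIDI pitch, might look like this (the mapping and the sample fragment are made up for illustration):

// Hypothetical base-to-pitch table; not Jaquillard's actual mapping.
var baseToMidi = {
	A: 60, // C4
	C: 62, // D4
	G: 65, // F4
	T: 67  // G4
};

function dnaToNotes(sequence) {
	var notes = [];
	for (var i = 0; i < sequence.length; i++) {
		var base = sequence.charAt(i).toUpperCase();
		if (baseToMidi[base] !== undefined) {
			notes.push(baseToMidi[base]);
		}
	}
	return notes;
}

// A short made-up fragment, not real chromosome data.
console.log(dnaToNotes("ACGTTGCA")); // [60, 62, 65, 67, 67, 65, 62, 60]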

Tanvi Harkare – Looking Outwards 11

Prelude in ACGT is a project created by Pierry Jaquillard at the ECAL Media and Interaction Design Unit. The project takes a person’s DNA and turns it into musical notes. As of now, the project uses only Pierry’s own DNA. It includes all 23 chromosomes, which are run through different interfaces in order to create different results. Everything is processed with JavaScript. The interfaces help Pierry visualize his DNA as sound, and the data can be viewed on any digital device, such as an iPad. Pierry uses a MIDI file to generate signals for a computer, which plays the file. Certain aspects of the music track can be changed, such as the tempo, arrangement, and instruments.

I find this project interesting because each person would create a different soundtrack, since every individual has a unique set of DNA. I wish there were an easier way for users to get a soundtrack unique to them, perhaps based on their facial or body structure, which is something else unique to everyone.

Converting DNA into MIDI files on a digital device

Curran Zhang- Looking Outwards 11

For this week’s post, I decided to investigate the work of Andrius Sarapovas. By converting phone data into a numerical data set, music can be produced with his room-sized kinetic sculpture. Segments, each composed of a metal bar, sound activator, sound damper, resonator, and mechatronics, are placed on a surface or hung from the ceiling. With access to Tele2’s 4G network, various algorithms generate music covering 16 notes: C, D, F, and G across four octaves. As visitors walk around the room, different harmonies are composed at different times and locations. To create the music, the extremes of one second of 4G data are used to generate one second of music; numbers derived from those extremes help determine the rhythm and volume.
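The article doesn’t give the exact algorithm, but a small JavaScript sketch of that description, assuming the extremes of a one-second window of readings pick the pitch and volume from the 16-note palette, could look like this (the scaling choices are placeholders, not Sarapovas’s):

// Build the 16-note palette: C, D, F, G across four octaves (from MIDI 48 = C3).
var scale = [];
var degrees = [0, 2, 5, 7]; // C, D, F, G as semitone offsets
for (var octave = 0; octave < 4; octave++) {
	for (var d = 0; d < degrees.length; d++) {
		scale.push(48 + octave * 12 + degrees[d]);
	}
}

// readings: numeric 4G measurements collected over one second.
function secondToNote(readings) {
	var lo = Math.min.apply(null, readings);
	var hi = Math.max.apply(null, readings);
	// Illustrative choices: the maximum picks the pitch,
	// the spread between the extremes sets the volume (0..1).
	var index = Math.abs(Math.round(hi)) % scale.length;
	var volume = Math.min(1, (hi - lo) / (Math.abs(hi) || 1));
	return { midiNote: scale[index], volume: volume };
}

console.log(secondToNote([12, 87, 45, 63])); // { midiNote: 67, volume: ~0.86 }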

Installation that is hung on the wall and from the ceiling

The work done by Sarapovas helps bridge the two aspects of chaos and structure. Like the smart devices it draws on, his installation must be “smart”, ever growing and changing with the flow of the internet. This way of expression is very interesting, as it brings the two ideas together and meshes them into a coherent piece of art that can be absorbed by its visitors.

Visitors of all ages express interest in the piece of art

https://creators.vice.com/en_us/article/7x9m3a/massive-robotic-instrument-smartphone-data-sounds

 

Elena Deng-Looking Outwards 11

The Cycling Wheel

This week I decided to look at Keith Lam, Seth Hon, and Alex Lai’s The Cycling Wheel as the subject of my Looking Outwards. For this project the three designers used Arduino along with other processing software to make the bicycle an instrument of light and sound. Turning the wheel of the bicycle alters different aspects such as the music, the light beam, and the color of the light. The bike itself becomes an instrument, and whoever controls the wheel becomes the musician.
LED Strip and Control of the Wheel

With this project I admire how it allows anyone to become a musician. From prior experience with Arduino, I am assuming they were able to alter the color of the LED strip based on the motion of the wheel, roughly as in the sketch below. The creators’ artistic sensibilities manifest themselves through the use of different colors as well as the placement of the actual bike wheels.
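As a rough guess at that behavior, a minimal p5.js sketch could map a simulated wheel speed to a hue, with the mouse standing in for the rotation sensor; the numbers here are placeholders rather than the artists’ values:

var angle = 0;
var wheelSpeed = 0; // stand-in for the sensed wheel rotation, in radians per frame

function setup() {
	createCanvas(400, 400);
	colorMode(HSB, 360, 100, 100);
}

function draw() {
	// Pretend the wheel speed follows the mouse; a real build would read a sensor.
	wheelSpeed = map(mouseX, 0, width, 0, 0.5);
	angle += wheelSpeed;

	// Faster spin shifts the hue further around the color wheel.
	var hue = map(wheelSpeed, 0, 0.5, 200, 360);
	background(hue, 80, 90);

	// Draw the "wheel" with a spoke so the rotation is visible.
	translate(width / 2, height / 2);
	rotate(angle);
	noFill();
	stroke(0);
	ellipse(0, 0, 200, 200);
	line(0, 0, 100, 0);
}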

Cycling Wheel: The Orchestra – Reimagining Marcel’s bicycle wheel as a light+sound Instrument

Yingying Yan- Project 10- Landscape

sketch

/*
Yingying Yan
Section E
yingyiny@andrew.cmu.edu
Project - 10
*/

var snowman = [];

function setup() {
	createCanvas(480, 240);
	for (var i = 0; i < 4; i++) {
		var rx = random(width);
		snowman[i] = makeSnowman(rx);
	}
	frameRate(10);
}

function draw() {
	background("green");
	//background
	displayHorizon();
	//snowman
	updateAndDisplaySnowman();
	removeSnowmanThatHaveSlippedOutOfView();
	addNewSnowmanWithSomeRandomProbability();
}

function updateAndDisplaySnowman() {
	for(var i = 0; i < snowman.length; i++) {
		snowman[i].move();
		snowman[i].display();
	}
}

function removeSnowmanThatHaveSlippedOutOfView() {
	var snowmanKeep = [];
	for (var i = 0; i < snowman.length; i++) {
		if(snowman[i].x + 50 > 0) {
			snowmanKeep.push(snowman[i]);
		}
	}
	snowman = snowmanKeep;
}

function addNewSnowmanWithSomeRandomProbability() {
	var newSnowmanPercent = 0.006;
	if (random(0, 1) < newSnowmanPercent) {
		snowman.push(makeSnowman(width));
	}
}
//move towards the left 

function snowmanMove() {
	this.x += this.speed;
}
//function that draws the snowman

function snowmanDisplay() {
	push();
	fill(255);
	noStroke();
	var sizeBottom = 35;
	var sizeMiddle = 25;
	var sizeTop = 20;
	var yy = height-35;
	//translate(this.x, height - 35);
	translate(this.x, 0);
	//bottom circle
	ellipse(0, yy - sizeBottom / 2, sizeBottom, sizeBottom);
	//middle circle
	ellipse(0, yy - sizeBottom - sizeMiddle / 2 +5 , sizeMiddle, sizeMiddle);
	// //top circle
	// ellipse(0, yy - sizeBottom - sizeMiddle - sizeTop / 2 + 10, sizeTop, sizeTop);
	push();
	fill(0);
	ellipse(0 - 5, yy - sizeBottom - sizeMiddle / 2 + 2, 2, 2);
	ellipse(0 + 5, yy - sizeBottom - sizeMiddle / 2 + 2, 2, 2);
	noFill();
	stroke(0);
	ellipse(0, yy - sizeBottom - sizeMiddle / 2 + 5, 4, 4);
	line(15, yy - sizeBottom / 2, 30, yy - 40);
	line(-15, yy - sizeBottom / 2, -30, yy - 40);
	pop();	
	pop();
}

//define all the objects and variables

function makeSnowman(positionOg) {
	var sman = {
		x: positionOg,
		//y: 380,
		speed: -1.0,
		move: snowmanMove,
		display: snowmanDisplay
	}
	return sman;
}

//background
function displayHorizon() {
	fill("lightblue");
	rect(-1,-1, width + 1, height - 39);
}


I wanted to render a snow scene because I love the snow. Unfortunately, I could barely finish the project, but I have a snowman! I mean lots of snowmen. I think this project is hard and really made me think about “objects”. I am still in the process of understanding the code. If I do something similar for my final project, I think it will turn out much better than this.

Sophia Kim – Looking Outwards 11 – Sec C

I am taking the option of using my Looking Outwards 04, focusing on the sound design of the project. “Apparatum” was created by two different teams: the “Symphony – electronic music” was composed by Boguslaw Schaeffer, and the analogue sounds the machine produces were created by the panGenerator team.

Compared to regular music, which is prerecorded and planned, sound art uses various beats that give texture to art, specifically installations, conceptual art, and exhibits. Like I said in Looking Outwards 04, I really liked how they fused communication, product, and sound design together. Along with the sound design, the aesthetic design of the entire project is very well done. The designers of this project used various linear tape samplers as the primary medium. Also, to obtain noise and basic tones, they used spinning discs with graphic patterns, which are installed in the radio.
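The post doesn’t explain how the discs are read, but the underlying tone-wheel principle is simple: a ring of dark and light segments scanned while spinning produces a tone whose frequency is the segment count times the rotation speed. A tiny JavaScript sketch of that relationship, with made-up numbers, is below.

// Hypothetical model of a graphic-pattern disc: segments per revolution
// times revolutions per second gives the frequency of the resulting tone.
function discFrequency(segmentsPerRevolution, revolutionsPerSecond) {
	return segmentsPerRevolution * revolutionsPerSecond;
}

// A disc with 88 dark/light pairs spun at 5 revolutions per second
// would produce roughly a 440 Hz tone (about A4).
console.log(discFrequency(88, 5)); // 440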

I really admire how different sound designs can be fused together by the user. Also, I admire how user customization is a big part of this project. The sound patterns the user picks are recorded, played out loud, and then printed on a receipt to document the patterns.

Carley Johnson Looking Outwards 11

The group I came across, iii, is “an artist-run platform supporting radical interdisciplinary practices engaging with image, sound, space and the body.” They do residencies and support artists, but the specific project I’ll be looking at is a totally immersive installation called “The Intimate Earthquake Archive”. The piece engages almost every sense through vests, compositions derived from seismic recordings, interactive radio broadcasts, and sandstone earth core samples and wooden scaffolding set up around the people inside.

In this photo you can see the scaffolding and the vests worn by participants.

This project is really interesting because it plays with sound in so many ways. There are radio broadcasts as well as recordings of an earthquake in Groningen, but the vests are the most interesting part. Based on the wearer’s movement and position in the space, they emit sounds and rumbles that affect different parts of the body. I like how their website describes these tactile vests: they allow “the wearer to explore the subtle rumbles of the earthquakes on the body.”

The truth about sound is that we love it, we love music, and there is no doubt that what we listen to affects the state of our body. But this idea is not often explored in relation to art or attached so firmly to what we feel. I would love to wear one of these vests. Feeling and hearing the soft rumble of an earthquake in my stomach as well as all around me sounds at once terrifying and calming.

I have to suppose that the algorithms employ motion capture, so as to track the wearer’s progress through the earthquake, and possibly some complex math in the transducer speakers inside the vest to know when and where to trigger a rumble.