LookingOutwards-04 SectionE

Cycling Wheel: The Orchestra – reimagining Marcel Duchamp’s bicycle wheel as a light-and-sound instrument, created by Keith Lam, Seth Hon, and Alex Lai, October 7, 2017

The wheel is a soothing object: it follows the shape of the circle, one of the strongest geometric figures in design, and it appears in countless mechanisms that make machines possible. I was interested in how the creators took a bicycle wheel and decided to make it produce sound. A bicycle wheel is used mainly for movement, but its movement is also satisfying to watch, and it kept me wondering how they would turn an object like a wheel into sound. The way the lights respond as the wheel is played is visually helpful. The algorithm is primarily concerned with manipulating sound and light, with pitch, volume, and speed all controlled by the movement of the bicycle wheel. The creators used a combination of programs and tools, each focused on one of these variables. By combining them, they were able to make the wheels act as instruments, and with the inclusion of lights the piece becomes almost a show, so it is fitting that they called the project “The Orchestra.”
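The artists’ own software and hardware are not public, but as a rough illustration of the kind of mapping described above, here is a minimal, hypothetical Python sketch that turns a wheel’s rotation speed into a pitch, a volume, and a light level; all names and ranges are invented.

    # Hypothetical sketch only: maps bicycle-wheel speed to sound and light
    # parameters, in the spirit of the mapping described above.

    def wheel_to_parameters(revolutions_per_second, max_rps=10.0):
        """Map wheel speed to a MIDI-style pitch, a volume, and an LED level."""
        # Clamp and normalize the speed to the range 0..1.
        speed = max(0.0, min(revolutions_per_second, max_rps)) / max_rps
        pitch = int(36 + speed * 48)   # faster spin -> higher note (roughly C2..C6)
        volume = 0.2 + 0.8 * speed     # faster spin -> louder
        led_brightness = speed         # the lights track the sound's energy
        return pitch, volume, led_brightness

    print(wheel_to_parameters(3.5))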


Liz Maday LO4

 

Sonic Playground (2018) is an interactive installation created by Yuri Suzuki as part of an initiative to make the space outside the High Museum of Art in Atlanta more playful and engaging. When you speak into the horn-like structures, they manipulate the sound depending on your position. The acoustic behavior was developed in Grasshopper using “wave tracing” techniques; the program lets the user test different geometric shapes and see how they would affect the sound actually being produced.

This project is inspiring to me because I appreciate the way it suggests that adults are capable of play and wonder in the same way children are. Although the installation is reminiscent of the childlike structure of a playground, it is intriguing to think about all the complicated programming that went into its precise design.

Sonic Playground, by Yuri Suzuki, at the High Museum of Art in Atlanta

Hannah Cai—Looking Outwards—04

Imogen Heap demonstrating her mi.mu gloves

Mi.mu

While looking for material on YouTube, I came across this video. It caught my eye because I’m a fan of Imogen Heap, and after watching it I’ve become an even bigger fan. Using these “mi.mu” gloves developed by Imogen Heap, a person can essentially generate music with a flick of the wrist. I’d love to see a dancer wearing them, composing music from choreography rather than choreographing to music.

The gloves connect to her computer and change the sound produced based on factors such as position in space, pressure, and hand posture. Altering the filter and other details essentially makes for real-time generated sound. The computational part is deciding how all the variables influence the sound that’s generated: how pitch, range, and other factors are mapped to 3D space. Since Imogen Heap usually works with free-flowing, unstructured electronic music (check out her song “You Know Where to Find Me”), her characteristic style is present in her use of these gloves, and the two really mesh well and augment each other. I know that a few other artists have used these gloves for performances as well (including Ariana Grande), but I doubt their music would benefit as much from the gloves as Imogen’s does, due to stylistic differences. All in all, I’m really impressed with how Imogen Heap created what’s basically an amazing instrument of her own, and how she’s using it to enhance and showcase her already distinctive approach to music.
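As a purely illustrative sketch of that kind of mapping (not the actual mi.mu software, whose parameters I don’t know), one could imagine something like the following Python snippet, with all names and ranges invented:

    # Hypothetical gesture-to-sound mapping, only to illustrate the idea above.

    def glove_to_sound(hand_height_m, wrist_roll_deg, grip_pressure):
        """Map a few glove measurements to simple synthesis parameters."""
        # Higher hand -> higher pitch, spanning roughly two octaves above A3.
        pitch_hz = 220.0 * 2 ** (hand_height_m * 2.0)
        # Rolling the wrist sweeps a low-pass filter cutoff (200 Hz .. ~5 kHz).
        cutoff_hz = 200.0 + (wrist_roll_deg % 360) / 360.0 * 4800.0
        # Squeezing the hand controls loudness.
        amplitude = min(max(grip_pressure, 0.0), 1.0)
        return pitch_hz, cutoff_hz, amplitude

    print(glove_to_sound(hand_height_m=0.5, wrist_roll_deg=90, grip_pressure=0.7))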

Jisoo Geum – Looking Outwards 04 – Sound Art

Mileece “Nightfall” 2003

Mileece Abson is a sonic artist and environmental designer who creates sound art using plants. She generates original sound by attaching electrodes to plants that produce bioelectric emissions, tiny currents that differ from plant to plant. The electrodes attached to the leaves pick up these currents and send the signal to an amplifier; the amplifier then converts the currents into code and transmits that code to software that renders it as sound.
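Mileece’s own software is custom, but a hypothetical sketch of the general idea, turning small voltage fluctuations into notes on a scale, might look like this (the threshold and scale are invented):

    # Hypothetical plant-signal sonification sketch, not Mileece's actual code.

    PENTATONIC = [60, 62, 65, 67, 70]   # MIDI notes: a simple, "organic" scale

    def voltages_to_notes(samples, threshold=0.01):
        """Turn small voltage fluctuations into a sequence of notes."""
        notes = []
        for previous, current in zip(samples, samples[1:]):
            change = current - previous
            if abs(change) > threshold:            # only react to real activity
                index = int(abs(change) * 1000) % len(PENTATONIC)
                notes.append(PENTATONIC[index])
        return notes

    print(voltages_to_notes([0.000, 0.012, 0.013, 0.030, 0.029]))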

Her music tends to be extremely subtle and abstract, capturing the movement and growth of nature. I liked the organic, peaceful energy that her sound creates; it felt like I was transported to a different space. Mileece’s musical style was also very inspiring because of the vision behind the work: as an ambassador for the environment and renewable energy, she mirrors her passion by creating sound art that facilitates connections between people and nature.

Sonic Playground

Yuri Suzuki is a sound artist and designer who created this public art installation, composed of giant playground “talk tubes,” in Atlanta, Georgia.

Sculptural playground talk tubes

Immediately, my interest was piqued because of the architectural implications of these forms. Not only have they taken on a truly human scale, but they have transformed the plaza by generating a field of interactive, communicative pieces.

As I began to read about this project, I became even more excited: the forms were driven by software that analyzed how sound waves best travel through these horns and pipes. The designers used the 3D modeling software Rhinoceros and its parametric plugin, Grasshopper:

Screenshot of Grasshopper Script, a Rhinoceros software plug-in

In collaboration with the Yuri Suzuki Design Studio, Luca Dellatorre created a Grasshopper plugin that allows a sound source to be captured and sent in a specific direction, which helped optimize the geometries of the pipes, horns, and acoustic mirrors that reflect the sound in different directions. In this way, the horns can capture as much sound as possible and then reflect and spread it toward the receiver. While Grasshopper is not acoustic software, the wave behavior of sound means it can be simulated and analyzed much like other physical phenomena.
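The plugin itself is not something I have access to, but the core idea of tracing a sound “ray” and reflecting it off a surface can be sketched in a few lines of Python; this toy 2D version only illustrates the reflection step, not the actual Grasshopper component:

    # Toy 2D illustration of ray-based acoustics: reflect a sound "ray" off a
    # flat surface. Not the Dellatorre plugin, just the underlying geometry.

    import math

    def reflect(direction, normal):
        """Reflect a 2D direction vector about a unit surface normal."""
        dot = direction[0] * normal[0] + direction[1] * normal[1]
        return (direction[0] - 2 * dot * normal[0],
                direction[1] - 2 * dot * normal[1])

    # A ray travelling down and to the right hits a horizontal floor (normal = up).
    incoming = (math.cos(math.radians(-45)), math.sin(math.radians(-45)))
    floor_normal = (0.0, 1.0)

    outgoing = reflect(incoming, floor_normal)
    print(outgoing)   # the ray now travels up and to the right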

For more information on the Yuri Suzuki Design Studio go to: http://yurisuzuki.com/design-studio

Katherine Hua – Looking Outwards-04

Niklas Roy created a generative sound synthesizer in 2010 that he named “Vektron Modular”; it acts as a pocket sound performer. It is a playback machine in which compositions are stored on microcontroller modules, and the modules are programmed so that the device produces sound through the algorithms stored within them. The synthesizer’s interface was inspired by PlayStation controllers.

Sound modules with algorithms stored within them

Through this algorithmic synthesizer, Niklas Roy is able to explore the binary structures within music and compare different rhythmic patterns and number systems for counting the beat. The user can alter the parameters of the algorithm producing the sound by moving the joystick around. This project really piqued my interest because of how the visual experience reflects the movement of the sound through computational algorithms.
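Roy’s firmware is its own thing, but as a rough illustration of a parameter-driven sound algorithm of the kind described, here is a hypothetical “bytebeat”-style Python sketch in which two numbers stand in for joystick settings:

    # Hypothetical parameter-driven tone generator ("bytebeat"-style formulas
    # are a common trick on microcontrollers). Not Roy's actual firmware.

    def bytebeat_sample(t, a=3, b=7):
        """One 8-bit audio sample at time step t; a and b act as 'joystick' params."""
        return (t * a & t >> 5 | t * b & t >> 3) & 0xFF

    # Render one second of 8 kHz audio with a given parameter setting.
    samples = bytes(bytebeat_sample(t, a=3, b=7) for t in range(8000))
    print(len(samples), samples[:16])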

Using the joystick to navigate through the computational soundscapes

“Vektron Modular” by Niklas Roy, 2010

Carley Johnson Looking Outwards 04

This week I am inspired by the brief installation piece (already finished with its short run) in Berlin created by onformative, called Meandering River. It was up July 28-30. I love how this piece seeks to unite the seemingly opposite feelings of nature and technology by using sound, image, and algorithms to show the ebbing and flowing of rivers over time. The algorithm randomizes the patterns of the river, changing them and creating them as they might actually evolve in real time. The sound then interprets the river’s movement, so it also constantly changes to mimic and complement the ever-evolving visual landscape. In this way, the artists hoped to create a sense of moving through time, of getting lost in the movement of a river created by visual art and sound. I would have loved to see this piece; I can imagine getting lost and spending hours listening to how the music changes and watching the visuals, knowing that each moment is being generated and changed in real time like a river. It would be fun to spend some time near an actual rushing body of water and then see this exhibit to judge how they compare.
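Purely as a hypothetical illustration of the idea (not onformative’s algorithm), a random-walk “river” whose changing curvature could drive a sound parameter might be sketched like this:

    # Hypothetical sketch: a meandering random-walk path whose sharpness of
    # turning could drive a sound parameter. Only an illustration of the idea.

    import math
    import random

    def meander(steps=200, wobble=0.3):
        """Generate a meandering path and a 0..1 'turbulence' value per step."""
        x, y, heading = 0.0, 0.0, 0.0
        path, turbulence = [], []
        for _ in range(steps):
            turn = random.uniform(-wobble, wobble)   # the river drifts randomly
            heading += turn
            x += math.cos(heading)
            y += math.sin(heading)
            path.append((x, y))
            turbulence.append(abs(turn) / wobble)    # sharper turns -> busier sound
        return path, turbulence

    path, turbulence = meander()
    print(path[-1], max(turbulence))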

A photo of the visual detail created by the algorithm, which is displayed over a series of screens.

Anthony Ra – Looking Outwards 04

close up on “Lenses”

“Lenses” is an interactive sound art installation by the creative agency Hush, in which differently shaped prisms refract light in the direction the user turns them. When a prism is twisted on the walled surface, the refraction of the light and its composition are translated by software into sound in real time, creating a calm duality between light projection and soundscape.
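As a loose, hypothetical illustration of how a refraction angle could be turned into sound (this is not Hush’s software, and the mapping is invented), Snell’s law gives the refracted angle, which could then be mapped onto a pitch:

    # Hypothetical light-to-sound sketch: Snell's law for the refracted angle,
    # then a simple mapping of that angle onto a pitch range.

    import math

    def refracted_angle_deg(incidence_deg, n1=1.0, n2=1.5):
        """Snell's law: angle of the refracted ray entering glass from air."""
        s = n1 / n2 * math.sin(math.radians(incidence_deg))
        return math.degrees(math.asin(s))

    def angle_to_pitch_hz(angle_deg, low=220.0, high=880.0):
        """Map a 0..90 degree refraction angle onto a pitch range."""
        return low + (angle_deg / 90.0) * (high - low)

    angle = refracted_angle_deg(45.0)
    print(angle, angle_to_pitch_hz(angle))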

the position of the prisms correlates with the direction in which the light is emitted
user interacting with the installation

The designers’ idea was for an audiovisual installation that reflects the exchange of ideas among designers, artists, musicians, and technologists. The end result integrates multiple fields, crossing boundaries and creating something visually appealing and interactive for the user.

Video of “Lenses”

LO4 – Alexander Chen

When it comes to music and technology, I feel this is an area where we are headed in the near future. As a music major myself, it is something I feel very strongly about. I think technology is going to play a bigger and bigger role in contemporary music, and that is inevitable; whether or not that is a good thing is up for debate. I personally believe it makes a very good assistant, as jazz guitarist Pat Metheny suggests. That said, I do not believe computers will be able to completely take over music (like a robot symphony orchestra), as author William Hochberg seems to suggest, because there is a level of human connection and feeling that computers cannot replicate. Unless there is AI advanced enough to “feel,” I don’t think computers will ever be able to create real art. At the same time, computers and algorithms writing music have been around for a long time, and in that context I think it is slightly different and definitely valid.

https://www.theatlantic.com/entertainment/archive/2014/08/computers-that-compose/374916/

A piece written in the style of Bach.

yinjiet-LookingOutwards-04

Music Visualization – Debussy, Arabesque #1, Piano Solo

Animation created by Stephen Malinowski

This is a classical piece composed by Claude Debussy. Stephen Malinowski, an American composer, pianist, and software engineer, performed and translated this piece into a music visualization project. Stephen produces animated graphic scores, using a system of colored shapes to represent elements of the music. In this piece he mainly uses geometries like circles and lines: each circle represents a piano note, and the lines trace the melody and flow of the music. When the music plays, the geometries are animated with the rhythm. Stephen turns music into a piece of computational artwork; with his algorithmic system, which takes its information from a MIDI file, he can generate graphical animations for any piece of music. Stephen once used a quote from Eric Isaacson, “What you see is what you get.” Sometimes it is hard for an audience to fully experience music just by hearing it, so with the help of animated graphics the audience can visually interact with the music. I’m inspired by Stephen Malinowski, and I think there should not be just one way to approach an artwork; we should be able to use as many senses as we can.
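As a small, hypothetical sketch of that note-to-shape mapping (the note data here is made up, and Malinowski’s system is far more sophisticated), notes read from a MIDI file could be placed as circles like this:

    # Hypothetical note-to-circle mapping, only to illustrate the idea above.

    notes = [  # (start_time_s, midi_pitch, velocity 0..127), invented data
        (0.0, 60, 90), (0.5, 64, 80), (1.0, 67, 100), (1.5, 72, 70),
    ]

    def note_to_circle(start, pitch, velocity, pixels_per_second=100):
        """Place each note as a circle: time -> x, pitch -> y, loudness -> size."""
        x = start * pixels_per_second
        y = 600 - (pitch - 21) * 5          # low notes near the bottom of the frame
        radius = 4 + velocity / 127 * 12    # louder notes are bigger
        return x, y, radius

    for n in notes:
        print(note_to_circle(*n))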