Caroline Record is an artist and software developer with a strong background in fine arts, user-centered design, and programming. Her project, Light Clock, was installed at the Carnegie Museum of Art. Every five minutes, around the clock, a camera inside the clock captures a 360-degree view of the museum plaza. These images are then sent to an exhibit in the museum lobby, where visitors can spin a control to change their point of view of the photographs.
Record’s exhibit lets visitors experience the plaza from another time and another point of view. It is interesting how the experience of the view outside is deepened by re-encountering the same view inside.
Caroline Record has held fellowships with Carnegie Mellon University, the Brewhouse Association, and Yale University Norfolk. She has exhibited at the Carnegie Science Center, Space Gallery, Miller Gallery, and the Brewhouse Association. She now works on the innovation team at the Carnegie Museum of Art. I admire her work because her pieces seem to be about turning mundane things into something that inspires curiosity and wonder. In other projects, such as Section of Mystery and She, Record makes use of nothing more than a door or text messages. I appreciate how she creates worlds within the world we live in.
The artist I decided to talk about is Joel Hunt. He is a composer who uses algorithmic computer music and electroacoustic techniques in his performances. His compositions have been performed at music festivals around the world, including conferences in Athens, Greece, the New York City Electronic Music Festival, Electronic Music Midwest (Kansas City), the Primavera Festival of Contemporary Arts and Digital Media (Santa Barbara), and the California Electronic Music Exchange Concert Series (Los Angeles). Joel is a Lecturer in Music and in Digital Media, Arts, and Technology at Penn State Behrend. In his performances he attaches his phone to his instrument to create unique sounds. I didn’t think one could use a phone and an instrument together to make music; it is very impressive, and the approach continues to gain traction, which is a large part of what made him known.
Urban Lights Contacts is an interactive installation that senses varying degrees of electrostatic contact depending on how close people’s bodies are. The only way to activate the work is for multiple people to touch each other’s skin. For example, a person who puts his or her hands on the interactive shiny sphere alone won’t create any sound; someone else must come and touch that person’s hand to activate the installation. The piece encourages people to come close and touch to create new sound in the moment. The work is tactile, creates light, and emits sound. It encourages interactive stagings of the public’s bodies, essentially turning people into a “human instrument.” I admire this installation because it brings the public together for a unique art-and-technology experience. I also find it interesting that the work creates unpredictable relationships among the people who interact with it.
In 2018, Japanese sound artist Ryoji Ikeda created an audiovisual work that translated computer graphics into electronic noises and drones, coining it “code-verse.” Ikeda created the project after developing his own style of techno music formed from the sonic textures of graphics. The code was probably built around pairing sounds with the direction and speed of the graphics in code-verse. I found this project very interesting because it is a balanced intersection of two forms of entertainment: visuals and audio. By joining the two mediums, Ikeda created a mesmerizing audiovisual experience that makes viewers feel as if they were placed in a new dimension. Ikeda sought an art form that escapes the media-saturated society we live in and lets viewers feel as if they were interacting inside the actual art environment; code-verse achieves this by making visitors feel as if they were in a dimension apart from the Big Data world we inhabit.
The Weather Thingy is an instrument that takes the weather into account to play various sounds in harmony. I found the serenity this project produces fascinating. Different sensors feed numerical input into an algorithm that translates it into sound. The ambience of the sound and the concept of weather go together very well.
This project makes me wonder about extreme cases and other possible variables. For instance, what will happen when it is hailing? Or when the climate is very arid and hot? I think this project has a lot of potential and room to explore.
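As a rough illustration of the sensor-to-algorithm-to-sound idea, a minimal sketch might look like the following; the parameter names and ranges are my assumptions, not the actual Weather Thingy mappings:

```python
# Illustrative sketch only: weather sensor readings, normalized to
# [0, 1], are mapped to plausible sound-control parameters. The
# mappings are assumptions, not the instrument's real ones.

def weather_to_controls(wind_speed, rain_rate, brightness):
    return {
        "tempo_bpm": 60 + wind_speed * 80,            # stronger wind, faster tempo
        "reverb_amount": min(rain_rate * 1.2, 1.0),   # rain adds a washed-out tail
        "filter_cutoff_hz": 300 + brightness * 5000,  # sunnier means brighter tone
    }
```

Under a mapping like this, an extreme input such as hail could simply register as a very high rain rate, which is one way the extreme-case question might play out.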
Computer Music: The Emerging Art of Algorithmic Music
Ville-Matias Heikkilä, a Finnish artist and computer programmer, has been experimenting with algorithmic music generated by computer programs. He proposes that we should not only listen to the music but also be able to visualize it, to enhance the impact music can have on us.
Heikkilä says that codes and algorithms can sometimes generate surprisingly interesting music by repeating only two or three arithmetic operations. He is therefore especially interested in creating audio and visual artworks with very simple programs, in a rather disorganized, exploratory way.
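This "few arithmetic operations" idea is the essence of bytebeat, a style Heikkilä helped popularize, where every audio sample is one byte computed from the sample index. Here is a minimal sketch; the formula is my own illustrative one, not one of his pieces:

```python
# Bytebeat sketch: each 8-bit sample is a function of the sample
# index t. A couple of shifts, an OR, and a multiply are enough to
# produce a melody-like texture. The formula is illustrative only.

def bytebeat(t):
    return (t * (t >> 9 | t >> 13)) & 255

# One second of raw 8-bit audio at an 8 kHz sample rate.
samples = bytes(bytebeat(t) for t in range(8000))
```

Piped to a sound device as raw unsigned 8-bit audio, a one-line formula like this already produces structured, repeating patterns rather than pure noise.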
For this week’s Looking Outwards I decided to focus on Imogen Heap, a British singer-songwriter and audio engineer. I really admire her work as an artist because she has taken her musical vision beyond just creating music and has experimented with new ways to create. One of the projects she is best known for is the MiMu gloves, which use mapping technology to translate movements of the hands into musical compositions. In the video above, Heap explains many of the different uses for the gloves and the movements that change pitch, filter the sound, and control many other elements. I find the product really interesting, and it adds a cool dynamic to the way an artist can perform on stage. Heap has talked about hating how much equipment she needed on stage to create the kind of music she wanted to perform. The gloves free her from being locked to one location and let her create a musical experience that combines both sound and movement.
While I don’t know too much about the details of the programming, the gloves use a network of bendable sensors that track the movement of the hand and fingers. Combined with an invisible map of the user’s space, this allows the software to recognize shifts in motion and attribute them to different musical elements and sound qualities.
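Speculatively, the mapping layer might resemble the sketch below; the function names, inputs, and value ranges are all my assumptions for illustration, not MiMu’s actual software:

```python
# Hypothetical mapping from normalized glove-sensor readings in [0, 1]
# to sound parameters. Everything here is an assumption.

def rescale(x, out_lo, out_hi):
    """Clamp x to [0, 1] and rescale it to [out_lo, out_hi]."""
    x = min(max(x, 0.0), 1.0)
    return out_lo + x * (out_hi - out_lo)

def glove_to_sound(finger_bend, hand_height):
    return {
        "midi_pitch": round(rescale(hand_height, 48, 84)),       # C3 up to C6
        "filter_cutoff_hz": rescale(finger_bend, 200.0, 8000.0), # open hand = bright
    }
```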
Overall, I find this to be a really intriguing project. Heap’s vision of the future of music and performance is interesting because I don’t think it’s something we see from many musical artists. The gloves enable the wearer to create a wide variety of new musical experiences, and it will be interesting to see how work like this develops in the future.
I am using one of my grace days for this late submission.
Anders Lind is a great Swedish artist who actively uses computation to compose and perform music. Also a creative director at Umeå University, he composes music mainly for orchestras, choirs, and various ensembles and solo performers.
For this blog post, I was mainly interested in his 2016 exhibition Lines, which brings musical experience into the realm of electronics and interactive technology. In this experimental exhibition, lines are attached to a wall or floor, or hung from the ceiling, with sensors that produce pitches when a human body interrupts them. The scale of the exhibition also encourages visitors to be creative about music-making and to cooperate with others to create musical harmony.
No musical background is required to enjoy the piece. The video also shows different groups of people interacting with it, including children.
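One plausible way such a sensor line could choose pitches, so that untrained visitors still sound musical together, is to quantize the touch position to a scale. This sketch is my assumption, not Lind’s implementation:

```python
# Hypothetical: a touch position in [0, 1] along a sensor line is
# snapped to notes of a C major scale so any touch sounds consonant.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within one octave

def position_to_midi(position, base_note=60, octaves=2):
    steps = len(C_MAJOR) * octaves
    i = min(int(position * steps), steps - 1)
    octave, degree = divmod(i, len(C_MAJOR))
    return base_note + 12 * octave + C_MAJOR[degree]
```

Because every position lands on a scale note, several people triggering different lines at once tend toward harmony rather than dissonance.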
I am using a Looking Outwards grace day for this assignment.
In 2013, Professor Francisco Vico of the University of Malaga created Iamus, a computer that can generate classical music scores at the touch of a button. According to Vico, Iamus’s scores become increasingly complex as they evolve over their duration, giving them a dynamic flow beyond a random progression of notes. The algorithm behind Iamus is inspired by biological processes, and a human then selects from the pieces Iamus provides. This work is admirable because it is groundbreaking, introducing artificial intelligence to the world of art and music in a new way. It is fascinating to watch these technologies progress, and Iamus is just the beginning of a new era in music.
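The biological inspiration is evolutionary: candidate pieces are varied and selected over generations, with a human picking from the survivors. A toy sketch of that loop, which reflects nothing of the actual Iamus system:

```python
import random

# Toy evolutionary loop: melodies are lists of MIDI notes, mutated
# each generation; a crude fitness function stands in for the human
# curator. Purely illustrative.

def mutate(melody, rate=0.2):
    return [n + random.choice([-2, -1, 1, 2]) if random.random() < rate else n
            for n in melody]

def fitness(melody):
    # Prefer small steps between consecutive notes.
    return -sum(abs(a - b) for a, b in zip(melody, melody[1:]))

def evolve(seed, generations=50, pop_size=20, keep=5):
    survivors = [seed]
    for _ in range(generations):
        offspring = [mutate(random.choice(survivors)) for _ in range(pop_size)]
        survivors = sorted(offspring, key=fitness, reverse=True)[:keep]
    return survivors[0]
```

Even this toy version shows how structure can emerge from variation plus selection, rather than from notes chosen at random.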
On the topic of computer-generated music, I found this 1997 album by David Cope titled Classical Music Composed by Computer: Experiments in Musical Intelligence. A professor at the University of California, Santa Cruz, Cope started as a trained musician but took a keen interest in computing as it rose in popularity. He realized how his musical approach paralleled that of programming, and at that point in his life he saw the opportunity to explore where music meets computing. Eventually, Cope created a program called Experiments in Musical Intelligence. The program generates music based on data collected from various scores, including Cope’s own music. For the album, the program generated the sheet music that the musicians played from in the recording. What I admire about this work is how it finds a balance between human-made sound and generative computing: in its final stage the music is played on instruments, but the original composition is derived from a computer program.
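As a much-simplified illustration of generating music from data collected from scores, a first-order Markov chain can recombine note transitions learned from source material. Cope’s actual system is far more sophisticated (it recombines analyzed phrase structures), so treat this only as a sketch of the general idea:

```python
import random
from collections import defaultdict

# Toy data-driven generator: learn which note follows which in the
# source melody, then walk those transitions to produce a new one.

def build_model(notes):
    model = defaultdict(list)
    for a, b in zip(notes, notes[1:]):
        model[a].append(b)
    return model

def generate(model, start, length=16):
    out = [start]
    while len(out) < length:
        choices = model.get(out[-1])
        if not choices:
            break
        out.append(random.choice(choices))
    return out
```

Every transition in the output was observed somewhere in the input, which is the core of the "derived from existing scores" balance the album explores.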