Alexander Chen works at Google Creative Lab, where he explores his interest in visualizing and augmenting audio and music through human interaction. His goal is not only to discover different ways music works but also to create playful programs that question what can make music: in his opinion, anything, from instruments to subway maps.

In his presentation for Eyeo (2017), he shows various projects he created, each of which produces or represents sound differently; some respond to voice input and match it to a song pattern, while another visualizes temporal lag in music. The one I found most interesting was a program that, linked to a mini keyboard, displayed colored dots whose placement represented the keys and whose transparency reflected the intensity, or volume, of each key press (a rough sketch of that kind of note-to-dot mapping appears at the end of this post). Beyond hearing the music, this program allowed people to see it almost as a story, with notes as strings of characters interacting in a complex dance; as the music grows more complex, the characters multiply and move faster, but in a pattern that stays clear to those listening. The visualization also helps people hear the music better, since they can see which sounds are made of multiple notes and where those notes sit relative to each other.

Another program I found very interesting is called Spectrogram, which lives on Chrome Music Lab, a website he designed as a playground for audio. Most of the programs he places on this website lack labels, which encourages children to discover and explore how the audio in each one is manipulated. I think his work allows people of all ages not only to learn how sound works but also to think about it in different, unconventional ways.
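
Out of curiosity, here is a minimal sketch of how the keyboard visualization's mapping might work. This is entirely my own guess in TypeScript, not Chen's actual code: pitch decides where a dot sits horizontally, and key-press intensity decides how opaque (and how big) it appears.

```typescript
// A minimal sketch (my own, not Chen's code) of mapping MIDI-style note
// events to visual dots: placement follows pitch, opacity follows how
// hard the key was pressed.

interface NoteEvent {
  pitch: number;    // MIDI note number, 0-127
  velocity: number; // key-press intensity, 0-127
}

interface Dot {
  x: number;       // horizontal position on a normalized 0-1 canvas
  y: number;       // vertical position (random scatter so chords don't overlap)
  radius: number;  // size grows gently with velocity
  opacity: number; // 0 (transparent) to 1 (opaque), from velocity
}

// Map one note event to the parameters of a dot.
function noteToDot(note: NoteEvent): Dot {
  const x = note.pitch / 127;          // low notes left, high notes right
  const opacity = note.velocity / 127; // soft presses fade, hard presses pop
  return {
    x,
    y: Math.random(),
    radius: 0.02 + 0.04 * opacity,
    opacity,
  };
}

// Usage: a C-major chord played with varying force.
const chord: NoteEvent[] = [
  { pitch: 60, velocity: 100 }, // C4, firm
  { pitch: 64, velocity: 70 },  // E4, medium
  { pitch: 67, velocity: 40 },  // G4, soft
];

for (const dot of chord.map(noteToDot)) {
  console.log(dot);
}
```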