The project I chose to look into was one called “Phantom Terrains”. This 2014 project aimed to capture the wireless data that constantly streams between routers and our devices and then use an algorithm to transform it into almost music-like audio. Not only does this project allow us to more fully understand the plethora of data that is constantly streaming around us, but it also does so in a beautiful and elegant way by combining audio and visual representations of these signals. I cannot begin to comprehend the amount of complex coding that was required to create such an algorithm, and I sincerely admire how the creators, Frank Swain and Daniel Jones, were able to apply it in such a beautiful way. Daniel Jones, specifically, has been working on combining art, sound, and technology to give us a better understanding of the world we live in.
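The creators have not published their exact algorithm here, but the basic idea of sonifying wireless signals can be sketched in a few lines. The following is a minimal, hypothetical illustration (not Phantom Terrains' actual method): it maps each router's signal strength (RSSI in dBm, with the -90 to -30 range and the pitch bounds chosen arbitrarily for the example) to a pitch, so stronger nearby networks sound higher, and strings the resulting tones into raw audio samples.

```python
import math

def rssi_to_freq(rssi_dbm, f_min=220.0, f_max=880.0):
    """Map a Wi-Fi signal strength (RSSI in dBm, roughly -90..-30)
    to an audible frequency: stronger signal -> higher pitch."""
    lo, hi = -90.0, -30.0
    t = (min(max(rssi_dbm, lo), hi) - lo) / (hi - lo)  # normalize to 0..1
    return f_min + t * (f_max - f_min)

def sonify(rssi_readings, sample_rate=8000, tone_sec=0.25):
    """Turn a sequence of RSSI readings into raw audio samples:
    one short sine tone per reading."""
    samples = []
    for rssi in rssi_readings:
        freq = rssi_to_freq(rssi)
        n = int(sample_rate * tone_sec)
        samples.extend(math.sin(2 * math.pi * freq * i / sample_rate)
                       for i in range(n))
    return samples

# Three hypothetical routers at increasing distance: pitch falls as signal weakens.
audio = sonify([-40, -65, -85])
```

The real project is of course far richer, layering rhythm and timbre from network names, speeds, and encryption, but even this toy version shows how invisible radio data can become something you can hear.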
Phantom Terrains was developed with funding from Nesta in the U.K. and can be experienced through Bluetooth hearing aids (along with being sampled on the project's website). This use of hearing aids grew out of the idea of re-working prosthetic technology into enhancement technology, allowing the user to hear a wider range of sounds than the average human ear ever could.
This project is deeply intriguing and has piqued my curiosity about just how much invisible data is swirling around us at all times, and just how much code can do to help us perceive it.
Link to the website: http://phantomterrains.com/