Algorithmic Drive – Spectacular car(au)tonomy

Sound surrounds us no matter where we are – in our rooms, on the boardwalk by the ocean, in class during final exams when everyone is focused on their work. Even in complete silence there is the sound of our breathing or of dust settling on the floor. Sound is always there, and the idea of music always follows it. Many would consider a loud street in the middle of New York City noisy or annoying, but as a native New Yorker I grew to hear music in the random Manhattan sounds. We can hear constant beats from honking cars or from soda cans clunking into trash bins. People’s conversations are lyrics about daily life, struggles and human routines. Music is everywhere, and I think it’s beautiful.

I chose a project called “Algorithmic Drive”, created by François Quévillon. “The work plays with the tension generated by confronting the technologies used by mobile robotics with the unpredictable nature of the world.” What I admire about this project is that technology and nature are connected here: robotics were used to capture the sound of the world around us. It’s fascinating. What I know about the algorithms that generated the work is that Quévillon built his own database of camera recordings and sensor data, collected while driving and linked to his car’s on-board computer. While the camera recorded video of the surroundings, the system read data from the on-board computer through a Bluetooth connection to the car’s OBD-II port, taking into account factors such as altitude, orientation, the car’s location and speed, engine RPM, stability and the temperature of various sensors. To sort all of the videos statistically from minimum to maximum value by parameters derived from the sounds, images, location and the car’s activity, a sampling system applied signal processing, data analysis and computer vision algorithms. The parameters were “mapped using a Uniform Manifold Approximation and Projection (UMAP) dimension reduction technique.” Audio features of the environment were extracted with sound analysis software, while visual features were obtained with OpenCV and through road scene segmentation using the SegNet deep convolutional encoder-decoder architecture and its model uncertainty. Moreover, the system that François Quévillon used in his project had “a custom-built interface with illuminated rotary encoders and a monitor installed on a road case that contains a subwoofer.”
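To make the sorting and mapping steps a little more concrete, here is a minimal sketch of the general idea – not the artist’s actual code. It assumes a table of per-clip parameters (speed, engine RPM, loudness and altitude are my own example choices, filled with synthetic values) and uses NumPy together with the umap-learn library to order clips from minimum to maximum along one parameter and to project all parameters into two dimensions.

# A hypothetical sketch, not François Quévillon's code: ordering clips by one
# parameter and mapping all parameters to 2D with UMAP, as described above.
import numpy as np
import umap  # from the umap-learn package

rng = np.random.default_rng(0)
n_clips = 200

# One row per recorded clip: speed (km/h), engine RPM, loudness (dBFS), altitude (m).
# These values are synthetic stand-ins for the real sensor and audio data.
features = np.column_stack([
    rng.uniform(0, 120, n_clips),
    rng.uniform(700, 4000, n_clips),
    rng.uniform(-40, -10, n_clips),
    rng.uniform(0, 300, n_clips),
])

# Sort clips from minimum to maximum value of one chosen parameter (here, speed).
order = np.argsort(features[:, 0])

# Reduce the full parameter space to two dimensions so that clips recorded under
# similar driving conditions end up close to one another.
embedding = umap.UMAP(n_components=2, random_state=0).fit_transform(features)
print(order[:5], embedding.shape)

In the installation, I imagine the illuminated rotary encoders navigating a parameter space of roughly this kind to choose which recordings are played back.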

One way the creator’s artistic sensibilities manifest in the final form is that François Quévillon was able to bring life into technology: robotics recorded the videos, and he built sounds from recordings of the natural world. It’s simply fantastic. We can hear music based on the ocean’s waves and on the trees growing at the side of the road, beats from raindrops, and melodies based on flying insects. This project brought technology and nature together to create music – it shows us that technology can be used not only for practical purposes but also for spirituality.

François Quévillon, ‘Algorithmic Drive’, 2018
