The project that I decided to look at was Expressions, by Kynd and Yu Miyashita. I found the video to be captivatingly intricate, and I was shocked when I read that it was all digitally rendered. I was even more shocked when I found out it was rendered entirely in 2D, using various layering and shading techniques rather than 3D geometry.
The images themselves were really cool, but the soundscape that went with them really elevated the experience. A brief warning if you are going to watch it: turn your sound down first. I got blasted with an intense, high-pitched screech right off the bat (which was cool… but scary). I thought the interplay between the sound and the visuals was really compelling.
When reading about the piece at https://www.creativeapplications.net/sound/expressions-paint-and-pixel-matiere-at-micro-scale/
I learned that the developer used WebGL, a JavaScript API for rendering 2D and 3D graphics in the browser, and then used TouchDesigner to add detail. When looking at WebGL samples, I found that they are very similar to what we are doing now. I also know that we can use WebGL in p5.js, so I wondered whether they were the same thing; from what I can tell, WebGL is the lower-level browser API, and p5.js can use it as one of its renderers. It was hard to find much more information, but it was interesting to see what I could do in the future if I keep up with programming.
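To see how the two fit together, here is a minimal sketch of my own (not code from Expressions, just an illustration based on the p5.js documentation) that passes WEBGL as the third argument to createCanvas(), which tells p5.js to draw through the browser's WebGL API instead of the default 2D canvas:

// Minimal p5.js sketch using the WEBGL renderer (my own example, not from the project).
function setup() {
  createCanvas(400, 400, WEBGL); // third argument switches p5.js onto WebGL
}

function draw() {
  background(20);
  // In WEBGL mode the origin sits at the center of the canvas,
  // and 3D primitives like box() become available.
  rotateX(frameCount * 0.01);
  rotateY(frameCount * 0.01);
  normalMaterial(); // built-in p5.js material that colors faces by their orientation
  box(100);
}

So p5.js and WebGL aren't the same thing: p5.js is the friendly library we write, and WebGL is the lower-level API it can hand the drawing off to.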