For this project, I built on my first project, adding tap-for-BPM control and pose recognition using machine learning and the Leap Motion controller.
I kept the same overall layout: the video feed is split into separate RGB color planes, which are then offset against one another. Instead of a single looping video, however, I created a playlist of videos that can be switched by making a fist. I also altered the playback speed of the video using the position of the right palm over the sensor.
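The palm-to-speed mapping can be sketched as a simple linear scaling from hand position to playback rate. This is a minimal illustration, not the actual patch: the tracking range, the rate bounds, and the assumption that vertical height drives the rate are all hypothetical here.

```python
def palm_to_rate(palm_y_mm, y_min=100.0, y_max=400.0,
                 rate_min=0.25, rate_max=2.0):
    """Map palm height over the sensor (mm) to a video playback rate.

    All ranges are hypothetical; the original patch's scaling is not
    specified in the write-up.
    """
    # Clamp to the usable tracking range so stray readings don't explode
    y = max(y_min, min(y_max, palm_y_mm))
    # Linear interpolation from height to rate
    t = (y - y_min) / (y_max - y_min)
    return rate_min + t * (rate_max - rate_min)
```

For example, a palm midway through the range (250 mm with these defaults) yields a rate of 1.125x, while the extremes pin to 0.25x and 2.0x.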
Instead of the problematic beat-detection object from the first version, I built a simple tap-for-BPM mechanism using a timer and a few zl objects: the timer measures the interval between successive taps, and the zl objects collect recent intervals so they can be averaged into a tempo.
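The tap-tempo logic can be sketched outside Max as follows. This is a rough Python equivalent of the timer-plus-zl idea, not the actual patch; the window size and class names are my own.

```python
import time

class TapTempo:
    """Estimate BPM by averaging the intervals between recent taps,
    analogous to a Max timer feeding a zl-based averaging chain."""

    def __init__(self, window=4):
        self.window = window     # how many inter-tap intervals to average
        self.intervals = []      # recent intervals in milliseconds
        self.last_tap = None

    def tap(self, now_ms=None):
        """Register a tap; return the current BPM estimate, or None
        until at least two taps have been received."""
        if now_ms is None:
            now_ms = time.monotonic() * 1000.0
        if self.last_tap is not None:
            self.intervals.append(now_ms - self.last_tap)
            self.intervals = self.intervals[-self.window:]  # sliding window
        self.last_tap = now_ms
        if not self.intervals:
            return None
        avg_ms = sum(self.intervals) / len(self.intervals)
        return 60000.0 / avg_ms  # ms per beat -> beats per minute
```

Tapping at a steady 500 ms spacing, for instance, settles the estimate at 120 BPM.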
If I were to take this further, I would look for more interesting parameters to tweak, as well as ways to add more visual diversity.