musician quiddity

I want to use motion detection to examine the movement of musicians as they play, particularly looking at the relationships between their movements.

Some ideas:

  1. frame differencing:

    look at the actual pixels that are moving. this is very easy, but may yield only small smudges of motion. (first sketch after this list)
  2. frame diff -> threshold -> blob
    rather than show the actual pixels, create a shape from the moving pixels and expand it out into an abstracted blob. (second sketch below)
    – thinking a lot about Forsythe, formalizing movement
    – Dalcroze Eurhythmics, https://www.cmu.edu/cfa/music/people/Bios/neely_stephen.html
  3. body tracking
    will capture more abstract, high-level objects rather than just pixels, but will create more ‘standard’ shapes. I worry this is gonna be kinda ‘filtery’. (third sketch below)
  4. maybe combine both somehow
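
A minimal sketch of idea 1, assuming Python with OpenCV (`cv2`) and a webcam (or a path to a video of the performance); it just displays the absolute per-pixel difference between consecutive grayscale frames:

```python
import cv2

cap = cv2.VideoCapture(0)  # webcam; or pass a path to a performance video
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # absolute difference between consecutive frames:
    # moving regions light up, the static background stays black
    diff = cv2.absdiff(gray, prev_gray)
    cv2.imshow("frame diff", diff)
    prev_gray = gray
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```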
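For idea 2, a sketch of the threshold -> blob step, again assuming OpenCV (4.x); the threshold, dilation, and minimum-area values here are placeholder guesses to tune against real footage:

```python
import cv2

def motion_blobs(prev_gray, gray, thresh=25, dilate_iters=8, min_area=500):
    """Turn raw frame-diff pixels into larger, abstracted blob shapes."""
    diff = cv2.absdiff(gray, prev_gray)
    # keep only pixels that changed by more than `thresh`
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    # dilate to grow the thin motion smudges into solid regions
    mask = cv2.dilate(mask, None, iterations=dilate_iters)
    # trace outlines, drop tiny regions, smooth each one into a convex hull
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.convexHull(c) for c in contours if cv2.contourArea(c) >= min_area]
```

drawing the returned hulls filled (e.g. `cv2.drawContours(frame, blobs, -1, (255, 255, 255), -1)`) gives the abstracted shapes rather than the raw smudges.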
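For idea 3, one possible route (an assumption, not a commitment to any particular library) is MediaPipe Pose, which returns named landmarks per frame that could be logged and compared across musicians:

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

cap = cv2.VideoCapture(0)  # webcam; or a video file
with mp_pose.Pose(min_detection_confidence=0.5) as pose:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe wants RGB; OpenCV frames are BGR
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            lm = results.pose_landmarks.landmark
            # e.g. track the wrists over time, the kind of high-level,
            # 'standard' object this approach yields
            left = lm[mp_pose.PoseLandmark.LEFT_WRIST]
            right = lm[mp_pose.PoseLandmark.RIGHT_WRIST]
            print((left.x, left.y), (right.x, right.y))
        cv2.imshow("pose", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```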