Exercise: Motion Primitives

This exercise extends Exercise: Mechanical Duo into a short performance using the simple machines already built. The high-level goal is to move beyond the individual gesture toward a composition that expresses a larger idea or story. The expressive performance goals remain the same: creating a sense of identity, embodying empathy and communication, and expressing a relationship. These are now applied over time to convey contrast and progression.

In terms of dance or music, this might be considered developing a longer passage out of individual phrases.

In dramatic terms, this might be considered creating a scene out of beats.

In robotics, this might be considered composing motion primitives into a task solution. The idea of a motion primitive is broadly defined in the literature, but it generally includes the notion of a repeatable, parameterized action constituting a sensible atomic unit of activity for a given system. For example, for a grasping hand, moving multiple fingers into contact with a rigid object might itself be a complex process, but it is generally a logical unit of activity with a well-defined postcondition.

In computer science, this goal maps naturally onto decomposing a task into a hierarchy of functions. A motion primitive might be described both as a procedure for execution and as a state operator for search-based planning.
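This dual view can be made concrete in Python. The sketch below is illustrative only, assuming a hypothetical Primitive class (none of these names come from the exercise scaffolding): execute() is the runtime procedure, while apply() is the symbolic state operator a planner could search over.

```python
from dataclasses import dataclass

@dataclass
class Primitive:
    """Hypothetical motion primitive: a procedure plus a state operator."""
    name: str
    duration: float   # nominal duration in seconds

    def execute(self, controller, **params):
        """Run the primitive on a hardware controller (side effects)."""
        raise NotImplementedError

    def apply(self, state: frozenset) -> frozenset:
        """State operator: map a symbolic world state to its successor."""
        return state

class Reach(Primitive):
    def apply(self, state):
        # After reaching, the effector is in contact and no longer retracted.
        return (state - {"retracted"}) | {"in_contact"}

s0 = frozenset({"retracted"})
s1 = Reach("reach", 1.5).apply(s0)   # planner-side use, no hardware needed
```

A planner can chain apply() calls to search over sequences, while a performance runtime calls execute() on the same objects.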

In all domains, a recurring theme is choosing the right level of descriptive abstraction and a good notation for the action. A convenient notation supports a composer in thinking in more abstract terms and working at a broader scope.

The idea of improvisation is central to our study of composition. Written symbolic notation can be rich and dense, but the process is constrained by the cognitive load on the composer, who must mentally model the outcome while encoding it. This stands in contrast to a notation compatible with real-time interactive gestural expression, which can be explored through improvisatory play. Improvisation emphasizes observation and intuitive response, but perhaps de-emphasizes larger-scale formal structure. In the end, improvisatory and deliberative composition complement each other, so it is useful to choose abstractions that support both.

Learning Objectives

  1. Formulating a two-agent narrative as a physical animation concept.

  2. Identifying a set of motion primitives which can be composed to create narratives.

  3. Formulating a notation for encoding parameterized sequences of primitives.

  4. Implementing the notation in Python using structures compatible with both real-time improvisation and scripted autonomous execution.

  5. Exploring improvisation of the motions.

  6. Scripting an autonomous performance.
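As a concrete illustration of objectives 3 and 4, the notation can be ordinary Python data that the same interpreter consumes either one event at a time (e.g. from a MIDI callback) or as a whole script. All primitive names and parameters below are invented for illustration; they are not part of the exercise scaffolding.

```python
# A score is plain data: a parameterized sequence of named primitives.
score = [
    ("sway",  {"freq": 0.5, "amp": 1.0}),
    ("reach", {"target": 2.0}),
    ("sway",  {"freq": 0.8, "amp": 0.5}),
]

def perform_event(name, params, log):
    """Trigger one primitive; here we log instead of moving hardware."""
    args = ", ".join(f"{k}={v}" for k, v in sorted(params.items()))
    log.append(f"{name}({args})")

log = []
# Scripted mode: iterate the whole score autonomously.
for name, params in score:
    perform_event(name, params, log)
# Improvised mode would instead call perform_event() from a MIDI
# event handler, one trigger per incoming note or controller change.
```

Because the notation is just lists and dicts, the same data can be generated live, saved to a file, or edited by hand.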

Primitives

In practical terms, we will develop a movement vocabulary and then trigger it with MIDI events in the context of a few global parameters. The StepperWinch firmware already models this at a very low level by encouraging thinking about damped sinusoids as the foundation for movement, with frequency and damping as the parametric context.
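The shape of such a movement can be sketched numerically. The function below is illustrative only, assuming a generic underdamped second-order step response; the actual StepperWinch firmware model may differ in detail.

```python
import math

def damped_step(t, target, freq=1.0, damping=0.3):
    """Position at time t of a damped oscillation settling on `target`.

    freq is the natural frequency in Hz; damping is the dimensionless
    damping ratio (0 < damping < 1 for an underdamped response).
    """
    omega = 2 * math.pi * freq                      # natural frequency, rad/s
    omega_d = omega * math.sqrt(1 - damping ** 2)   # damped frequency
    # Start at zero, oscillate, and settle on the target.
    return target * (1 - math.exp(-damping * omega * t) * math.cos(omega_d * t))

# Sample the first few seconds of a move toward position 1.0.
samples = [damped_step(0.5 * i, 1.0) for i in range(8)]
```

Changing freq and damping reshapes every gesture built on this curve, which is why they work well as shared parametric context for a whole vocabulary.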

The selection of motion primitives can be motivated by the essential artistic goals. For example, if the performance is highly narrative, the primitives might correspond to individual dramatic beats, including both actions which reveal intent and reactions which reveal emotion. Or if the performance is more about explicating a relationship, the primitives might include several basic interactional transactions (e.g. a touch, a retreat) which are parameterized by global mood settings (e.g. tempo and magnitude).

In both cases, these primitives can be used as building blocks for improvising varied scenes using sequence, repetition, and variation.
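Sequence, repetition, and variation can all be expressed with ordinary list operations. This is a hedged sketch; the primitive names and the "amp" mood parameter are invented for illustration.

```python
def vary(phrase, magnitude):
    """Variation: scale every amplitude in a phrase by a mood magnitude."""
    return [(name, {**p, "amp": p.get("amp", 1.0) * magnitude})
            for name, p in phrase]

# Two short phrases built from invented interactional primitives.
touch   = [("approach", {"amp": 1.0}), ("contact", {"amp": 0.2})]
retreat = [("withdraw", {"amp": 0.8})]

# Sequence and repetition: touch twice, then retreat, softened by variation.
scene = touch + touch + vary(retreat, 0.5)
```

Because phrases are just lists, a scene is itself a phrase and can be repeated or varied in turn, giving the composition a natural hierarchy.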

Scaffolding

The technical foundation for the exercise can be found in exercise4.zip. This version includes a sample application script and several tools and associated files:

  exercise4.py
  exercise4.poses
  script/ex4demo.py
  dmx_controller.py
  list_MIDI_ports.py
  midi_display.py
  osc_display.py

More detailed documentation on the sample code can be found under exercise4.py sample code, and library documentation under rcp Python library.

Deliverables

  1. A video recording of the short performance.

  2. A short blog post outlining goals and outcomes, including the video documentation and final code.