samuelgo – Project 2: F8R

For this project I developed an interactive performance system powered by Max.

The basic premise of the patch is to capture and play back gestural input. Max outputs these gestures as MIDI CC data, which is then converted to CV signals that control a hardware synthesizer.
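
To make that last step concrete, here is a minimal sketch in Python using the mido library (neither Python nor mido is part of the actual patch, where Max's own MIDI objects handle this): a normalized fader value is scaled to a 7-bit CC value and sent to a MIDI output, and a downstream MIDI-to-CV converter turns that into a control voltage. The CC numbering is an assumption for illustration.

```python
# Illustrative sketch only -- the real patch sends CC directly from Max.
# Assumes the 'mido' library and an available default MIDI output port.
import mido

def send_fader_cc(outport, fader_index, value):
    """Scale a normalized fader value (0.0-1.0) to 7-bit MIDI and send it
    as a control change message. Mapping faders 0-7 to CC 1-8 is an
    assumption, not taken from the original patch."""
    cc_number = 1 + fader_index
    cc_value = max(0, min(127, round(value * 127)))
    outport.send(mido.Message('control_change',
                              control=cc_number,
                              value=cc_value))

with mido.open_output() as outport:      # default MIDI output port
    send_fader_cc(outport, 0, 0.75)      # fader 1 at three-quarters up
```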

Three Max objects play a major role in this patch:

  1. mira.frame – The graphical interface for gestural input is built in a mira.frame object, which mirrors the interface to an iPad connected over WiFi to the computer running Max.
  2. mira.multitouch – The mira.multitouch object collects multitouch data from the iPad hosting the mira.frame interface; touch state and y-position are the key values used in this patch.
  3. mtr – The mtr object, wrapped with some custom logic, records and plays back the gestural input data coming from the mira.multitouch object (see the sketch after this list).
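
For readers who don't use Max, here is a rough Python analogue of the record/playback behavior that mtr provides: incoming gesture values are stored with timestamps during recording and replayed with the same timing on playback. This is a sketch of the behavior, not the patch's implementation.

```python
# Minimal analogue of mtr-style record/playback of gesture data.
import time

class GestureRecorder:
    def __init__(self):
        self.events = []          # list of (elapsed_seconds, value) pairs
        self.start_time = None

    def start(self):
        """Begin a new recording pass."""
        self.events.clear()
        self.start_time = time.monotonic()

    def record(self, value):
        """Store an incoming gesture value (e.g. a touch y-position)."""
        self.events.append((time.monotonic() - self.start_time, value))

    def play(self, callback):
        """Replay the recorded values in real time via a callback."""
        playback_start = time.monotonic()
        for timestamp, value in self.events:
            delay = timestamp - (time.monotonic() - playback_start)
            if delay > 0:
                time.sleep(delay)
            callback(value)
```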

The core engine in this patch can be extended or augmented to support many types of gestural input. In this implementation, the graphical interface consists of eight faders whose values can be set by hand or automated via recorded gestures.
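
As one plausible way the multitouch data could drive those faders (assuming normalized touch coordinates and a simple column-based routing, neither of which is confirmed by the patch), a touch's x-position could select a fader and its y-position could set the value:

```python
# Hypothetical routing: divide the touch surface into 8 equal columns and
# use the y-position as the fader value. An assumed mapping for illustration.
NUM_FADERS = 8
fader_values = [0.0] * NUM_FADERS

def handle_touch(x_norm, y_norm):
    """x_norm and y_norm are touch coordinates normalized to 0.0-1.0."""
    fader_index = min(NUM_FADERS - 1, int(x_norm * NUM_FADERS))
    fader_values[fader_index] = 1.0 - y_norm   # top of screen = fader maximum
    return fader_index, fader_values[fader_index]
```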

The embedded video demonstrates the patch in a live performance.

Project Resources: https://drive.google.com/drive/u/0/folders/1htNu8UGfB6_NB_QtNnzEGnrOOXTfRYm2