For the final project I decided to further explore the connection between motion and sound. I incorporated data from the Myo armband into a music synthesizer that uses several techniques I learned in this class.
The synthesizer is composed of two main parts: the motion-data reading section and the music control section. I used an open-source Myo-to-OSC bridge (https://github.com/samyk/myo-osc) and UDP messaging to read the armband data. From it I obtain normalized quaternion values as well as several gesture readings. These data provide a solid foundation for a stable translation from motion to sound.
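To turn the raw quaternion values into intuitive control signals (arm up/down, left/right, rotation), a standard step is converting the quaternion to Euler angles. The sketch below is a minimal, stdlib-only illustration of that conversion; the function name and axis conventions are my own assumptions, not part of the myo-osc bridge.

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to (roll, pitch, yaw) in radians.

    Axis conventions are assumed for illustration: roll ~ arm rotation,
    pitch ~ up/down, yaw ~ left/right.
    """
    # Roll: rotation about the x-axis
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # Pitch: rotation about the y-axis; clamp to avoid asin domain errors
    # when floating-point noise pushes the value slightly past +/-1
    sinp = max(-1.0, min(1.0, 2 * (w * y - z * x)))
    pitch = math.asin(sinp)
    # Yaw: rotation about the z-axis
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return roll, pitch, yaw
```

With angles in hand, each axis can drive one synthesis parameter independently, which is what makes the per-axis mappings below stable.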
I selected pitch, playback speed, timbre, and reverberation as the manipulation parameters. I downloaded music as separate instrument stems so that I could play with the parameters on individual tracks without interfering with the overall musical flow. After many trials, I settled on the following mappings:
- The up/down motion of the arm changes the pitch of the timpani.
- The left/right motion of the arm changes the playback speed of both the timpani and percussion parts of the music.
- The fist/rest gesture switches the core melody between a piano-based and a bass-based line.
- The rotation of the arm changes the reverberation delay time of the piano melody.
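The mappings above all follow the same pattern: take a motion reading in some input range and scale it into a musically useful parameter range. A minimal sketch of that pattern is shown below; the parameter names, input ranges, and output ranges are hypothetical placeholders chosen for illustration, not the values used in the actual project.

```python
import math

def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max], clamped."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

def map_motion(pitch_rad, yaw_rad, roll_rad, fist):
    """Translate arm angles and the fist gesture into synth parameters.

    All ranges below are illustrative assumptions.
    """
    return {
        # Up/down arm angle -> timpani pitch shift in semitones
        "timpani_pitch_semitones": scale(pitch_rad, -1.0, 1.0, -12.0, 12.0),
        # Left/right arm angle -> playback speed multiplier
        "playback_speed": scale(yaw_rad, -1.0, 1.0, 0.5, 2.0),
        # Fist/rest gesture -> which melody stem is active
        "melody_stem": "bass" if fist else "piano",
        # Arm rotation -> reverb delay time in milliseconds
        "reverb_delay_ms": scale(roll_rad, -math.pi, math.pi, 0.0, 500.0),
    }
```

Clamping the input keeps a wild arm swing from pushing a parameter outside its usable range, which helps the translation stay stable.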
I recorded a section of the generated music, which is shown below:
The code for the project is as follows: