jhueteso – Project 2: LIDAR live interactive sequencer

For this project I wanted to build something interactive. The end goal was a tool people could use to make music in a cooperative way. An important requirement was responsiveness and intuitiveness: I wanted something where you could figure out what you were expected to do just by moving things around.

For this project I used the LIDAR sensor I am currently using for my bachelor’s research project. A LIDAR sensor builds a 2D map of its surroundings by emitting a laser pulse and timing its reflection (similar to SONAR, hence the name).
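As a rough sketch of the time-of-flight idea (just the physics, not the sensor's actual firmware): the pulse travels out and back, so the one-way distance is half the measured path.

```python
SPEED_OF_LIGHT = 299_792_458  # metres per second

def tof_distance_m(round_trip_s: float) -> float:
    """Distance implied by a laser round-trip time.

    The pulse travels to the object and back, so the one-way
    distance is half the total path covered at light speed.
    """
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# A round trip of ~6.67 nanoseconds corresponds to about 1 metre.
```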

Figure: 2D map of a room.
Figure: RPLIDAR – 360 degree Laser Scanner Development Kit (the sensor used).

Now, to play some music, I divided the 360º rotation into a series of “slices”. Each slice represents a time step for a note (think of the numbers on a clock face: each one is played when the hand reaches it).

Each slice is assigned a value encoding the distance to the first object it detects. We now have clean angle and distance values we can use in Max.
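The binning step can be sketched like this: raw (angle, distance) points from the scan are reduced to one distance per slice, keeping the nearest object in each. The slice count of 16 is an assumption; the original doesn't state how many slices were used.

```python
import math

NUM_SLICES = 16  # assumed; the number of steps in the sequencer loop

def bin_scan(points):
    """Reduce raw (angle_deg, distance_mm) LIDAR points to one value per slice.

    Each slice keeps the distance to the nearest object inside it,
    so every 'clock position' ends up with a single note parameter.
    Empty slices stay at infinity (nothing detected).
    """
    slices = [math.inf] * NUM_SLICES
    width = 360.0 / NUM_SLICES
    for angle, dist in points:
        idx = int((angle % 360.0) // width)
        slices[idx] = min(slices[idx], dist)
    return slices

scan = [(10.0, 1200.0), (12.0, 900.0), (100.0, 400.0)]
# slice 0 keeps the nearer of 1200 and 900; slice 4 keeps 400
```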

My patch can play either drums or a melody. In the drums section, the distance to the sensor controls the type of drum played: hihat, snare & hihat, snare, or kick. In the melody section, the distance to the sensor controls pitch: the closer to the sensor, the higher the pitch. I also added two filters in parallel to give the user some control over the final sound.
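The two distance mappings could look roughly like this. The thresholds, the distance-to-drum ordering, and the MIDI note range are all assumptions for illustration; the original patch's exact values aren't given.

```python
def drum_for_distance(dist_mm: float) -> str:
    """Pick a drum sound from the distance to the sensor.

    Thresholds are hypothetical; the ordering (hihat nearest,
    kick farthest) is one plausible reading of the writeup.
    """
    if dist_mm < 500:
        return "hihat"
    elif dist_mm < 1000:
        return "snare+hihat"
    elif dist_mm < 1500:
        return "snare"
    return "kick"

def pitch_for_distance(dist_mm: float,
                       min_mm: float = 200.0, max_mm: float = 3000.0,
                       low_note: int = 48, high_note: int = 84) -> int:
    """Map distance to a MIDI note: closer to the sensor means higher pitch."""
    d = min(max(dist_mm, min_mm), max_mm)          # clamp to the usable range
    frac = (max_mm - d) / (max_mm - min_mm)        # 1.0 at the closest point
    return round(low_note + frac * (high_note - low_note))
```

In Max this kind of range mapping is typically done with a `scale` object; the Python version just makes the arithmetic explicit.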

Finally, I added an interface that renders a series of cubes representing the note being played.

Link to Drive