[Video: The V Motion Project, from Assembly on Vimeo.]
The V Motion Project is a pretty successful motion-tracking DJ performance, using a dancer, Kinect technology, and a great big projection wall to visualize the parameters controlled by the performer’s movements.
The track comes from Joel Little of Kids of 88 and Goodnight Nurse. Below is a track of his from Ko88, which you can download for free here in exchange for a subscription. Goodnight Nurse isn’t nearly as good, so I’m not going to bother.
Anyway, the outstanding James Hayday broke Can’t Help Myself down and rebuilt it in Ableton, where lots of technical stuff happened that isn’t as complicated as you’d think.
To avoid lag, the bane of systems like this, they actually ran two Kinects simultaneously, each tracking a separate aspect of the video input. To eliminate interference between them, they put a small motor in one Kinect to wiggle it constantly, which blurs the other camera’s signal so it can be ignored. Cool, right? Here’s a short screen-capture video of the Ableton session in action.
The music system works by connecting the Kinect camera to Ableton Live, music sequencing software usually used by DJs and musicians during live performances. Below is a screen capture of our Ableton setup. The interface is full of dials, knobs, switches and buttons. Normally, a musician would use a physical control panel covered with knobs, dials, and switches to control Ableton’s virtual ones. Paul’s music system works by allowing us to map body movements to Ableton’s controls. For example, when you touch your head with your left hand a certain loop could start. Or you could control the dry/wet filter with the distance between your hands. This ability to map physical motion to actions in Ableton is enormously powerful.
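To make that mapping idea concrete, here’s a minimal sketch (not the V Motion code) of how a pose-to-Ableton mapping might look: the distance between the hands is scaled to a MIDI CC value, which Ableton can MIDI-map to any knob, such as a dry/wet filter. The mido library, the virtual port name, and the get_skeleton() frame source are all assumptions for illustration.

```python
import mido  # assumes the python-rtmidi backend so a virtual MIDI port can be created

def hand_distance_to_cc(left_hand, right_hand, max_span=1.2):
    """Scale the 3D distance between the hands (in metres) to a 0-127 CC value."""
    dx, dy, dz = (l - r for l, r in zip(left_hand, right_hand))
    dist = (dx * dx + dy * dy + dz * dz) ** 0.5
    return max(0, min(127, int(dist / max_span * 127)))

# 'To Ableton' is an arbitrary port name; in Ableton you'd enable it as a
# MIDI input and then MIDI-map CC 1 to whatever control you want to drive.
with mido.open_output('To Ableton', virtual=True) as port:
    for frame in get_skeleton():  # hypothetical per-frame Kinect skeleton source
        value = hand_distance_to_cc(frame['hand_left'], frame['hand_right'])
        port.send(mido.Message('control_change', control=1, value=value))
```

The appeal of routing everything through MIDI CC is that the gesture code doesn’t need to know anything about the song: the performer (or James Hayday, in this case) decides inside Ableton what each control number actually does.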
There’s a wealth of information on the technology at the Custom Logic site about the evolution of the air keyboard from being a grid in front of the player to a semicircle around him. It also discusses the lag issue in greater depth and talks more about which specific parameters are controlled. Now that I’ve linked to it, I shall post from it!
The performer triggers vox samples, low-frequency oscillations (a.k.a. dubsteppiness), drum filters, and my favorite, the “ball of dough” controller. He can also cycle through different settings for different song sections, with parameters like how close his chest is to the ground controlling volume.
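Here’s a rough sketch of those two ideas, cycling through per-section presets and scaling chest height into a volume value. This is an illustration under assumptions, not the project’s actual code; the section names and the standing-height constant are made up.

```python
SECTIONS = ['intro', 'verse', 'drop', 'outro']  # hypothetical song sections

class SectionCycler:
    """Advance to the next section preset each time the cycle gesture fires."""
    def __init__(self, sections):
        self.sections = sections
        self.index = 0

    def next_section(self):
        self.index = (self.index + 1) % len(self.sections)
        return self.sections[self.index]

def chest_height_to_volume(chest_y, floor_y=0.0, standing_y=1.4):
    """Map chest height above the floor (metres) to a 0-127 volume value."""
    norm = (chest_y - floor_y) / (standing_y - floor_y)
    return max(0, min(127, int(norm * 127)))

# Example: crouching low pulls the volume down, standing tall pushes it up.
cycler = SectionCycler(SECTIONS)
print(cycler.next_section())           # 'verse'
print(chest_height_to_volume(0.7))     # roughly half volume
```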
Overall, this is a great step toward engaging electronic performance. While we’re still dealing with a handful of parameters controlled live alongside lots of prerecorded tracks, CPU power is inching toward the point where we’ll finally be able to pull off a live performance on par with what we’ve got in our heads. Or will it never be as engaging as four cats with instruments and microphones? I guess we’ll just have to wait and see.