Waving our arms around is something we've always associated with music, whether that's playing air guitar, conducting at The Proms or simply waving them in the air like we just don't care. Over the last couple of years, using our bodies to control devices via gestures has become a reality, inspired perhaps by Tom Cruise waving his arms about in the film Minority Report, but with limited success. Technologies such as the Xbox Kinect and Leap Motion have had a good go at it, but neither has set the world alight. One of the latest attempts comes from a company called Thalmic Labs, who created the Myo, a wearable armband filled with EMG muscle sensors and motion sensors that combine the twists of your arm with movement to generate controlling data. What does all this have to do with music? Well, Precision Music Technology has adopted the Myo and produced a piece of software called Leviathan, which takes the gesture data and feeds it into DAW software.
Leviathan allows you to perform chords and control FX parameters using customisable gestures via MIDI and OSC. A hand movement, for example, could change a chord, an arm rotation could control filter cut-off, or a grab could kick off a new scene. The Myo's gesture and motion sensors work out what your hands and fingers are doing and transmit that over Bluetooth to your Mac. Leviathan runs in the background, interpreting your gestures and acting as a MIDI input to your DAW. You set up what you want to control within Leviathan, which then pushes the MIDI to your DAW. The website has video tutorials on setting it up with all the major DAWs, which is very helpful.
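For the curious, the kind of mapping Leviathan performs can be sketched in a few lines of code. This is not Leviathan's actual implementation, just an illustration of the idea: a gesture reading (say, arm roll in degrees from the Myo's motion sensors) is scaled into a MIDI Control Change message that a DAW would see as filter cut-off. The CC number (74) and the roll range are assumptions for the example.

```python
def roll_to_cc(roll_degrees, cc_number=74, channel=0,
               roll_min=-90.0, roll_max=90.0):
    """Scale an arm-roll reading to a 3-byte MIDI Control Change message."""
    # Clamp the sensor reading to the expected range, then scale to 0-127.
    clamped = max(roll_min, min(roll_max, roll_degrees))
    value = round((clamped - roll_min) / (roll_max - roll_min) * 127)
    # Status byte: 0xB0 = Control Change, low nibble is the MIDI channel.
    status = 0xB0 | (channel & 0x0F)
    return bytes([status, cc_number, value])

# Arm flat (0 degrees) lands mid-range: filter half open.
print(roll_to_cc(0.0).hex())  # -> 'b04a40'
```

Software like Leviathan would stream messages like these continuously as your arm moves, so the DAW hears the gesture as an ordinary MIDI controller.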
For live performance I can see how this could be an exciting technology; in the studio, perhaps less so. In fact, economy of movement can be an important factor in getting through long days making and mixing music. That's always the problem: arm ache. But the Myo is the best gestural technology I've seen yet, so maybe it's time to man up and start gesturing at our computers.