As a musician, the concept of interacting with your music, or what we might call “playing your instrument”, seems so obvious that you’d be forgiven for wondering what design has to do with it. Anything can become an instrument: you pick it up and interact with it, bang it, slap it, blow it, pluck it. But in the digital realm all of that interactivity is so often narrowed down to small movements of your hand on a mouse. Interaction design looks for meaningful relationships between people and the products they use, and in this case the focus is on the relationship between the musician and the computer.
I have a strong personal interest in exploring different ways of making music on computers. I’ve had a large touch screen for years, produced articles for Sound On Sound magazine about using multi-touch in music software, and use a Microsoft Surface for my own live performances. I’ve been fascinated by gesture control, sensor and virtual reality technology, and have made my arms ache trying to get to grips with devices such as the Leap Motion. Alongside virtual and augmented reality, we’ve seen an increase in wearable controllers, gloves and armbands which interpret our movements and feed them into the computer to help us shape our music. On the consumer side of the experience, musical games and technology like the Xbox Kinect have enabled listeners to interact with and change what they’re listening to. It’s a very exciting time, and it’s only just being explored.
MiXD 2016 hopes to bring designers, experimenters, musicians and artists together at the Birmingham Conservatoire to draw new perspectives on user interface and interaction design for music and sound. They are inviting demos from anyone with something to offer on a range of topics including: software/hardware interfaces; novel controllers; sensor technology; augmented and mixed reality; game audio and sound design. The submission deadline is 24th February, and the event will be held on 16th March 2016.
More information: http://integra.io/mixd2016/