Non-invasive sensing and gesture control for pitched percussion hyper-instruments using the Kinect

Hyper-instruments extend traditional acoustic instruments with sensing technologies that digitally capture subtle and sophisticated aspects of human performance. They leverage the long training and skills of performers while simultaneously providing rich possibilities for digital control. Many existing hyper-instruments suffer from being one-of-a-kind instruments that require invasive modifications to the underlying acoustic instrument. In this paper we focus on the pitched percussion family and describe a non-invasive sensing approach for extending these instruments into hyper-instruments. Our primary concern is to retain the technical integrity of the acoustic instrument and its sound production methods while providing an intuitive interface to the computer. This is accomplished by using the Kinect sensor to track the positions of the mallets without any modification to the instrument, which enables easy and inexpensive replication of the proposed hyper-instrument extensions. In addition, we describe two approaches to higher-level gesture control that remove the need for additional control devices such as foot pedals and fader boxes, which are frequently used in electro-acoustic performance. This gesture control integrates more organically with the natural flow of playing the instrument, providing user-selectable control over filter parameters, synthesis, sampling, sequencing, and improvisation using a commercially available, low-cost sensing apparatus.
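
To make the mapping idea concrete, the following is a minimal sketch (not the authors' implementation) of how a mallet position extracted from a Kinect depth frame could be turned into a continuous control message for a synthesis engine. It assumes the depth frame arrives as a NumPy array (e.g. via libfreenect's Python bindings) and uses the python-osc library to send the value; the region of interest, the depth range, and the OSC address "/hyper/filter/cutoff" are illustrative placeholders, not values from the paper.

```python
# Sketch: track the nearest point above the instrument as the mallet head
# and map its height to a normalized filter-cutoff value sent over OSC.
import numpy as np
from pythonosc.udp_client import SimpleUDPClient

# Synthesis engine (e.g. a Pd patch) assumed to be listening on this port.
client = SimpleUDPClient("127.0.0.1", 9000)

def track_mallet(depth_frame: np.ndarray,
                 roi=(slice(0, 240), slice(0, 640))):
    """Return (row, col, depth_mm) of the nearest valid point inside the
    region above the instrument, treated here as the mallet head."""
    region = depth_frame[roi].astype(float)
    region[region == 0] = np.inf          # 0 means "no reading" on the Kinect
    idx = np.unravel_index(np.argmin(region), region.shape)
    return idx[0] + roi[0].start, idx[1] + roi[1].start, region[idx]

def send_gesture(depth_frame: np.ndarray, min_mm=500.0, max_mm=1500.0):
    """Map mallet height (closer = higher value) to [0, 1] and send it."""
    _, _, depth_mm = track_mallet(depth_frame)
    cutoff = float(np.clip((max_mm - depth_mm) / (max_mm - min_mm), 0.0, 1.0))
    client.send_message("/hyper/filter/cutoff", cutoff)
```

In this sketch the non-invasive character of the approach comes from the fact that only the camera's depth image is processed; nothing is attached to the mallets or the instrument itself.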
