Digital Sensing of Musical Instruments

Acoustic musical instruments enable very rich and subtle control when played by experienced musicians. Musicology has traditionally focused on the analysis of scores and, more recently, audio recordings. However, most music from around the world is not notated, and many nuances of music performance are difficult to recover from audio recordings alone. In this chapter, we describe hyperinstruments, i.e., acoustic instruments augmented with digital sensors that capture performance information and, in some cases, offer additional playing possibilities. Direct sensors are integrated onto the physical instrument, which may require modifying it. Indirect sensors, such as cameras and microphones, can be used to analyze performer gestures without any modification to the instrument. We present representative case studies of hyperinstruments from our own research, as well as examples of the types of musicological analysis this approach enables, such as performer identification, microtiming analysis, and transcription. Until recently, hyperinstruments were used mostly for electroacoustic music creation, but we believe they have considerable potential for systematic musicological applications involving music performance analysis.

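The abstract names microtiming analysis as one musicological application of sensor data. As a rough illustration only, not drawn from the chapter itself and using hypothetical onset data, the following Python sketch measures how far sensor-captured note onsets deviate from an ideal metric grid derived from a nominal tempo.

    from statistics import mean

    def microtiming_deviations(onsets_sec, tempo_bpm, subdivision=4):
        """Return per-onset deviations (ms) from the nearest metric grid position.

        onsets_sec  -- onset times in seconds (e.g., timestamps from a direct sensor)
        tempo_bpm   -- nominal tempo of the performance (assumed known here)
        subdivision -- grid slots per beat (4 = sixteenth notes)
        """
        grid_step = 60.0 / tempo_bpm / subdivision  # seconds per grid slot
        return [(t - round(t / grid_step) * grid_step) * 1000.0 for t in onsets_sec]

    if __name__ == "__main__":
        # Hypothetical onsets captured by a percussion sensor, roughly on the beat
        onsets = [0.002, 0.531, 0.998, 1.509, 2.021]
        devs = microtiming_deviations(onsets, tempo_bpm=120, subdivision=1)
        print("deviations (ms):", [round(d, 1) for d in devs])
        print("mean absolute deviation (ms):", round(mean(abs(d) for d in devs), 1))

Deviations are reported in milliseconds, a common unit in microtiming studies; an actual analysis would typically estimate the tempo and grid phase from the performance data rather than assume them as this sketch does.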