Controlling Physically Based Virtual Musical Instruments Using The Gloves

In this paper we propose an empirical method for developing mapping strategies between a gesture-based interface (the Gloves) and physically based sound synthesis models. An experiment was conducted to investigate which gestures listeners associate with sounds synthesised using physical models corresponding to three categories of sound: sustained, iterative and impulsive. The results show that listeners perform similar gestures when controlling sounds within each category. We used these gestures to create the mapping strategy between the Gloves and the physically based synthesis engine.
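To make the category-dependent mapping idea concrete, the sketch below shows one way such a strategy might be structured in code. All feature and parameter names (`hand_height`, `grip_pressure`, `excitation_force`, etc.) are hypothetical illustrations, not the paper's actual empirically derived mapping; a real glove interface would expose richer posture and orientation data.

```python
# Illustrative gesture-to-synthesis mapping, one branch per sound
# category from the paper (sustained, iterative, impulsive).
# Parameter names are assumptions for the sake of the sketch.

SOUND_CATEGORIES = ("sustained", "iterative", "impulsive")

def map_gesture(category, hand_height, grip_pressure):
    """Map normalised glove features (0..1) to physical-model parameters.

    `hand_height` and `grip_pressure` stand in for whatever features
    the interface extracts from hand posture and orientation.
    """
    if category not in SOUND_CATEGORIES:
        raise ValueError(f"unknown category: {category}")
    if category == "sustained":
        # Continuous excitation: pressure and speed of a friction model.
        return {"excitation_force": grip_pressure,
                "excitation_velocity": hand_height}
    if category == "iterative":
        # Repeated excitations: the gesture controls event density.
        return {"event_rate_hz": 1.0 + 9.0 * hand_height,
                "event_gain": grip_pressure}
    # Impulsive: a single strike whose energy scales with the gesture.
    return {"strike_energy": grip_pressure * hand_height}
```

The point of the sketch is that each sound category calls for a different mapping topology (continuous control, rate control, one-shot triggering), which is why the paper grounds the choice of mapping in listeners' observed gestures rather than fixing one scheme for all sounds.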
