Towards a Model for Instrumental Mapping in Expert Musical Interaction

This paper reviews models of the ways in which a performer's instrumental actions can be linked to sound synthesis parameters. We analyse the available literature on acoustic instrument simulation and on the mapping of input devices to sound synthesis in general human-computer interaction. We then demonstrate why a more complex mapping strategy is required to maximise human performance possibilities in expert manipulation situations, presenting clear measurements of user performance improving over time. Finally, we discuss a general model for instrumental mapping that separates the mapping layer into two independent parts. This model allows the expressive use of different input devices within the same architecture or, conversely, the use of different synthesis algorithms, by changing only one part of the mapping layer.
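To make the two-part mapping layer concrete, here is a minimal sketch of the idea in Python. All names (the joystick mapping, the abstract parameter set, the FM synthesiser layer) are illustrative assumptions, not the paper's implementation: the first layer translates device-specific signals into abstract perceptual parameters, the second translates those into synthesis-specific parameters, and either layer can be swapped independently of the other.

```python
# Hypothetical sketch of a two-layer mapping: device -> abstract params -> synth params.
# None of these names come from the paper; they only illustrate the architecture.
from typing import Dict

DeviceState = Dict[str, float]    # raw controller outputs, normalised 0..1
AbstractParams = Dict[str, float] # perceptual layer, e.g. pitch, loudness, brightness
SynthParams = Dict[str, float]    # parameters of one particular synthesis algorithm

def joystick_to_abstract(state: DeviceState) -> AbstractParams:
    """Layer 1 for a hypothetical 2-axis joystick with a pressure pad."""
    return {
        "pitch": 220.0 * 2 ** (2 * state["x"]),  # x axis spans two octaves
        "loudness": state["pressure"],
        "brightness": state["y"],
    }

def abstract_to_fm(params: AbstractParams) -> SynthParams:
    """Layer 2 for a simple FM synthesiser."""
    return {
        "carrier_freq": params["pitch"],
        "amplitude": params["loudness"],
        "mod_index": 1.0 + 9.0 * params["brightness"],  # brightness drives FM index
    }

def make_instrument(layer1, layer2):
    """Compose the two independent layers into one performable mapping."""
    return lambda device_state: layer2(layer1(device_state))

instrument = make_instrument(joystick_to_abstract, abstract_to_fm)
print(instrument({"x": 0.5, "y": 0.3, "pressure": 0.8}))
```

Replacing the joystick with another controller means rewriting only layer 1, and targeting a different synthesis algorithm means rewriting only layer 2; the abstract parameter set in the middle is what keeps the two sides independent.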
