A Framework for the Evaluation of Digital Musical Instruments

At the outset of a discussion of evaluating digital musical instruments (DMIs)—that is to say, instruments whose sound generators are digital and separable (though not necessarily separate) from their control interfaces (Malloch et al. 2006)—it is reasonable to ask what the term “evaluation” in this context really means. After all, there may be many perspectives from which to view the effectiveness of the instruments we build. For most performers, performance on an instrument becomes a means of evaluating how well it functions in the context of live music making, and their measure of success is the response of the audience to their performance. Audiences, in turn, evaluate performances on the basis of how engaged they feel by what they have seen and heard. When questioned, they are likely to describe good performances as “exciting,” “skillful,” or “musical.” Bad performances are “boring,” and those marred by technical malfunction are often dismissed out of hand.

If performance is considered a valid means of evaluating a musical instrument, then it follows that the field of DMI design requires a much broader definition of “evaluation” than that typically used in human–computer interaction (HCI), one that reflects the fact that a number of stakeholders are involved in the design and evaluation of DMIs. In addition to players and audiences, there are also composers, instrument builders, component manufacturers, and perhaps even customers, and each of these stakeholders may have a different concept of what is meant by “evaluation.” Composers, for example, may evaluate an instrument in terms of how reliable it is: if a composer writes a piece of instrumental music to be performed on a DMI, they ought to be able to assume that (1) the instrumentalist is skilled on the instrument, and (2) the instrument has a known space of sound attributes that the composer can draw upon for musical effect.

[1] M. D. T. de Jong et al. Exploring two methods of usability testing: concurrent versus retrospective think-aloud protocols. 2003.

[2] Jens Rasmussen et al. Information Processing and Human-Machine Interaction: An Approach to Cognitive Engineering. 1986.

[3] Matthew Wright et al. Problems and prospects for intimate musical control of computers. 2001.

[4] Mark D. Plumbley et al. Discourse Analysis Evaluation Method for Expressive Musical Interfaces. NIME, 2008.

[5] Alan Cooper et al. About Face: The Essentials of User Interface Design. 1995.

[6] Nicola Orio et al. Evaluation of Input Devices for Musical Expression: Borrowing Tools from HCI. Computer Music Journal, 2001.

[8] Marcelo M. Wanderley et al. Trends in Gestural Control of Music. 2000.

[9] Camille Goudeseune et al. A Manifold Interface for a High Dimensional Control Space. International Computer Music Conference, 1995.

[10] Stefania Serafin et al. Physical Synthesis of Bowed String Instruments. 2007.

[11] Jarmo Laaksolahti et al. Evaluating experience-focused HCI. CHI Extended Abstracts, 2007.

[12] W. Andrew Schloss et al. Using Contemporary Technology in Live Performance: The Dilemma of the Performer. 2003.

[13] Robert A. Moog et al. Theremin: Ether Music and Espionage. 2000.

[14] A. Dawn Shaikh et al. Thinking but not seeing: think-aloud for non-sighted users. CHI Extended Abstracts, 2007.

[15] I. Scott MacKenzie et al. Motor Behavior Models for Human-Computer Interaction. 2003.

[16] Geraldine Fitzpatrick et al. HCI Methodology for Evaluating Musical Controllers: A Case Study. NIME, 2008.

[17] Bob L. Sturm et al. Proceedings of the International Computer Music Conference. 2011.

[18] Perry R. Cook et al. Re-Designing Principles for Computer Music Controllers: A Case Study of SqueezeVox Maggie. NIME, 2009.

[19] Sergi Jordà et al. Digital Instruments and Players, Part II: Diversity, Freedom and Control. ICMC, 2004.

[20] Garth Paine et al. The Thummer Mapping Project (ThuMP). NIME, 2007.

[21] Marcelo M. Wanderley et al. Towards a New Conceptual Framework for Digital Musical Instruments. 2006.

[22] R. Benjamin Knapp et al. Sensory Chairs: A System for Biosignal Research and Performance. NIME, 2008.

[23] Kenneth P. Fishkin et al. A taxonomy for and analysis of tangible interfaces. Personal and Ubiquitous Computing, 2004.

[24] Perry R. Cook et al. Creating a Network of Integral Music Controllers. NIME, 2006.

[25] Franca Garzotto et al. Usability, playability, and long-term engagement in computer games. CHI Extended Abstracts, 2009.

[26] Donald A. Norman. The Design of Everyday Things. 1988.

[27] Perry R. Cook et al. Principles for Designing Computer Music Controllers. NIME, 2001.

[28] Sarah Nicolls et al. Seeking Out the Spaces Between: Using Improvisation in Collaborative Composition with Interactive Technology. Leonardo Music Journal, 2010.

[29] Christopher Dobrian et al. The 'E' in NIME: Musical Expression with New Computer Interfaces. NIME, 2006.

[30] Stefania Serafin et al. A Quantitative Evaluation of the Differences between Knobs and Sliders. NIME, 2009.

[31] John Millar Carroll. HCI Models, Theories, and Frameworks: Toward a Multidisciplinary Science. 2003.

[32] Peta Wyeth et al. GameFlow: a model for evaluating player enjoyment in games. CIE, 2005.

[33] Cornelius Pöpel et al. On interface expressivity: A player-based study. NIME, 2005.

[34] Max V. Mathews. The radio baton and conductor program, or, pitch, the most important and least expressive part of music. 1991.

[35] G. Johansson. Visual perception of biological motion and a model for its analysis. 1973.

[36] Paul Stapleton et al. Where Did It All Go Wrong? A Model of Error from the Spectator's Perspective. NIME, 2009.

[37] Ross Kirk et al. Mapping Strategies for Musical Performance. 2000.

[38] Jacob Buur et al. Getting a grip on tangible interaction: a framework on physical space and social interaction. CHI, 2006.

[40] Tina Blaine et al. Contexts of Collaborative Musical Experiences. NIME, 2003.

[41] Ronen Barzel et al. Audio Anecdotes II: Tools, Tips, and Techniques for Digital Audio. 2004.

[42] Garth Paine et al. Towards a Taxonomy of Realtime Interfaces for Electronic Music Performance. NIME, 2010.

[43] Joseph F. Dumas et al. A Practical Guide to Usability Testing. 1993.

[44] Austin Henderson et al. Making sense of sensing systems: five questions for designers and researchers. CHI, 2002.

[45] A. Friberg et al. Visual Perception of Expressiveness in Musicians' Body Movements. 2007.

[46] Sarah Louise Nicolls. Interacting with the piano. 2010.