Mapping Out Instruments, Affordances, and Mobiles

This paper reviews and extends questions about the scope of the interactive musical instrument and about mapping strategies for expressive performance. We apply notions of embodiment and affordance to characterize gestural instruments, and we note that the democratization of sensor technology in consumer devices has extended the cultural contexts for interaction. We revisit questions of mapping, drawing upon the theory of affordances to consider mapping and instrument together. This perspective is then applied to recent work by the author and his collaborators in developing instruments based on mobile devices designed for specific performance situations.
