Visual feedback in performer-machine interaction for musical improvisation

This paper describes the design of Mimi, a multi-modal interactive musical improvisation system that explores the powerful impact visual feedback can have on performer-machine interaction. Mimi is a performer-centric tool designed for use in performance and teaching. Its key and novel component is its visual interface, which provides the performer with instantaneous and continuous information on the state of the system. In human improvisation, where context and planning are paramount, the relevant state of the system extends into the near future and the recent past. Mimi's visual interface thus allows for a distinctive blend of the raw reflex typically associated with improvisation and the preparation and timing more characteristic of score-based reading. Not only is Mimi an effective improvisation partner; it has also proven to be an invaluable platform for interrogating the mental models necessary for successful improvisation.
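The abstract does not give implementation details, but the past/future display it describes can be pictured concretely. The following Python sketch is a hypothetical illustration, not Mimi's actual code: all names (NoteEvent, TimelineView), window sizes, and the text rendering are assumptions. It keeps a buffer of pre-generated note events and shows the recent past and near future around the current time, which is the information the abstract says the performer needs for planning.

    from collections import deque
    from dataclasses import dataclass

    # Hypothetical sketch of the past/future timeline idea from the
    # abstract; Mimi's real interface is graphical and its internals
    # are not specified in this text.

    @dataclass
    class NoteEvent:
        onset: float  # seconds from performance start
        pitch: int    # MIDI note number

    class TimelineView:
        def __init__(self, past_window: float = 4.0, future_window: float = 4.0):
            self.past_window = past_window      # seconds of recent past shown
            self.future_window = future_window  # seconds of near future shown
            self.events = deque()               # onset-ordered scheduled events

        def schedule(self, event: NoteEvent) -> None:
            # Pre-generated material is known ahead of time, so the
            # display can show what the machine is about to play.
            self.events.append(event)

        def render(self, now: float) -> str:
            # Discard events that have scrolled past the visible history.
            while self.events and self.events[0].onset < now - self.past_window:
                self.events.popleft()
            past = [e.pitch for e in self.events if e.onset <= now]
            future = [e.pitch for e in self.events
                      if now < e.onset <= now + self.future_window]
            return f"past {past} | NOW {now:.1f}s | next {future}"

    # Usage: schedule a short machine phrase, then view it mid-performance.
    view = TimelineView()
    for onset, pitch in [(0.5, 60), (1.5, 64), (2.5, 67), (3.5, 72), (5.0, 76)]:
        view.schedule(NoteEvent(onset, pitch))
    print(view.render(now=3.0))  # past [60, 64, 67] | NOW 3.0s | next [72, 76]

Because upcoming material is visible before it sounds, the performer can prepare an entrance the way a score reader would, while still reacting in the moment to what has just been played.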
