A Musical Improvisation System That Provides Visual Feedback to the Performer

This report describes the design and realization of Mimi, a multi-modal interactive musical improvisation system that explores the potential and impact of visual feedback in performer-machine interaction. Mimi is a performer-centric tool designed for use in performance and teaching. Its key novel component is its visual interface, which provides the performer with instantaneous and continuous information on the state of the system. Because context and planning are paramount in human improvisation, the relevant state of the system extends to the near future and the recent past. The Mimi system, designed and implemented using the SAI framework, successfully integrates symbolic computation and real-time synchronization in a multi-modal interactive setting. Mimi’s visual interface allows for a distinctive blend of the raw reflex typically associated with improvisation and the preparation and timing more closely affiliated with score-based reading. Mimi is not only an effective improvisation partner; it has also proven to be an invaluable platform for interrogating the mental models necessary for successful improvisation.
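
The abstract does not name the underlying improvisation algorithm. Mimi belongs to the factor-oracle family of machine improvisers, so as an illustration only, the sketch below shows the standard online factor-oracle construction (after Allauzen, Crochemore, and Raffinot) together with a toy random walk over the learned structure. The function names, the `p_continue` parameter, and the character-based note encoding are assumptions for this example, not code from the paper.

```python
import random


def build_factor_oracle(sequence):
    """Online factor-oracle construction (Allauzen/Crochemore/Raffinot).

    State i is reached after reading sequence[:i]. Returns forward
    transitions and suffix links; an improviser walks the forward
    transitions to recall learned material and follows suffix links
    to recombine it into variations.
    """
    n = len(sequence)
    trans = [{} for _ in range(n + 1)]  # trans[i][symbol] -> next state
    sfx = [-1] * (n + 1)                # suffix (backward) links; sfx[0] = -1
    for i in range(1, n + 1):
        sym = sequence[i - 1]
        trans[i - 1][sym] = i           # new forward transition
        k = sfx[i - 1]
        # Propagate the new symbol down the suffix-link chain.
        while k > -1 and sym not in trans[k]:
            trans[k][sym] = i
            k = sfx[k]
        sfx[i] = 0 if k == -1 else trans[k][sym]
    return trans, sfx


def improvise(sequence, length, p_continue=0.7, seed=0):
    """Generate a variation: mostly replay the learned line, but
    occasionally jump back along a suffix link to splice in a
    continuation that shares context with the current position."""
    _, sfx = build_factor_oracle(sequence)
    rng = random.Random(seed)
    state, out = 0, []
    for _ in range(length):
        if state < len(sequence) and rng.random() < p_continue:
            out.append(sequence[state])   # forward step: recall material
            state += 1
        else:
            state = max(sfx[state], 0)    # suffix-link jump: recombination
            out.append(sequence[state])
            state += 1
    return out


print(improvise(list("CDECDEFG"), 12))
```

Because the walk is a sequence of committed forward steps, the generator always knows what it is about to play. This is the property the visual interface exploits: the recent past is the path already traversed and the near future is the continuation already scheduled, which is what lets the performer prepare rather than merely react.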
