Performer-centered visual feedback for human-machine improvisation

This article describes the design and implementation of the Multimodal Interactive Musical Improvisation (Mimi) system. Unique to Mimi is its visual interface, which provides the performer with instantaneous and continuous information on the state of the system; other human-machine improvisation systems, in contrast, require performers to intuit possible extemporizations in response to machine-generated music without forewarning. In Mimi, the information displayed extends into the near future and reaches back into the recent past, giving the performer awareness of the musical context and allowing them to plan their responses accordingly. This article presents the details of Mimi's system design, its visual interface, and its implementation using the formalism defined by François' Software Architecture for Immersipresence (SAI) framework. Mimi is the result of a collaborative, iterative design process. We recorded the design sessions and present here findings from the transcripts that provide evidence for the impact of visual support on improvisation planning and design. The findings demonstrate that Mimi's visual interface offers musicians the opportunity to anticipate and to review decisions, making it an ideal performance and pedagogical tool for improvisation: it allows novices to create more contextually relevant improvisations and experts to be more inventive in their extemporizations.
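The abstract's central idea, a display whose information "extends into the near future and reaches back into the recent past," can be pictured as a sliding window over a stream of note events. The sketch below is purely illustrative and assumes a minimal event model; the names (`FeedbackWindow`, `schedule`, `advance`, `view`) are hypothetical and do not reflect Mimi's actual implementation or API:

```python
from collections import deque

class FeedbackWindow:
    """Illustrative sliding window over a stream of note events.

    Keeps the most recent events that have already sounded (the "recent
    past") and a queue of machine-generated events yet to sound (the
    "near future"). Hypothetical sketch, not Mimi's actual design.
    """

    def __init__(self, past_size=8, future_size=8):
        self.past = deque(maxlen=past_size)  # recently sounded events
        self.future = deque()                # upcoming events, oldest first
        self.future_size = future_size

    def schedule(self, event):
        """Queue a machine-generated event for upcoming display/playback."""
        self.future.append(event)

    def advance(self):
        """Move the next future event into the past once it has sounded."""
        if self.future:
            self.past.append(self.future.popleft())

    def view(self):
        """Return what a performer-facing display would show right now."""
        return {
            "past": list(self.past),
            "future": list(self.future)[: self.future_size],
        }
```

With such a window, the performer always sees both what the machine just played and what it is about to play, which is the forewarning the abstract contrasts with earlier systems.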

[1] Shlomo Dubnov, et al. Using Factor Oracles for Machine Improvisation, 2004, Soft Comput.

[2] Elaine Chew, et al. Visual feedback in performer-machine interaction for musical improvisation, 2007, NIME '07.

[3] Gérard Assayag, et al. Navigating the Oracle: a Heuristic Approach, 2007, ICMC.

[4] Shlomo Dubnov, et al. Improvisation Planning and Jam Session Design using concepts of Sequence Variation and Flow Experience, 2005.

[5] Shlomo Dubnov, et al. OMax brothers: a dynamic topology of agents for improvization learning, 2006, AMCMM '06.

[6] William Walker, et al. Applying ImprovisationBuilder to Interactive Composition with MIDI Piano, 1996, ICMC.

[7] Camilo Rueda, et al. Computer-Assisted Composition at IRCAM: From PatchWork to OpenMusic, 1999, Computer Music Journal.

[8] Gérard Assayag, et al. New computational paradigms for computer music, 2009.

[9] Maxime Crochemore, et al. Factor Oracle: A New Structure for Pattern Matching, 1999, SOFSEM.

[10] Elaine Chew, et al. An Architectural Framework for Interactive Music Systems, 2006, NIME.

[11] Belinda Thom, et al. Interactive Improvisational Music Companionship: A User-Modeling Approach, 2003, User Modeling and User-Adapted Interaction.

[12] Alexandre R. J. François, et al. An Architectural Framework for the Design, Analysis and Implementation of Interactive Systems, 2011, Comput. J.

[13] George E. Lewis, Too Many Notes: Computers, Complexity and Culture in Voyager, 2000, Leonardo Music Journal.

[14] Colin Potts, et al. Design of Everyday Things, 1988.

[15] Belinda Thom, et al. BoB: an interactive improvisational music companion, 2000, AGENTS '00.

[16] William F. Walker, et al. A computer participant in musical improvisation, 1997, CHI.

[17] Carlos Agon, et al. OpenMusic 5: A Cross-Platform Release of the Computer-Assisted Composition Environment, 2005.

[18] William Walker, et al. ImprovisationBuilder: Improvisation as Conversation, 1992, ICMC.

[19] Carla Scaletti, et al. ImprovisationBuilder: improvisation as conversation, 1992.

[20] François Pachet, et al. The Continuator: Musical Interaction With Style, 2003, ICMC.

[21] Roger B. Dannenberg, A Language for Interactive Audio Applications, 2002, ICMC.

[22] Miller S. Puckette, A divide between 'compositional' and 'performative' aspects of Pd, 2004.

[23] Alexandre R. J. François, et al. A hybrid architectural style for distributed parallel processing of generic data streams, 2004, Proceedings of the 26th International Conference on Software Engineering.

[24] Gil Weinberg, et al. Toward Robotic Musicianship, 2006, Computer Music Journal.