Guided improvisation as dynamic calls to an offline model

This paper describes a reactive architecture handling the hybrid temporality of guided human-computer music improvisation. It aims at combining reactivity and anticipation in music generation processes steered by a "scenario". The machine improvisation takes advantage of the temporal structure of this scenario to generate short-term anticipations ahead of the performance time, and reacts to external controls by refining or rewriting these anticipations over time. To achieve this within an interactive software framework, guided improvisation is modeled as the embedding of a compositional process into a reactive architecture. This architecture is instantiated in the improvisation system ImproteK and implemented in OpenMusic.
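The interplay described above can be sketched in miniature: a process that computes anticipations ahead of the playhead over a fixed scenario, commits events as they are performed, and rewrites only the not-yet-played suffix when a reactive control arrives. This is a hypothetical illustration under simplifying assumptions (a scenario as a list of labels, a memory mapping labels to candidate events), not ImproteK's actual API.

```python
# Minimal sketch of scenario-guided improvisation with reactive rewriting.
# All names (GuidedImproviser, anticipate, react) are illustrative, not ImproteK's.

class GuidedImproviser:
    def __init__(self, scenario, memory):
        self.scenario = scenario   # required label per time step, e.g. chord symbols
        self.memory = memory       # label -> list of candidate musical events
        self.buffer = []           # anticipations computed ahead of performance time
        self.played = 0            # index of the next event to be performed

    def anticipate(self, horizon):
        """Extend the buffer up to `horizon` steps ahead of the playhead."""
        while len(self.buffer) < min(self.played + horizon, len(self.scenario)):
            label = self.scenario[len(self.buffer)]
            self.buffer.append(self.memory[label][0])

    def react(self, control):
        """A reactive control invalidates the unplayed suffix, which is rewritten;
        events already performed are kept fixed."""
        self.buffer = self.buffer[:self.played]
        for step in range(self.played, len(self.scenario)):
            candidates = self.memory[self.scenario[step]]
            self.buffer.append(candidates[control % len(candidates)])

    def next_event(self):
        event = self.buffer[self.played]
        self.played += 1
        return event

memory = {"C": ["c1", "c2"], "F": ["f1", "f2"], "G": ["g1", "g2"]}
imp = GuidedImproviser(["C", "F", "G", "C"], memory)
imp.anticipate(horizon=4)             # plan the whole scenario ahead of time
first = imp.next_event()              # "c1" is performed and becomes fixed
imp.react(control=1)                  # external control rewrites steps 2-4 only
rest = [imp.next_event() for _ in range(3)]   # ["f2", "g2", "c2"]
```

The key property of the sketch mirrors the paper's hybrid temporality: anticipation buys long-term conformity to the scenario, while reaction is confined to the mutable future, never to what has already sounded.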
