Real-time Composition as Performance Ecosystem

This article proposes that real-time composition can be considered a new performance ecosystem. Rather than extensions of electroacoustic instruments used within improvisatory environments, real-time composition systems are produced by composers interested in gestural interactions between musical agents, with or without real-time control. They are a subclass of interactive systems, specifically a genre of interactive composition systems that share compositional control between composer and system. Designing the complexity of interactions between agents is a compositional act, and its outcomes are realised during performance. More so than most interactive systems, the new performance ecosystem is compositional in nature.
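As a purely illustrative sketch (not the system described in this article), the following Python fragment shows one way shared compositional control between musical agents might be framed: each agent adjusts its rhythmic density in response to what it "hears" from the others, so the resulting texture emerges from the designed interaction rule rather than from a fixed score. All names here (Agent, listen, play, density) are invented for the example.

```python
import random

class Agent:
    """A toy musical agent that adjusts its rhythmic density
    in response to the combined activity of the other agents."""

    def __init__(self, name, density=0.5):
        self.name = name        # label only
        self.density = density  # probability of sounding an event on a pulse

    def listen(self, others_density):
        # Composed interaction rule: back off when the ensemble is busy,
        # push forward when it is sparse.
        self.density += 0.1 * (0.5 - others_density)
        self.density = min(max(self.density, 0.05), 0.95)

    def play(self):
        # Decide whether to sound an event on this pulse.
        return random.random() < self.density


agents = [Agent("A"), Agent("B"), Agent("C")]

for pulse in range(8):
    events = [agent.play() for agent in agents]
    for i, agent in enumerate(agents):
        others = [a.density for j, a in enumerate(agents) if j != i]
        agent.listen(sum(others) / len(others))
    print(pulse, ["x" if e else "." for e in events])
```

In this reading, the compositional decision lives in the interaction rule inside listen(), not in any particular sequence of events; performance realises whatever texture that rule produces.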
