Strategies for Managing Timbre and Interaction in Automatic Improvisation Systems

ABSTRACT Earlier interactive improvisation systems have mostly worked with note-level musical events such as pitch, loudness, and duration. Timbre, however, is an integral component of the musical language of many improvisers, and some recent systems use timbral information in a variety of ways to enhance interactivity. This article describes the timbre-aware ARHS improvisation system, designed in collaboration with saxophonist John Butcher, in the context of recent improvisation systems that incorporate timbral information. Common practices in audio feature extraction, performance state characterization and management, response synthesis, and control of improvising agents are summarized and compared.
