Systems for Interactive Control of Computer Generated Music Performance

This chapter is a literature survey of systems for real-time interactive control of automatic expressive music performance. A classification is proposed based on two initial design choices: the music material to be controlled (i.e., MIDI scores or audio recordings) and the type of control (i.e., direct control of low-level parameters such as tempo, intensity, and instrument balance, or a mapping from high-level parameters, such as emotions, onto those low-level parameters). The pros and cons of each choice are briefly discussed. A generic approach to interactive control is then presented, comprising four steps: control data collection and analysis, mapping from control data to performance parameters, modification of the music material, and audiovisual feedback synthesis. Several systems are then described, each focusing on different technical and expressive aspects. A formal evaluation is missing for many of the surveyed systems; possible methods for evaluating such systems are finally discussed.
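The two middle steps of the generic approach above, mapping high-level control parameters to low-level performance parameters and then modifying the music material, can be illustrated with a minimal sketch. All function names and the linear mapping weights below are illustrative assumptions, not taken from any of the surveyed systems; the emotion input is a point in a valence/arousal plane in the spirit of Russell's circumplex model.

```python
def map_emotion_to_performance(valence, arousal):
    """Step 2 (mapping): translate a high-level emotion, given as
    valence/arousal coordinates in [-1, 1], into low-level performance
    parameters. The linear weights are hypothetical placeholders."""
    tempo_scale = 1.0 + 0.3 * arousal              # higher arousal -> faster
    loudness = 0.5 + 0.2 * valence + 0.2 * arousal # happier/excited -> louder
    return {"tempo_scale": tempo_scale,
            "loudness": min(1.0, max(0.0, loudness))}

def modify_midi(notes, params):
    """Step 3 (modification), here for MIDI material: compress note onsets
    (in seconds) by the tempo scale and rescale velocities by loudness,
    clipped to the MIDI velocity range 0..127."""
    return [(onset / params["tempo_scale"],
             int(round(min(127, velocity * 2 * params["loudness"]))))
            for onset, velocity in notes]

# Example: an excited, positive control input speeds the notes up
# and plays them louder.
params = map_emotion_to_performance(valence=0.8, arousal=0.6)
performed = modify_midi([(0.0, 64), (0.5, 64)], params)
```

Direct low-level control would simply bypass `map_emotion_to_performance` and feed user-chosen tempo and intensity values straight into the modification step; audio material would replace `modify_midi` with, e.g., phase-vocoder time stretching.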
