Synthesizing Expressive Music Through the Language of Conducting

This article presents several novel methods for interpreting conducting gestures and synthesizing music to accompany them. The central technology used in this project is the Conductor's Jacket, a sensor interface that gathers its wearer's gestures and physiology. A bank of software filters extracts numerous features from the sensor signals; these features then generate real-time expressive effects by shaping the note onsets, tempos, articulations, dynamics, and note lengths in a musical score. The result is a flexible, expressive, real-time musical response. This article focuses on the Conductor's Jacket software system, describing in detail its architecture, algorithms, implementation issues, and resulting musical compositions.
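To make the pipeline concrete, here is a minimal sketch of the kind of processing the abstract describes: one channel of a filter bank that smooths a raw sensor signal into an activity envelope, which is then mapped onto musical parameters. This is an illustration only, not the published Conductor's Jacket implementation; the function names, filter coefficients, and parameter ranges are assumptions chosen for clarity.

```python
import numpy as np

# Hypothetical sketch of one filter-bank stage: raw sensor channel
# (e.g., an EMG trace) -> smoothed envelope -> expressive controls.

def envelope(signal, alpha=0.1):
    """Rectify and low-pass a raw sensor channel to obtain a smooth
    activity envelope (one-pole IIR smoother on the absolute value)."""
    env = np.zeros_like(signal, dtype=float)
    acc = 0.0
    for i, x in enumerate(signal):
        acc = alpha * abs(x) + (1.0 - alpha) * acc
        env[i] = acc
    return env

def map_to_expression(env_value, base_tempo=100.0):
    """Map a normalized envelope value (0..1) onto illustrative musical
    parameters: louder, faster, and more detached as effort rises.
    Ranges here are arbitrary, chosen only to show the mapping idea."""
    velocity = int(40 + 87 * env_value)           # MIDI-style dynamics, 40..127
    tempo = base_tempo * (0.9 + 0.3 * env_value)  # roughly +/-15% tempo swing
    articulation = 1.0 - 0.5 * env_value          # note length as fraction of beat
    return velocity, tempo, articulation

# Usage: a burst of simulated muscle activity raises dynamics and tempo.
raw = np.concatenate([np.random.randn(100) * 0.1, np.random.randn(100) * 0.9])
env = envelope(raw)
norm = env / env.max()
print(map_to_expression(norm[-1]))
```

In the system described by the article, many such features (not just an energy envelope) are extracted in parallel and routed to note onsets, tempos, articulations, dynamics, and note lengths in the score; the sketch above shows only the general shape of one such feature-to-parameter mapping.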
