Visualizing Expressive Performance in Tempo-Loudness Space

Computer Music Journal, 27:4, pp. 69–83, Winter 2003. © 2003 Massachusetts Institute of Technology.

The previous decades of performance research have yielded a large number of very detailed studies analyzing various parameters of expressive music performance (see Palmer 1997 and Gabrielsson 1999 for an overview). A special focus was given to expressive piano performance, because the expressive parameters are relatively few (timing, dynamics, and articulation, including pedaling) and comparatively easy to obtain. The majority of performance studies concentrated exclusively on one of these parameters, and in most cases that parameter was expressive timing. In our everyday experience, however, we never listen to one of these parameters in isolation as it is analyzed in performance research. Certainly, the listener's attention can sometimes be guided more toward one particular parameter (e.g., the forced stable tempo in a Prokofieff Toccata or the staccato–legato alternation in a Mozart Allegro), but generally the aesthetic impression of a performance results from an integrated perception of all performance parameters and is influenced by other factors, such as body movements and the socio-cultural background of a performer or a performance. It can be presumed that the different performance parameters influence and depend on each other in various and intricate ways. (For example, Todd 1992 and Juslin, Friberg, and Bresin 2002 provide modeling-based approaches.) Novel research methods could help us analyze expressive music performances in a more holistic way in order to tackle these questions. Another problem of performance analysis is the enormously large amount of information the researcher must deal with, even when investigating, for example, only the timing of a few bars of a single piece. In general, it remains unclear whether the expressive deviations measured are due to deliberate expressive strategies, musical structure, motor noise, imprecision of the performer, or even measurement errors.

In the present article, we develop an integrated analysis technique in which tempo and loudness are processed and displayed at the same time. Both the tempo and loudness curves are smoothed with a window size corresponding ideally to the length of a bar. These two performance parameters are then displayed in a two-dimensional performance space on a computer screen: a dot moves in synchrony with the sound of the performance, and the trajectory of its tail describes geometric shapes that are intrinsically different for different performances. Such an animated display appears to be a useful visualization tool for performance research. The simultaneous display of tempo and loudness allows us to study interactions between these two parameters by themselves or with respect to properties of the musical score.

The behavior of the algorithm and the insights provided by this type of display are illustrated with performances of two musical excerpts by Chopin and Schubert. In the first case study, two expert performances and a professional recording by Maurizio Pollini are compared; in the second, an algorithmic performance generated by a basic performance model is contrasted with Alfred Brendel's performance of the same excerpt. These two excerpts were chosen because articulation is constant (legato) throughout each excerpt, so the analysis can concentrate on tempo and dynamics.
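To make the construction concrete, the following Python sketch shows one way such a tempo–loudness trajectory could be computed and plotted. It is an illustration only, not the authors' implementation: the moving-average smoothing, the dB loudness scale, and the names beat_times, loudness_db, and beats_per_bar are assumptions for this sketch; the article smooths both curves over roughly one bar and animates the resulting curve in synchrony with the sound rather than plotting it statically.

import numpy as np
import matplotlib.pyplot as plt

def smooth(x, window):
    # Simple moving average; stands in for whatever bar-length smoothing
    # is applied to the tempo and loudness curves.
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

def tempo_loudness_trajectory(beat_times, loudness, beats_per_bar=4):
    # Instantaneous tempo (BPM) from inter-beat intervals, aligned with the
    # per-beat loudness values; both series are smoothed over about one bar.
    ibi = np.diff(beat_times)                      # inter-beat intervals (s)
    tempo = 60.0 / ibi                             # tempo in beats per minute
    loud = np.asarray(loudness, dtype=float)[1:]   # align with tempo samples
    return smooth(tempo, beats_per_bar), smooth(loud, beats_per_bar)

# Hypothetical input: beat onset times (s) and per-beat loudness (dB) that
# would in practice be extracted from a recording or a reproducing-piano file.
beat_times = np.cumsum(np.random.uniform(0.45, 0.55, size=64))
loudness_db = 60.0 + 10.0 * np.sin(np.linspace(0.0, 4.0 * np.pi, 64))

tempo_s, loud_s = tempo_loudness_trajectory(beat_times, loudness_db)

# Static view of the trajectory; the animated display moves a dot along
# this curve while the performance sounds.
plt.plot(tempo_s, loud_s, "-o", markersize=3)
plt.xlabel("tempo (BPM, smoothed)")
plt.ylabel("loudness (dB, smoothed)")
plt.title("Tempo-loudness trajectory (illustrative sketch)")
plt.show()

With real performance data, different recordings of the same excerpt trace visibly different shapes in this space, which is the property the visualization exploits.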

[1] Alexander Truslit. Gestaltung und Bewegung in der Musik. 1938.

[2] Barbara Scheidker, et al. Sonata in A Major, K. 331 (piano). 1979.

[3] R. Jackendoff, et al. A Generative Theory of Tonal Music. 1985.

[4] Robert Rowe, et al. Action and Perception in Rhythm and Music. 1989.

[5] Hugo Fastl, et al. Psychoacoustics: Facts and Models. 1990.

[6] Neil P. McAngus Todd. The dynamics of dynamics: A model of musical expression. 1992.

[7] Bruno H. Repp. Music as Motion: A Synopsis of Alexander Truslit's (1938) Gestaltung und Bewegung in der Musik. 1993.

[8] W. L. Windsor, et al. Expressive Timing and Dynamics in Real and Artificial Musical Performances: Using an Algorithm as an Analytical Tool. 1997.

[9] C. Palmer. Music performance. Annual Review of Psychology, 1997.

[10] Reinhard Kopiez, et al. Controlling Creative Processes in Music. 1998.

[11] Hugo Fastl, et al. Psychoacoustics: Facts and Models, 2nd updated edition. 1999.

[12] Emery Schubert. Measuring emotion continuously: Validity and reliability of the two-dimensional emotion-space. 1999.

[13] Carlo Drioli, et al. Audio Morphing Different Expressive Intentions for Multimedia Systems. IEEE MultiMedia, 2000.

[14] Luke Windsor, et al. Make Me a Match: An Evaluation of Different Approaches to Score-Performance Matching. Computer Music Journal, 2000.

[15] Reinhard Kopiez, et al. Real-Time Analysis of Dynamic Shaping. 2000.

[16] G. Widmer, et al. Human Preferences of Tempo Smoothness. 2001.

[17] Gerhard Widmer, et al. Using AI and machine learning to study expressive music performance: project survey and first report. AI Communications, 2001.

[18] Laboratorio Nacional de Música Electroacústica. Proceedings of the 2001 International Computer Music Conference, ICMC 2001, Havana, Cuba, September 17-22, 2001.

[19] Simon Dixon, et al. Automatic Extraction of Tempo and Beat From Expressive Performances. 2001.

[20] P. Juslin, et al. Toward a computational model of expression in music performance: The GERM model. 2001.

[21] Roberto Bresin, et al. Are computer-controlled pianos a reliable tool in music performance research? Recording and reproduction precision of a Yamaha Disklavier grand piano. 2001.

[22] S. Dixon, et al. Analysis of tempo classes in performances of Mozart sonatas. 2001.

[23] Simon Dixon. An Interactive Beat Tracking and Visualisation System. ICMC, 2001.

[24] W. Goebl. Melody lead in piano performance: expressive device or artifact? The Journal of the Acoustical Society of America, 2001.

[25] Werner Goebl, et al. Representing expressive performance in tempo-loudness space. 2002.

[26] Elias Pampalk, et al. Content-based organization and visualization of music archives. MULTIMEDIA '02, 2002.

[27] Gerhard Widmer, et al. Real Time Tracking and Visualisation of Musical Expression. ICMAI, 2002.

[28] S. Dixon, et al. Pinpointing the Beat: Tapping to Expressive Performances. 2002.

[29] Gerhard Widmer, et al. A new approach to hierarchical clustering and structuring of data with Self-Organizing Maps. Intelligent Data Analysis, 2004.