Beat Tracking with Musical Knowledge

When a person taps a foot in time with a piece of music, they are performing beat tracking. Beat tracking is fundamental to the understanding of musical structure, and therefore an essential ability for any system that purports to exhibit musical intelligence or understanding. We present an off-line multiple-agent beat tracking system which estimates the locations of musical beats in MIDI performance data. This approach requires no prior information about the input, such as tempo or time signature; all required information is derived from the performance data itself. Previous beat tracking systems have proved successful on constant-tempo performances, but they fail when there are large variations in tempo. We examine the role of musical knowledge in guiding the beat tracking process, and show that a system equipped with knowledge of musical salience can track the beat even in the presence of large tempo variations. Results are presented for a large corpus of expressively performed classical piano music (13 complete sonatas), covering a full range of tempos and considerable tempo variability within sections. With the musical knowledge disabled, about 75% of beats are tracked correctly; enabling the musical knowledge raises this figure to over 90%.
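To make the multiple-agent approach concrete, the following is a minimal sketch of how competing tempo hypotheses scored by musical salience might work over symbolic onset data. It is an illustration of the general technique described in the abstract, not the authors' implementation; the Agent class, the salience weighting, and all parameter values are assumptions chosen for readability.

```python
# Minimal multiple-agent beat-tracking sketch (illustrative, not the paper's code).
# Each agent holds a tempo/phase hypothesis and earns score for every onset it
# matches, weighted by that onset's musical salience.

from dataclasses import dataclass, field

TOLERANCE = 0.07             # assumed: max deviation (s) to match a predicted beat
MIN_IOI, MAX_IOI = 0.3, 1.0  # assumed range of plausible beat periods (s)


@dataclass
class Agent:
    period: float                       # current beat-period hypothesis (s)
    last_beat: float                    # time of most recent beat (s)
    score: float = 0.0                  # accumulated salience of matched onsets
    beats: list = field(default_factory=list)


def track_beats(onsets, saliences):
    """onsets: sorted onset times (s); saliences: e.g. note loudness or duration."""
    # Seed one agent per plausible inter-onset interval near the start.
    agents = []
    for i in range(min(len(onsets), 5)):
        for j in range(i + 1, min(len(onsets), 6)):
            ioi = onsets[j] - onsets[i]
            if MIN_IOI <= ioi <= MAX_IOI:
                agents.append(Agent(period=ioi, last_beat=onsets[j],
                                    beats=[onsets[i], onsets[j]]))
    for t, sal in zip(onsets, saliences):
        for a in agents:
            if t <= a.last_beat:
                continue
            predicted = a.last_beat + a.period
            # Interpolate beats across gaps where no onset matched the prediction.
            while predicted < t - TOLERANCE:
                a.beats.append(predicted)
                a.last_beat = predicted
                predicted += a.period
            if abs(t - predicted) <= TOLERANCE:
                a.period += 0.2 * (t - predicted)   # smooth tempo adaptation
                a.last_beat = t
                a.beats.append(t)
                a.score += sal                      # salience guides agent choice
    if not agents:
        return []
    return max(agents, key=lambda a: a.score).beats
```

In the paper's setting, salience would be derived from musical features such as note duration, loudness, or chordal context. The point of the sketch is that agents whose predictions align with musically salient events accumulate higher scores, so the winning hypothesis can follow the beat even through large tempo changes.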
