What Makes Beat Tracking Difficult? A Case Study on Chopin Mazurkas

The automated extraction of tempo and beat information from music recordings is a challenging task. Especially in the case of expressive performances, current beat tracking approaches still have significant problems in accurately capturing local tempo deviations and beat positions. In this paper, we introduce a novel evaluation framework for detecting critical passages in a piece of music that are prone to tracking errors. Our idea is to look for consistencies in the beat tracking results over multiple performances of the same underlying piece. As a further contribution, we classify the critical passages by specifying musical properties of certain beats that frequently evoke tracking errors. Finally, considering three conceptually different beat tracking procedures, we conduct a case study on the basis of a challenging test set consisting of a variety of piano performances of Chopin Mazurkas. Our experimental results not only make the limitations of state-of-the-art beat trackers explicit but also deepen the understanding of the underlying music material.
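
To make the consistency idea more concrete, the following Python sketch (not the authors' actual procedure) illustrates one way to flag "critical" beats: given beat estimates and reference annotations for several performances of the same piece, a score beat is flagged if the tracker misses it in a large fraction of the performances. The ±70 ms tolerance, the 50% threshold, and the assumption that every performance carries the same number of annotated beats (aligned beat-for-beat to the score) are illustrative choices, not values taken from the paper.

```python
import numpy as np

def beat_errors(estimated_beats, annotated_beats, tolerance=0.07):
    """Return a 0/1 array over the annotated beats:
    1 if no estimated beat lies within +/- tolerance seconds, else 0."""
    estimated = np.asarray(estimated_beats, dtype=float)
    return np.array([
        0 if np.any(np.abs(estimated - t) <= tolerance) else 1
        for t in annotated_beats
    ])

def critical_beats(estimates_per_performance, annotations_per_performance,
                   min_error_rate=0.5):
    """Flag score beats that are mis-tracked in at least `min_error_rate`
    of the performances.

    Both arguments are lists with one entry per performance; the k-th
    annotated beat of every performance is assumed to correspond to the
    same beat of the underlying score (so all annotation lists have the
    same length).
    """
    error_matrix = np.stack([
        beat_errors(est, ann)
        for est, ann in zip(estimates_per_performance,
                            annotations_per_performance)
    ])  # shape: (num_performances, num_score_beats)
    error_rate = error_matrix.mean(axis=0)
    flagged = np.where(error_rate >= min_error_rate)[0]
    return flagged, error_rate
```

Repeating this analysis for several beat trackers and intersecting (or averaging) the resulting error rates would give a rough, tracker-independent indication of which passages of the piece are intrinsically hard to track.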
