Rhythmic Pattern Modeling for Beat and Downbeat Tracking in Musical Audio

Rhythmic patterns are an important structural element in music. This paper investigates the use of rhythmic pattern modeling to infer metrical structure in musical audio recordings. We present a Hidden Markov Model (HMM) based system that simultaneously extracts beats, downbeats, tempo, meter, and rhythmic patterns. Our model builds on the structure proposed by Whiteley et al. [16], which we modify by introducing a new observation model: rhythmic patterns are learned directly from data, making the model adaptable to the rhythmic structure of any kind of music. For learning rhythmic patterns and evaluating beat and downbeat tracking, 697 ballroom dance pieces were annotated with beat and measure information. The results show that explicitly modeling the rhythmic patterns of dance styles drastically reduces octave errors (detection of half or double tempo) and substantially improves downbeat tracking.
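
To make the joint state space more concrete, the sketch below shows a minimal bar-pointer-style HMM decoded with Viterbi, assuming a single 4/4 pattern: the hidden state couples a discrete position within the bar with a tempo, and the observation model scores an onset-strength feature against a rhythmic pattern (a per-position mean and variance) that is assumed to have been learned from annotated data beforehand. The grid size M, the tempo set TEMPI, and the function viterbi_bar_pointer are illustrative choices, not the authors' implementation.

import numpy as np

# Illustrative bar-pointer-style HMM (a sketch, not the paper's code): the
# hidden state couples a discrete position inside one 4/4 bar with a tempo,
# and the observation model scores an onset-strength feature against a
# rhythmic pattern assumed to be learned from annotated data.

M = 64                        # discrete positions per bar (assumed resolution)
TEMPI = np.array([1, 2, 3])   # allowed advance in positions per frame (assumed)
P_TEMPO_CHANGE = 0.02         # probability of switching to a neighbouring tempo

def viterbi_bar_pointer(onsets, pattern_mean, pattern_var):
    """Return the most likely bar position per frame and the downbeat frames."""
    n_tempi = len(TEMPI)
    n_states = M * n_tempi
    T = len(onsets)

    delta = np.full(n_states, -np.inf)
    delta[:n_tempi] = 0.0                 # start at the downbeat, any tempo
    psi = np.zeros((T, n_states), dtype=int)

    for t in range(T):
        # Gaussian log-likelihood of the current observation at every bar position.
        ll = -0.5 * ((onsets[t] - pattern_mean) ** 2 / pattern_var
                     + np.log(2.0 * np.pi * pattern_var))
        new_delta = np.full(n_states, -np.inf)
        for ti, tempo in enumerate(TEMPI):
            for pos in range(M):
                s = pos * n_tempi + ti
                if delta[s] == -np.inf:
                    continue
                next_pos = (pos + tempo) % M   # the bar pointer moves forward
                for tj in range(max(0, ti - 1), min(n_tempi, ti + 2)):
                    p = 1.0 - 2.0 * P_TEMPO_CHANGE if tj == ti else P_TEMPO_CHANGE
                    s2 = next_pos * n_tempi + tj
                    score = delta[s] + np.log(p) + ll[next_pos]
                    if score > new_delta[s2]:
                        new_delta[s2] = score
                        psi[t, s2] = s
        delta = new_delta

    # Backtrack the best path and read off bar positions.
    path = [int(np.argmax(delta))]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    positions = np.array(path[::-1]) // n_tempi
    downbeats = np.where(np.diff(positions) < 0)[0] + 1   # wrap-around = new bar
    return positions, downbeats

With a learned pattern in hand, the decoder could be called as positions, downbeats = viterbi_bar_pointer(onset_feature, mean, var); the model described in the paper additionally infers meter and selects among several dance-style patterns, which this sketch omits.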

[1] Jaakko Astola et al., Analysis of the meter of acoustic musical signals, 2006, IEEE Transactions on Audio, Speech, and Language Processing.

[2] A. Tamhane et al., Multiple Comparison Procedures, 2009.

[3] A. Tamhane et al., Multiple Comparison Procedures, 1989.

[4] Björn W. Schuller et al., Wearable Assistance for the Ballroom-Dance Hobbyist - Holistic Rhythm Analysis and Dance-Style Classification, 2007, IEEE International Conference on Multimedia and Expo.

[5] Gerhard Widmer et al., Towards Characterisation of Music via Rhythmic Patterns, 2004, ISMIR.

[6] Masataka Goto et al., An Audio-based Real-time Beat Tracking System for Music With or Without Drum-sounds, 2001.

[7] D. Ellis, Beat Tracking by Dynamic Programming, 2007.

[8] P. Manuel et al., The Anticipated Bass in Cuban Popular Music, 1985.

[9] Florian Krebs et al., Evaluating the Online Capabilities of Onset Detection Methods, 2012, ISMIR.

[10] Matthew E. P. Davies et al., Selective Sampling for Beat Tracking Evaluation, 2012, IEEE Transactions on Audio, Speech, and Language Processing.

[11] Norberto Degara et al., Evaluation Methods for Musical Audio Beat Tracking Algorithms, 2009.

[12] Geoffroy Peeters et al., Simultaneous Beat and Downbeat-Tracking Using a Probabilistic Framework: Theory and Large-Scale Evaluation, 2011, IEEE Transactions on Audio, Speech, and Language Processing.

[13] Matthew E. P. Davies et al., Reliability-Informed Beat Tracking of Musical Signals, 2012, IEEE Transactions on Audio, Speech, and Language Processing.

[14] Markus Schedl et al., Enhanced Beat Tracking with Context-Aware Neural Networks, 2011.

[15] Lawrence R. Rabiner et al., A tutorial on hidden Markov models and selected applications in speech recognition, 1989, Proc. IEEE.

[16] Simon J. Godsill et al., Bayesian Modelling of Temporal Structure in Musical Audio, 2006, ISMIR.

[17] Malcolm D. Macleod et al., Particle Filtering Applied to Musical Tempo Tracking, 2004, EURASIP J. Adv. Signal Process.

[18] Stuart J. Russell et al., Dynamic Bayesian networks: representation, inference and learning, 2002.

[19] Matthew E. P. Davies et al., One in the Jungle: Downbeat Detection in Hardcore, Jungle, and Drum and Bass, 2012, ISMIR.

[20] Peter Knees et al., On Rhythm and General Music Similarity, 2009, ISMIR.

[21] Matthew E. P. Davies et al., Context-Dependent Beat Tracking of Musical Audio, 2007, IEEE Transactions on Audio, Speech, and Language Processing.