GTZAN-Rhythm: Extending the GTZAN Test-Set with Beat, Downbeat and Swing Annotations
[1] Mark Levy. Improving Perceptual Tempo Estimation with Crowd-Sourced Annotations, 2011, ISMIR.
[2] Oriol Nieto, et al. JAMS: A JSON Annotated Music Specification for Reproducible MIR Research, 2014, ISMIR.
[3] Matthew E. P. Davies, et al. One in the Jungle: Downbeat Detection in Hardcore, Jungle, and Drum and Bass, 2012, ISMIR.
[4] Bob L. Sturm. The GTZAN dataset: Its contents, its faults, their effects on evaluation, and its future use, 2013, ArXiv.
[5] Geoffroy Peeters, et al. Swing Ratio Estimation, 2015.
[6] George Tzanetakis, et al. Musical genre classification of audio signals, 2002, IEEE Trans. Speech Audio Process.
[7] Matthew E. P. Davies, et al. Selective Sampling for Beat Tracking Evaluation, 2012, IEEE Transactions on Audio, Speech, and Language Processing.
[8] Norberto Degara, et al. Evaluation Methods for Musical Audio Beat Tracking Algorithms, 2009.
[9] Geoffroy Peeters, et al. Simultaneous Beat and Downbeat-Tracking Using a Probabilistic Framework: Theory and Large-Scale Evaluation, 2011, IEEE Transactions on Audio, Speech, and Language Processing.
[10] Karën Fort, et al. Towards a (Better) Definition of the Description of Annotated MIR Corpora, 2012, ISMIR.
[11] Florian Krebs, et al. Rhythmic Pattern Modeling for Beat and Downbeat Tracking in Musical Audio, 2013, ISMIR.
[12] Jaakko Astola, et al. Analysis of the meter of acoustic musical signals, 2006, IEEE Transactions on Audio, Speech, and Language Processing.
[13] Masataka Goto, et al. AIST Annotation for the RWC Music Database, 2006, ISMIR.