Music identification is an effective tool that enables multimedia players to extract a distinct statistical digest of the played content, look up the corresponding entry in a music database using this unique identifier, and then take advantage of the services available for that particular content. In this paper, we introduce beat-IDs, the first music identification system that builds its digest of a music clip on the basic structural element of every musical piece: its beat. A beat-ID is created in two steps: first, the system detects the average beat period of a given music clip using a modified EM algorithm; then, it analyzes the statistical properties of the clip with respect to the detected beats. The extracted 32-byte beat-ID has two components: the length of the average beat period and a compressed statistical digest of the signal's energy distribution over an average beat period. Finally, we introduce an algorithm for matching beat-IDs that quantifies the matching accuracy between two identifiers using an error analysis. The properties of beat-IDs are demonstrated on a relatively small database of audio clips.
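The abstract does not reproduce the modified EM detector or the exact 32-byte layout, so the following Python sketch only illustrates the overall pipeline under stated assumptions: the beat period is estimated from the autocorrelation peak of a short-time energy envelope (a simpler stand-in for the EM step), and the identifier packs a hypothetical 2-byte period alongside a 30-bin, one-byte-per-bin digest of the average beat's energy profile. All names and parameters (energy_envelope, frame_len, n_bins, period_tol, the byte layout) are illustrative, not the authors' implementation.

```python
import numpy as np

def energy_envelope(x, frame_len=512):
    """Short-time energy of a mono signal x, one value per frame."""
    n = len(x) // frame_len
    frames = np.asarray(x[:n * frame_len], dtype=float).reshape(n, frame_len)
    return (frames ** 2).sum(axis=1)

def estimate_beat_period(env, min_lag=10, max_lag=200):
    """Average beat period in frames, from the dominant autocorrelation
    peak (a simple stand-in for the paper's modified EM detector)."""
    max_lag = min(max_lag, len(env) - 1)
    env = env - env.mean()
    ac = np.correlate(env, env, mode="full")[len(env) - 1:]
    return int(min_lag + np.argmax(ac[min_lag:max_lag]))

def beat_id(x, frame_len=512, n_bins=30):
    """Hypothetical 32-byte identifier: a 2-byte beat period followed by
    n_bins one-byte bins of the average beat's energy profile."""
    env = energy_envelope(x, frame_len)
    period = estimate_beat_period(env)
    # Fold the envelope into beat-length segments and average them to get
    # the energy distribution over one "average" beat.
    n_beats = len(env) // period
    beats = env[:n_beats * period].reshape(n_beats, period)
    profile = beats.mean(axis=0)
    # Resample the profile to n_bins and quantize each bin to one byte.
    idx = np.linspace(0, period - 1, n_bins).astype(int)
    prof = profile[idx]
    q = np.round(255 * (prof - prof.min()) / (np.ptp(prof) + 1e-12)).astype(np.uint8)
    return period.to_bytes(2, "big") + q.tobytes()

def match_beat_ids(id_a, id_b, period_tol=2):
    """Crude matching score: None if the beat periods disagree by more
    than period_tol frames, otherwise the mean absolute difference of the
    quantized energy profiles (smaller means a closer match)."""
    pa = int.from_bytes(id_a[:2], "big")
    pb = int.from_bytes(id_b[:2], "big")
    if abs(pa - pb) > period_tol:
        return None  # different tempo, declare a non-match outright
    a = np.frombuffer(id_a[2:], dtype=np.uint8).astype(int)
    b = np.frombuffer(id_b[2:], dtype=np.uint8).astype(int)
    return float(np.abs(a - b).mean())

# Minimal round trip on synthetic data (noise standing in for decoded audio):
rng = np.random.default_rng(0)
clip = rng.standard_normal(44100 * 10)
fp = beat_id(clip)
print(len(fp), match_beat_ids(fp, fp))  # 32, 0.0
```

The 2 + 30 byte split is one plausible way to meet the 32-byte budget named in the abstract; the paper's own matching algorithm additionally quantifies accuracy through an error analysis, which this toy score does not attempt.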