Metadata research for music digital libraries has traditionally focused on genre. Despite its potential to improve users' ability to search and browse music collections, music subject metadata remains an unexplored area. The objective of this study is to expand the scope of music metadata research by exploring music subject classification based on user interpretations of music. Furthermore, we compare this previously unexplored form of user data with lyrics on subject prediction tasks. In our experiment, we use datasets consisting of 900 songs annotated with user interpretations. To determine the significance of performance differences between the two sources, we applied Friedman's ANOVA test to the classification accuracies. The results show that user-generated interpretations are significantly more useful than lyrics as classification features (p < 0.05). The findings support the possibility of exploiting various existing sources for subject metadata enrichment in music digital libraries.
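As a minimal sketch of the significance-testing step described above, the snippet below applies a Friedman test to per-fold classification accuracies using scipy. The fold count, the three feature setups (lyrics, interpretations, combined), and all accuracy values are hypothetical placeholders, not the study's actual results; the Friedman test requires at least three related samples, so a third configuration is assumed here for illustration only.

```python
# Sketch (not the authors' code): Friedman test over per-fold accuracies.
from scipy.stats import friedmanchisquare

# Hypothetical accuracies measured on the same 10 cross-validation folds,
# one list per feature source (>= 3 related samples are required).
lyrics_acc          = [0.41, 0.39, 0.44, 0.40, 0.42, 0.38, 0.43, 0.41, 0.40, 0.39]
interpretations_acc = [0.52, 0.49, 0.55, 0.51, 0.53, 0.50, 0.54, 0.52, 0.51, 0.50]
combined_acc        = [0.54, 0.51, 0.56, 0.53, 0.55, 0.52, 0.56, 0.54, 0.53, 0.52]

stat, p_value = friedmanchisquare(lyrics_acc, interpretations_acc, combined_acc)
print(f"Friedman chi-square = {stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Accuracy differences across feature sources are significant at alpha = 0.05")
```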