How do you feel about "dancing queen"?: deriving mood & theme annotations from user tags

Web 2.0 enables information sharing and collaboration among users and, most notably, supports their active participation and creativity. As a result, a huge amount of manually created metadata describing all kinds of resources is now available. Such semantically rich user-generated annotations are especially valuable for digital libraries covering multimedia resources such as music, where this metadata enables retrieval that relies not only on content-based (low-level) features, but also on the textual descriptions represented by tags. However, when we analyze the annotations users generate for music tracks, we find them heavily biased towards genre. Previous work investigating the types of user-provided annotations for music tracks showed that the kinds of tags that would be most beneficial for supporting retrieval - usage (theme) and opinion (mood) tags - are often neglected by users in the annotation process. In this paper we address exactly this problem: in order to support users in tagging and to fill these gaps in the tag space, we develop algorithms for recommending mood and theme annotations. Our methods exploit the available user annotations, the lyrics of music tracks, as well as combinations of both. We also compare the results for our recommended mood/theme annotations against genre and style recommendations - a much easier and already studied task. Besides evaluating against an expert (AllMusic.com) ground truth, we assess the quality of our recommended tags through a Facebook-based user study. Our results are very promising in comparison to both expert and user judgments, and they provide interesting insights into possible extensions of music tagging systems to support music search.
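The abstract does not spell out the recommendation algorithms, so the sketch below is only a rough illustration of one way mood/theme tags could be derived from existing, genre-heavy user tags: scoring candidate mood/theme labels by how often they co-occur with a track's current tags across the collection. All names, the toy data, and the co-occurrence scoring are hypothetical and not taken from the paper.

```python
from collections import defaultdict

# Hypothetical toy data: track -> user tags (genre-heavy, as the paper observes).
TRACK_TAGS = {
    "dancing queen": {"pop", "disco", "70s"},
    "waterloo": {"pop", "disco", "upbeat", "party"},
    "hotel california": {"rock", "classic rock", "mellow"},
    "stairway to heaven": {"rock", "classic rock", "epic"},
}

# Hypothetical mood/theme vocabulary (e.g., AllMusic-style labels).
MOOD_THEME_TAGS = {"upbeat", "party", "mellow", "epic", "romantic"}


def cooccurrence_counts(track_tags):
    """Count how often each (ordinary tag, mood/theme tag) pair co-occurs on a track."""
    counts = defaultdict(lambda: defaultdict(int))
    for tags in track_tags.values():
        moods = tags & MOOD_THEME_TAGS
        for tag in tags - MOOD_THEME_TAGS:
            for mood in moods:
                counts[tag][mood] += 1
    return counts


def recommend_moods(track, track_tags, top_k=2):
    """Score each candidate mood/theme tag by how strongly it co-occurs
    with the track's existing (mostly genre) tags across the collection."""
    counts = cooccurrence_counts(track_tags)
    scores = defaultdict(float)
    for tag in track_tags[track] - MOOD_THEME_TAGS:
        for mood, c in counts[tag].items():
            scores[mood] += c
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [mood for mood, _ in ranked[:top_k]]


if __name__ == "__main__":
    # "dancing queen" carries no mood tags itself; "upbeat"/"party" are inferred
    # from tracks that share its genre tags ("pop", "disco").
    print(recommend_moods("dancing queen", TRACK_TAGS))
```

In the paper's setting, such tag-based scores would presumably be combined with evidence from lyrics (e.g., text similarity to mood/theme descriptors); that combination is not shown here.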
