Music Mood and Theme Classification - a Hybrid Approach

Music perception is deeply intertwined with both emotion and context. Not surprisingly, many of users' information-seeking actions aim at retrieving songs along these perceptual dimensions: moods and themes, which express how people feel about music or which situations they associate it with. Successfully supporting music retrieval along these dimensions requires powerful methods, yet most existing approaches to inferring songs' latent characteristics focus on identifying musical genres. In this paper we aim to bridge this gap between users' information needs and indexed music features by developing algorithms that classify songs by mood and theme. We extend existing approaches by also considering songs' thematic dimensions and by using social data from the Last.fm music portal to support the classification tasks. Our methods exploit both audio features and collaborative user annotations, fusing them to improve overall performance. Evaluation against the AllMusic.com ground truth shows that the two kinds of information are complementary and should be merged for better classification accuracy.
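The abstract does not spell out how the audio-based and tag-based signals are combined, so the following is only a minimal decision-level (late) fusion sketch in Python with scikit-learn. The feature matrices, the per-modality SVMs, and the fusion weight `alpha` are all hypothetical placeholders standing in for the paper's actual pipeline.

```python
# Hypothetical late-fusion sketch: average per-class probabilities from an
# audio-based classifier and a tag-based classifier. Feature extraction and
# the fusion weight are placeholders, not the paper's exact method.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder data: X_audio stands in for timbre/rhythm descriptors extracted
# from the audio signal, X_tags for a bag-of-words vector built from Last.fm
# user tags; y holds mood/theme labels, e.g. taken from AllMusic.com.
n_songs, n_classes = 200, 4
X_audio = rng.normal(size=(n_songs, 30))
X_tags = rng.normal(size=(n_songs, 100))
y = rng.integers(0, n_classes, size=n_songs)

idx_train, idx_test = train_test_split(np.arange(n_songs), random_state=0)

# One SVM per modality; probability=True enables multi-class probability
# estimates via pairwise coupling with Platt scaling.
clf_audio = SVC(probability=True).fit(X_audio[idx_train], y[idx_train])
clf_tags = SVC(probability=True).fit(X_tags[idx_train], y[idx_train])

# Decision-level fusion: weighted average of the class-probability vectors.
alpha = 0.5  # assumed fusion weight; in practice tuned on held-out data
p = alpha * clf_audio.predict_proba(X_audio[idx_test]) \
    + (1 - alpha) * clf_tags.predict_proba(X_tags[idx_test])
y_pred = clf_audio.classes_[p.argmax(axis=1)]
print("fused accuracy:", (y_pred == y[idx_test]).mean())
```

Fusion could equally happen at the feature level, by concatenating audio and tag features before training a single classifier; the sketch above only illustrates the decision-level variant, where each modality keeps its own model and the weight controls their relative influence.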
