A Mood-Based Music Classification and Exploration System

by Owen Craigie Meyers

Mood classification of music is an emerging domain of music information retrieval. In the approach presented here, features extracted from an audio file are combined with the affective value of the song's lyrics to map the song onto a psychologically grounded emotion space. The system is motivated by the lack of intuitive, contextually aware playlist-generation tools available to music listeners. As digital music libraries constantly expand, it becomes increasingly difficult to recall a particular song in a library or to create a playlist for a specific event. By combining audio content information with context-aware data such as song lyrics, this system allows the listener to automatically generate a playlist suited to their current activity or mood.

Thesis Supervisor: Barry Vercoe
Title: Professor of Media Arts and Sciences, Program in Media Arts and Sciences

The following people served as readers for this thesis:

Thesis Reader: Henry Lieberman, Research Scientist, MIT Media Laboratory
Thesis Reader: Emery Schubert, Australian Research Fellow & Lecturer, University of New South Wales
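The fusion described above can be sketched in miniature. The snippet below is a hypothetical illustration, not the thesis's actual feature set or weighting: it assumes that audio analysis and lyric analysis each yield a (valence, arousal) estimate in [-1, 1], blends them with an assumed fixed weight, and names the quadrant of Russell-style valence-arousal space the fused point falls in.

```python
# Hypothetical sketch: fusing audio- and lyric-derived affect estimates
# into one point in a valence-arousal emotion space. The feature values
# and the 0.6 audio weight are illustrative assumptions only.

def fuse_affect(audio_va, lyric_va, audio_weight=0.6):
    """Weighted average of two (valence, arousal) pairs, each in [-1, 1]."""
    w = audio_weight
    return tuple(w * a + (1 - w) * l for a, l in zip(audio_va, lyric_va))

def mood_quadrant(valence, arousal):
    """Name the valence-arousal quadrant a point falls in."""
    if valence >= 0:
        return "happy/excited" if arousal >= 0 else "calm/content"
    return "angry/tense" if arousal >= 0 else "sad/depressed"

# Example: bright-sounding audio paired with melancholy lyrics.
v, a = fuse_affect(audio_va=(0.7, 0.5), lyric_va=(-0.4, -0.2))
print((round(v, 2), round(a, 2)), mood_quadrant(v, a))
```

A real system would derive the audio estimate from signal descriptors (tempo, mode, spectral features) and the lyric estimate from an affect lexicon or commonsense model, but the fusion step can remain this simple.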
