Music Mood Classification Based on Lifelog

Music mood analysis is crucial for music applications that involve search and recommendation. At present, music mood classification relies mainly on manual annotation or on music information extracted from audio and lyrics. However, manual annotation requires a large number of annotators, and acquiring and processing audio or lyric information is complicated. A new and simpler way to analyze music mood is therefore needed. Since music mood is itself a psychological response produced by various musical elements acting together on the listener, in this paper we instead try to use information about the users themselves, such as their physiological and activity data. The development of wearable devices gives us the opportunity to record users' lifelogs. Experimental results suggest that a classification method based on user information can effectively identify music mood, and that integrating it with a classification method based on music information further improves recognition.
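The two ideas in the abstract can be illustrated with a minimal sketch: a classifier that scores moods from lifelog signals (e.g. heart rate and skin conductance), and a weighted late fusion of its scores with those of a music-information classifier. All mood labels, centroid values, feature choices, and the fusion weight below are illustrative assumptions, not the paper's actual configuration.

```python
# Hypothetical sketch: (1) scoring music mood from lifelog signals via
# nearest-centroid distances, (2) late fusion with a music-information
# classifier's scores. All numbers here are assumed for illustration.
import math

MOODS = ["happy", "sad", "angry", "relaxed"]

# Assumed per-mood centroids in lifelog feature space:
# (mean heart rate in bpm, mean skin conductance in microsiemens).
CENTROIDS = {
    "happy":   (95.0, 6.0),
    "sad":     (65.0, 2.0),
    "angry":   (105.0, 8.0),
    "relaxed": (70.0, 3.0),
}

def lifelog_scores(heart_rate, skin_conductance):
    """Turn inverse distance-to-centroid into a normalized score per mood."""
    inv = {m: 1.0 / (1e-6 + math.dist((heart_rate, skin_conductance), c))
           for m, c in CENTROIDS.items()}
    total = sum(inv.values())
    return {m: v / total for m, v in inv.items()}

def fuse(p_lifelog, p_music, w=0.5):
    """Weighted late fusion of the two classifiers' score vectors."""
    return {m: w * p_lifelog[m] + (1.0 - w) * p_music[m] for m in MOODS}

# Lifelog reading while the user listens: low heart rate, low conductance.
p_lifelog = lifelog_scores(72.0, 2.8)
# Scores a hypothetical audio/lyrics classifier might emit for the track.
p_music = {"happy": 0.2, "sad": 0.1, "angry": 0.1, "relaxed": 0.6}

fused = fuse(p_lifelog, p_music, w=0.5)
prediction = max(fused, key=fused.get)
```

In practice the fusion weight `w` would be tuned on validation data, and a trained model would replace the hand-set centroids; the sketch only shows how user-side and music-side evidence can be combined.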
