Feel the Moosic: Emotion-based Music Selection and Recommendation

Digital transformation has changed all aspects of life, including the music market and listening habits. The spread of mobile devices and music streaming services has made a vast selection of music accessible regardless of time or place. This access, however, confronts users with the problem of choosing the right music for a given situation or mood, and they are often overwhelmed by the choice. Context information, especially the user's emotional state, can help with this process. The possibilities for emotion-based music selection are currently limited: providers rely on predefined playlists for different situations or moods, but these lists do not adapt to changing user states. A simple, intuitive, and automatic emotion-based music selection has so far received little attention in IS practice and research. This paper describes the IS music research project "Moosic", which investigates and iteratively implements an intuitive emotion-based music recommendation application. In addition, an initial evaluation of the prototype is discussed and an outlook on further development is given.
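Emotion-based selection of this kind is commonly grounded in Russell's circumplex model, which places emotional states on two axes, valence and arousal. As a minimal illustrative sketch (the function name, thresholds, and labels are assumptions for illustration, not the Moosic implementation), a measured emotional state could be mapped to one of the four circumplex quadrants and used to steer music selection:

```python
def mood_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) point in [-1, 1]^2 to a circumplex quadrant.

    Hypothetical sketch: labels and the zero thresholds are illustrative
    assumptions, not taken from the Moosic prototype.
    """
    if valence >= 0 and arousal >= 0:
        return "happy/excited"    # high valence, high arousal
    if valence < 0 and arousal >= 0:
        return "angry/tense"      # low valence, high arousal
    if valence < 0:
        return "sad/depressed"    # low valence, low arousal
    return "calm/relaxed"         # high valence, low arousal


# Example: a positive, low-arousal state maps to the calm quadrant.
print(mood_quadrant(0.6, -0.4))  # -> calm/relaxed
```

A recommender could then either match music to the detected quadrant or deliberately select from an adjacent quadrant to shift the listener's mood, which is one of the design questions such systems face.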
