Do Online Social Tags Predict Perceived or Induced Emotional Responses to Music?

Music provides a powerful means of communication and self-expression. A wealth of research has examined music and emotion, including emotion modelling and emotion classification. The emergence of online social tags (OSTs) has provided highly relevant information for the study of mood, as well as an important impetus for relating discrete emotion terms to continuous models of affect. Yet the extent to which such human annotation reflects perceived emotion or induced emotion remains unknown. Eighty musical excerpts were randomly selected from a collection of 2,904 songs labelled with the Last.fm tags “happy”, “sad”, “angry” and “relax”. Forty-seven participants rated both perceived and induced emotion on the two continuous dimensions of valence and arousal. Analysis of variance revealed no significant differences between ratings of perceived and induced emotion. Moreover, regardless of the discrete emotion experienced, listeners’ ratings of perceived and induced emotion were highly positively correlated. Finally, the tags “happy”, “sad” and “angry”, but not “relax”, predicted the corresponding experimentally obtained emotion categories.