Introducing a Dataset of Emotional and Color Responses to Music

This paper presents a new dataset of mood-dependent emotional and color responses to music. The methodology for gathering user responses is described, along with two new interfaces for capturing emotional states: the MoodGraph and the MoodStripe. An evaluation study showed that both interfaces have a significant advantage over more traditional methods in terms of intuitiveness, usability and time complexity. A preliminary analysis of the current data (over 6,000 responses) offers insight into participants' emotional states and color associations, as well as the relationship between musically perceived and induced emotions. We believe the size of the dataset, the interfaces and the multi-modal approach (connecting emotional, visual and auditory aspects of human perception) make a valuable contribution to current research.
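To make the multi-modal structure concrete, the sketch below shows one plausible way to represent a single participant response and to quantify the gap between perceived and induced emotion in the valence-arousal plane. This is a minimal illustration only: the field names, value ranges and the `Response` / `mean_perceived_induced_gap` helpers are hypothetical, since the paper does not specify a schema.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Response:
    # Hypothetical record layout; not the dataset's actual schema.
    clip_id: str               # music excerpt the participant heard
    perceived_valence: float   # emotion recognized in the music, in [-1, 1]
    perceived_arousal: float
    induced_valence: float     # emotion the participant actually felt
    induced_arousal: float
    color: tuple               # (r, g, b) color association, 0-255 per channel

def mean_perceived_induced_gap(responses):
    """Average Euclidean distance between the perceived and induced
    emotion points of each response in the valence-arousal plane."""
    return mean(
        ((r.perceived_valence - r.induced_valence) ** 2 +
         (r.perceived_arousal - r.induced_arousal) ** 2) ** 0.5
        for r in responses
    )
```

A small gap under such a measure would indicate that listeners tend to feel what they hear expressed in the music; a large gap would support treating perceived and induced emotion as distinct annotation targets.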
