Toward E-Motion-Based Music Retrieval: A Study of Affective Gesture Recognition

The widespread availability of digitized music collections and mobile music players has enabled us to listen to music during many daily activities, such as physical exercise, commuting, and relaxation. A practical problem that accompanies the wish to listen to music is that of music retrieval: the selection of desired music from a music collection. In this paper, we propose a new approach to facilitate music retrieval. Modern smartphones are commonly used as music players and are already equipped with inertial sensors suitable for capturing motion information. In the proposed approach, emotion is derived automatically from arm gestures and is used to query a music collection. We derive predictive models for valence and arousal from empirical data, gathered in an experimental setup where inertial data recorded from arm movements are coupled to musical emotion. Part of the experiment is a preliminary study confirming that human subjects are generally capable of recognizing affect from arm gestures. Model validation in the main study confirmed the predictive capabilities of the models.
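The pipeline described above (motion features from inertial data → predicted valence and arousal → query against a music collection in the emotion plane) can be sketched minimally as follows. Everything here is an illustrative assumption, not the paper's actual method: the feature set (per-axis mean, per-axis standard deviation, signal energy), the plain least-squares regressors standing in for the learned valence/arousal models, and the toy song catalog are all hypothetical.

```python
import numpy as np

def motion_features(accel):
    """Summarize a (T, 3) accelerometer recording as a small feature
    vector: per-axis mean, per-axis std, and mean signal energy.
    (Illustrative feature set, not the paper's.)"""
    accel = np.asarray(accel, dtype=float)
    return np.concatenate([
        accel.mean(axis=0),
        accel.std(axis=0),
        [np.mean(np.sum(accel ** 2, axis=1))],
    ])

def fit_linear_model(X, y):
    """Least-squares fit with an intercept column; a stand-in for one
    of the trained valence/arousal regressors."""
    Xb = np.column_stack([X, np.ones(len(X))])
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

def predict(w, x):
    """Predict a scalar emotion coordinate for one feature vector."""
    return float(np.append(np.asarray(x, dtype=float), 1.0) @ w)

def retrieve(valence, arousal, catalog):
    """Return the catalog entry nearest to (valence, arousal) in the
    emotion plane. Each entry is (title, (valence, arousal))."""
    query = np.array([valence, arousal])
    return min(catalog,
               key=lambda item: np.linalg.norm(np.array(item[1]) - query))

if __name__ == "__main__":
    # Hypothetical catalog of songs annotated in the valence-arousal plane.
    catalog = [("calm piece", (-0.2, -0.8)), ("upbeat piece", (0.9, 0.7))]
    # An energetic gesture would map to high valence/arousal and thus
    # retrieve the upbeat piece.
    print(retrieve(0.8, 0.6, catalog)[0])
```

In the paper's setting the two regressors are trained on empirical gesture-emotion pairs; the sketch only shows where such models would slot into a retrieval loop.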

[1]  Marcelo M. Wanderley,et al.  Performance Gestures of Musicians: What Structural and Emotional Information Do They Convey? , 2003, Gesture Workshop.

[2]  Alexander J. Smola,et al.  Support Vector Method for Function Approximation, Regression Estimation and Signal Processing , 1996, NIPS.

[3]  M. Buonocore,et al.  Posterior cingulate cortex activation by emotional words: fMRI evidence from a valence decision task , 2003, Human brain mapping.

[4]  D. Ruppert The Elements of Statistical Learning: Data Mining, Inference, and Prediction , 2004 .

[5]  A. Friberg,et al.  Visual Perception of Expressiveness in Musicians' Body Movements , 2007 .

[6]  D. Västfjäll,et al.  Emotional responses to music: the need to consider underlying mechanisms. , 2008, The Behavioral and brain sciences.

[7]  Marc Leman,et al.  How potential users of music search and retrieval systems describe the semantic quality of music , 2008 .

[8]  J. Panksepp,et al.  Emotional sounds and the brain: the neuro-affective foundations of musical appreciation , 2002, Behavioural Processes.

[9]  Emery Schubert Modeling Perceived Emotion With Continuous Musical Features , 2004 .

[10]  Rafael A. Calvo,et al.  Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications , 2010, IEEE Transactions on Affective Computing.

[11]  J. Marozeau,et al.  Multidimensional scaling of emotional responses to music: The effect of musical expertise and of the duration of the excerpts , 2005 .

[12]  Yi-Hsuan Yang,et al.  Mr. Emo: music retrieval in the emotion plane , 2008, ACM Multimedia.

[13]  Antonio Camurri,et al.  A System for Real-Time Multimodal Analysis of Nonverbal Affective Social Interaction in User-Centric Media , 2010, IEEE Transactions on Multimedia.

[14]  Benjamin Schrauwen,et al.  An experimental unification of reservoir computing methods , 2007, Neural Networks.

[15]  Antonio Camurri,et al.  Recognizing emotion from dance movement: comparison of spectator recognition and automated techniques , 2003, Int. J. Hum. Comput. Stud..

[16]  VerstraetenD.,et al.  2007 Special Issue , 2007 .

[17]  Johannes Wagner,et al.  Smart sensor integration: A framework for multimodal emotion recognition in real-time , 2009, 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops.

[18]  Timothy L Hubbard,et al.  Auditory imagery: empirical findings. , 2010, Psychological bulletin.

[19]  Claire L. Roether,et al.  Critical features for the perception of emotion from gait. , 2009, Journal of vision.

[20]  K. Scherer,et al.  Emotions evoked by the sound of music: characterization, classification, and measurement. , 2008, Emotion.

[21]  Andrew R. Brown,et al.  Controlling musical emotionality: an affective computational architecture for influencing musical emotions , 2007, Digit. Creativity.

[22]  U. Ott,et al.  Using music to induce emotions: Influences of musical preference and absorption , 2008 .

[23]  T. Eerola,et al.  A comparison of the discrete and dimensional models of emotion in music , 2011 .

[24]  S. Brownlow,et al.  Perception of movement and dancer characteristics from point-light displays of dance , 1997 .

[25]  A. Young,et al.  Emotion Perception from Dynamic and Static Body Expressions in Point-Light and Full-Light Displays , 2004, Perception.

[26]  Nirbhay N. Singh,et al.  Facial Expressions of Emotion , 1998 .

[27]  Vladimir J. Konečni,et al.  Comparative effects of music and recalled life-events on emotional state , 2008 .

[28]  Robert J. Zatorre,et al.  Mental Concerts: Musical Imagery and Auditory Cortex , 2005, Neuron.

[29]  J. Russell A circumplex model of affect. , 1980 .

[30]  Lie Lu,et al.  Automatic mood detection and tracking of music audio signals , 2006, IEEE Transactions on Audio, Speech, and Language Processing.

[31]  Micheline Lesaffre,et al.  {Music Information Retrieval - Conceptual Framework, Annotation and User Behaviour} , 2005 .

[32]  Armin Bruderlin,et al.  Perceiving affect from arm movement , 2001, Cognition.

[33]  M. Leman Embodied Music Cognition and Mediation Technology , 2007 .

[34]  Petri Toiviainen,et al.  Prediction of Multidimensional Emotional Ratings in Music from Audio Using Multivariate Regression Models , 2009, ISMIR.

[35]  K. Scherer,et al.  Bodily expression of emotion , 2009 .

[36]  P. Janata,et al.  Embodied music cognition and mediation technology , 2009 .