Classification of Music-Induced Emotions Based on Information Fusion of Forehead Biosignals and Electrocardiogram

Abstract: Emotion recognition systems assess human emotional states during different experiences. This paper proposes an approach for recognizing music-induced emotions through the fusion of three-channel forehead biosignals (the left temporalis, frontalis, and right temporalis channels) and the electrocardiogram. Four emotional states in the arousal–valence space (positive valence/low arousal, positive valence/high arousal, negative valence/high arousal, and negative valence/low arousal) were classified by two parallel support vector machines serving as arousal and valence classifiers. The classifier inputs were selected by applying a fuzzy-rough model feature evaluation criterion together with the sequential forward floating selection (SFFS) algorithm. An average classification accuracy of 88.78% was achieved, corresponding to an average valence classification accuracy of 94.91% and an average arousal classification accuracy of 93.63%. The proposed emotion recognition system may be useful for interactive multimedia applications or music therapy.
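The two-parallel-SVM scheme described above can be illustrated with a minimal sketch. This is not the paper's implementation: the feature matrix here is a random placeholder standing in for the selected forehead-biosignal and ECG features, the RBF kernel and hyperparameters are assumptions, the fuzzy-rough/SFFS feature selection is omitted, and the helper `classify_emotion` is hypothetical. It only shows how two binary classifiers (arousal and valence) combine into the four quadrant labels.

```python
# Sketch: two parallel binary SVMs, one per affective dimension, whose
# joint output maps a trial to one quadrant of the arousal-valence space.
# Features and labels are random placeholders, not real biosignal data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_features = 200, 10

# Placeholder feature matrix and binary labels
# (1 = high arousal / positive valence, 0 = low arousal / negative valence).
X = rng.standard_normal((n_trials, n_features))
y_arousal = rng.integers(0, 2, n_trials)
y_valence = rng.integers(0, 2, n_trials)

# Two independent (parallel) classifiers; kernel choice is illustrative.
arousal_clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
valence_clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
arousal_clf.fit(X, y_arousal)
valence_clf.fit(X, y_valence)

# The four emotional states named in the abstract, keyed by
# (valence prediction, arousal prediction).
QUADRANTS = {
    (1, 1): "positive valence / high arousal",
    (1, 0): "positive valence / low arousal",
    (0, 1): "negative valence / high arousal",
    (0, 0): "negative valence / low arousal",
}

def classify_emotion(x):
    """Combine the two binary decisions into one of four emotional states."""
    a = int(arousal_clf.predict(x.reshape(1, -1))[0])
    v = int(valence_clf.predict(x.reshape(1, -1))[0])
    return QUADRANTS[(v, a)]

print(classify_emotion(X[0]))
```

Splitting the four-class problem into two binary decisions, as the abstract reports, lets each SVM specialize on one affective dimension; the quadrant label then follows directly from the pair of outputs.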
