Fusion of electroencephalographic dynamics and musical contents for estimating emotional responses in music listening

Electroencephalography (EEG)-based emotion classification during music listening has attracted increasing attention owing to promising applications such as musical affective brain-computer interfaces (ABCI), neuromarketing, music therapy, and implicit multimedia tagging and triggering. However, music is an ecologically valid yet complex stimulus that conveys emotions to listeners through compositions of musical elements, and distinguishing emotions from EEG signals alone remains challenging. This study assessed the applicability of a multimodal approach that leverages both EEG dynamics and the acoustic characteristics of musical content to classify emotional valence and arousal. To this end, machine-learning methods were adopted to systematically elucidate the roles of the EEG and music modalities in emotion modeling. The empirical results suggested that when whole-head EEG signals were available, the inclusion of musical content did not improve classification performance: the 74–76% accuracy obtained using the EEG modality alone was statistically comparable to that of the multimodal approach. However, when EEG dynamics were available from only a small set of electrodes (the likely case in real-life applications), the music modality played a complementary role, improving valence classification from around 61% to 67% and arousal classification from around 58% to 67%. Musical timbre appeared to replace less-discriminative EEG features and led to improvements in both valence and arousal classification, whereas musical loudness contributed specifically to arousal classification. The present study not only provides principles for constructing an EEG-based multimodal approach, but also reveals fundamental insights into the interplay of brain activity and musical content in emotion modeling.
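To make the feature-level fusion described above concrete, the following is a minimal sketch, not the authors' implementation: the data, feature dimensions, and feature names are placeholder assumptions, and scikit-learn's SVC stands in for whichever classifier was actually used. It contrasts an EEG-only classifier with one trained on concatenated EEG and music features.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Placeholder data: per-trial EEG band-power features from a small
# electrode set, plus acoustic features (e.g., loudness, timbre) of
# the corresponding music excerpt. Real features would be extracted
# from recordings; random values here only exercise the pipeline.
rng = np.random.default_rng(0)
n_trials = 200
eeg_feats = rng.normal(size=(n_trials, 12))    # hypothetical EEG features
music_feats = rng.normal(size=(n_trials, 8))   # hypothetical acoustic features
valence = rng.integers(0, 2, size=n_trials)    # binary valence labels (placeholder)

# Feature-level fusion: concatenate the two modalities before classification.
fused = np.hstack([eeg_feats, music_feats])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
for name, X in [("EEG only", eeg_feats), ("EEG + music", fused)]:
    acc = cross_val_score(clf, X, valence, cv=5).mean()
    print(f"{name}: {acc:.2f} cross-validated accuracy")
```

Under this scheme, a feature-selection step (not shown) could prefer timbre or loudness descriptors over weaker EEG features, which is the mechanism the abstract suggests drives the gains when few electrodes are available.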
