Fusion of EEG and Musical Features in Continuous Music-emotion Recognition

Emotion estimation in music listening faces the challenge of capturing listeners' time-varying emotional responses. Recent years have witnessed attempts to exploit multimodality, fusing information from musical content and physiological signals captured from listeners, to improve the performance of emotion recognition. In this paper, we present a study of decision-level fusion of electroencephalogram (EEG) signals, which capture brainwaves at high temporal resolution, and musical features for recognizing time-varying binary classes of arousal and valence. Our empirical results showed that the fusion outperformed emotion recognition using the EEG modality alone, which suffered from inter-subject variability, suggesting the promise of multimodal fusion for improving the accuracy of music-emotion recognition.

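To make the decision-level fusion concrete, the following is a minimal Python sketch of one common scheme: training a separate classifier per modality and combining their per-class posterior probabilities by a weighted average. The classifier choice (SVM), the weighted-average fusion rule, the feature names, and the synthetic data are illustrative assumptions, not the authors' exact pipeline.

    # Minimal sketch of decision-level fusion for binary arousal (or
    # valence) classification. Illustrative only: classifier, fusion
    # rule, and features are assumptions, not the paper's exact method.
    import numpy as np
    from sklearn.svm import SVC

    def fuse_decisions(p_eeg, p_music, w=0.5):
        """Weighted average of per-class posterior probabilities."""
        return w * p_eeg + (1.0 - w) * p_music

    # Synthetic stand-ins:
    # X_eeg:   per-segment EEG features (e.g., one value per channel)
    # X_music: per-segment musical features (e.g., spectral descriptors)
    # y:       binary arousal label per segment
    rng = np.random.default_rng(0)
    X_eeg = rng.normal(size=(200, 32))
    X_music = rng.normal(size=(200, 12))
    y = rng.integers(0, 2, size=200)

    # One classifier per modality, trained independently.
    clf_eeg = SVC(probability=True).fit(X_eeg[:150], y[:150])
    clf_music = SVC(probability=True).fit(X_music[:150], y[:150])

    # Fuse posteriors on held-out segments and take the argmax class.
    p = fuse_decisions(clf_eeg.predict_proba(X_eeg[150:]),
                       clf_music.predict_proba(X_music[150:]))
    y_pred = p.argmax(axis=1)

A practical note on this design: because the two classifiers are trained independently, the fusion weight w can be tuned on validation data, which lets the music modality compensate when EEG-based predictions degrade due to inter-subject variability.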