Multimodal emotion recognition using EEG and eye tracking data

This paper presents a new emotion recognition method that combines electroencephalography (EEG) signals with pupillary responses collected by an eye tracker. We select 15 emotional film clips from three categories (positive, neutral, and negative). The EEG signals and eye tracking data of five participants are recorded simultaneously while they watch these clips. We extract emotion-relevant features from the EEG signals and eye tracking data of 12 experiments and build a fusion model to improve the performance of emotion recognition. The best average accuracies based on EEG signals alone and eye tracking data alone are 71.77% and 58.90%, respectively. With the feature-level and decision-level fusion strategies, we achieve average accuracies of 73.59% and 72.98%, respectively. These results show that both feature-level and decision-level fusion of EEG signals and eye tracking data can improve the performance of the emotion recognition model.
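
The abstract contrasts two fusion strategies. The sketch below, assuming scikit-learn SVM classifiers and synthetic stand-in data, illustrates the general idea only: feature-level fusion concatenates the per-modality feature vectors before training a single classifier, while decision-level fusion trains one classifier per modality and combines their outputs (here by averaging class probabilities). The feature dimensions, kernel, and combination rule are illustrative assumptions, not the paper's reported configuration.

```python
# Sketch of feature-level vs. decision-level fusion (illustrative assumptions only).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_samples = 300
eeg_features = rng.normal(size=(n_samples, 310))  # stand-in for EEG features (e.g. differential entropy)
eye_features = rng.normal(size=(n_samples, 33))   # stand-in for pupillary-response features
labels = rng.integers(0, 3, size=n_samples)       # positive / neutral / negative

X_eeg_tr, X_eeg_te, X_eye_tr, X_eye_te, y_tr, y_te = train_test_split(
    eeg_features, eye_features, labels, test_size=0.3, random_state=0)

# Feature-level fusion: concatenate both feature vectors, train one classifier.
clf_feat = SVC(kernel="linear", probability=True)
clf_feat.fit(np.hstack([X_eeg_tr, X_eye_tr]), y_tr)
pred_feat = clf_feat.predict(np.hstack([X_eeg_te, X_eye_te]))

# Decision-level fusion: one classifier per modality, then combine their
# class-probability outputs (simple averaging as an assumed fusion rule).
clf_eeg = SVC(kernel="linear", probability=True).fit(X_eeg_tr, y_tr)
clf_eye = SVC(kernel="linear", probability=True).fit(X_eye_tr, y_tr)
proba = (clf_eeg.predict_proba(X_eeg_te) + clf_eye.predict_proba(X_eye_te)) / 2
pred_dec = clf_eeg.classes_[np.argmax(proba, axis=1)]

print("feature-level fusion accuracy:", accuracy_score(y_te, pred_feat))
print("decision-level fusion accuracy:", accuracy_score(y_te, pred_dec))
```

On random data both accuracies hover near chance; the point is only the structural difference between fusing before classification and fusing after it.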
