Emotion recognition based on 3D fuzzy visual and EEG features in movie clips

In this paper, we propose an emotion recognition system for understanding the emotional state of humans while watching a movie clip. Using various movie clips with scene changes over time, a 3D fuzzy GIST was used to extract dynamic emotional features from low-level visual features, and a 3D fuzzy tensor was used to extract semantic-level brain features related to the emotional state of humans. For the dynamic visual features, the 3D fuzzy GIST consists of LCH color (L: Lightness; C: Chroma; and H: Hue) and orientation information of a movie clip over a predefined time interval. For the dynamic brain features, we processed the electroencephalographic (EEG) signals, stimulated by the movie clips to induce an emotional state, through independent component analysis (ICA) to eliminate artifacts and the short-time Fourier transform (STFT) to extract reliable features. To capture hemispheric power asymmetry, the 3D tensor data for the brain signals were constructed from the time-dependent energy in the alpha band (8-13 Hz) and gamma band (30-60 Hz). Finally, the 3D fuzzy GIST and 3D fuzzy tensor were obtained through fuzzy C-means clustering of the visual and EEG features, respectively. The resulting 3D fuzzy GIST and 3D fuzzy tensor features were used as inputs to an adaptive neuro-fuzzy inference system (ANFIS) classifier, which was trained using mean opinion scores (MOSs) as the teaching signals. Experimental results show that, with an ANFIS classifier, the proposed 3D fuzzy visual and EEG features are effective for building an emotion recognition system.
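The EEG feature step described above (STFT-based band energy in the alpha and gamma bands, followed by a left-right hemispheric asymmetry measure) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, window length, channel pairing, and the normalized-difference asymmetry formula are all assumptions, and `scipy.signal.stft` is used here only as one convenient STFT routine.

```python
import numpy as np
from scipy.signal import stft

def band_power(signal, fs, band, nperseg=256):
    """Time-averaged spectral power of one channel in a frequency band,
    estimated from the magnitude of the short-time Fourier transform."""
    f, t, Z = stft(signal, fs=fs, nperseg=nperseg)
    mask = (f >= band[0]) & (f <= band[1])
    return float(np.mean(np.abs(Z[mask]) ** 2))

def hemispheric_asymmetry(left, right, fs, band):
    """Normalized left-minus-right band-power asymmetry in [-1, 1]."""
    pl = band_power(left, fs, band)
    pr = band_power(right, fs, band)
    return (pl - pr) / (pl + pr)

# Synthetic example: a 10 Hz (alpha-band) oscillation, stronger on the
# hypothetical left-hemisphere channel than on the right one.
fs = 256                                   # assumed sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
left = np.sin(2 * np.pi * 10 * t)          # strong alpha activity
right = 0.2 * np.sin(2 * np.pi * 10 * t)   # weaker alpha activity

alpha = band_power(left, fs, (8, 13))      # alpha band, 8-13 Hz
gamma = band_power(left, fs, (30, 60))     # gamma band, 30-60 Hz
asym = hemispheric_asymmetry(left, right, fs, (8, 13))
```

For a pure 10 Hz signal, the alpha-band power dominates the gamma-band power, and the asymmetry index is positive because the left channel carries the stronger oscillation; stacking such per-band, per-channel-pair energies over time windows would yield the kind of 3D tensor the paper feeds into fuzzy C-means clustering.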
