Feature Characteristics of ERPs and Eye Movements in Response to Facial Expressions

Features of EEGs and eye movements were extracted to identify viewers' emotional responses while they viewed photographs of facial expressions, in order to understand the physiological reactions that occur during the process of emotion perception. EEGs were recorded, and eye movements were measured using electrooculograms (EOGs). Facial expressions from a photo database were classified into two groups according to the viewers' subjective evaluations of whether the facial images were "Pleasant" or "Unpleasant". The group to which each photo of a facial expression belonged was predicted using the extracted features, and the prediction performance was analysed. A correlation analysis of the frequency powers extracted from the EEGs and eye movements was also conducted, and the differences in the relationships between the emotional categories were discussed. The results provide evidence of the chronological process of visual emotion perception and of the mutual EEG and eye movement activity that these reactions produce.

Received on 21 September 2018; accepted on 29 October 2018; published on 16 January 2019
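A minimal sketch of this kind of pipeline is given below, using synthetic data: band powers are extracted from each trial's EEG and EOG signals with Welch's method, the Pleasant/Unpleasant group is predicted with a cross-validated logistic regression, and corresponding band powers are correlated across the two signal types. The sampling rate, band boundaries, classifier choice, and all names are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

FS = 250  # sampling rate in Hz (assumed)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # conventional EEG bands

def band_powers(signal, fs=FS):
    """Estimate mean spectral power in each frequency band via Welch's PSD."""
    f, pxx = welch(signal, fs=fs, nperseg=fs)
    return np.array([pxx[(f >= lo) & (f < hi)].mean() for lo, hi in BANDS.values()])

# Synthetic stand-in data: one EEG channel and one EOG channel per 2-second trial.
rng = np.random.default_rng(0)
n_trials = 60
eeg = rng.standard_normal((n_trials, FS * 2))
eog = rng.standard_normal((n_trials, FS * 2))
labels = rng.integers(0, 2, n_trials)  # hypothetical: 0 = "Unpleasant", 1 = "Pleasant"

# Feature matrix: EEG band powers followed by eye-movement (EOG) band powers.
X = np.hstack([np.vstack([band_powers(s) for s in eeg]),
               np.vstack([band_powers(s) for s in eog])])

# Predict the subjective group from the extracted features.
scores = cross_val_score(LogisticRegression(max_iter=1000), X, labels, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")

# Correlate matching band powers across the two signal types (alpha, illustrative).
r, p = pearsonr(X[:, 1], X[:, len(BANDS) + 1])
print(f"alpha-band EEG/EOG correlation: r={r:.2f}, p={p:.3f}")
```

On random data the accuracy hovers near chance, as expected; the sketch only illustrates the shape of the feature-extraction, prediction, and correlation steps the abstract describes.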
