Relationships between EEGs and eye movements in response to facial expressions

To determine the relationship between brain activity and eye movements elicited by images of facial expressions, electroencephalograms (EEGs) and eye movements derived from electrooculograms (EOGs) were measured and analyzed. Typical facial expressions from a photo database were grouped into two clusters by subjective evaluation and designated as either "Pleasant" or "Unpleasant" facial images. In the chronological analysis, the correlation coefficients between the frequency powers of EEGs at the central area and those of eye movements increased monotonically over the time course when "Unpleasant" images were presented. These results confirm a definite relationship between EEGs and eye movements, and show that this relationship depends on the type of facial expression presented.
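To make the analysis concrete, the following is a minimal sketch (not the authors' code) of how frequency powers of an EEG channel and an EOG-based eye-movement trace could be computed in sliding windows and then correlated over the stimulus time course. The sampling rate, window parameters, frequency bands, and channel choice below are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch: correlating EEG and EOG frequency powers over a time course.
# Assumptions (not from the paper): 500 Hz sampling, 1 s sliding windows with
# 0.25 s steps, and example frequency bands for EEG (alpha) and EOG.
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr


def band_power(segment, fs, band):
    """Mean spectral power of `segment` within `band` (Hz) via Welch's method."""
    freqs, psd = welch(segment, fs=fs, nperseg=min(len(segment), 256))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()


def power_time_course(signal, fs, band, win_s=1.0, step_s=0.25):
    """Frequency-power time series from sliding windows over `signal`."""
    win, step = int(win_s * fs), int(step_s * fs)
    return np.array([
        band_power(signal[start:start + win], fs, band)
        for start in range(0, len(signal) - win + 1, step)
    ])


def eeg_eog_power_correlation(eeg, eog, fs=500,
                              eeg_band=(8, 13), eog_band=(0.5, 10)):
    """Pearson correlation between the EEG and EOG power time courses."""
    eeg_power = power_time_course(eeg, fs, eeg_band)
    eog_power = power_time_course(eog, fs, eog_band)
    r, p = pearsonr(eeg_power, eog_power)
    return r, p


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fs, duration = 500, 3.0                  # 3 s after stimulus onset (illustrative)
    t = np.arange(int(fs * duration)) / fs
    eeg = np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)  # synthetic central-site EEG
    eog = np.sin(2 * np.pi * 2 * t) + rng.normal(0, 0.5, t.size)   # synthetic horizontal EOG
    r, p = eeg_eog_power_correlation(eeg, eog, fs)
    print(f"r = {r:.3f}, p = {p:.3f}")
```

In this sketch, repeating the correlation for successive portions of the presentation period would yield the kind of chronological trend reported above; the actual bands, windows, and electrode sites used in the study are not specified here.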
