Eyetracking-based assessment of affect-related decay of human performance in visual tasks

Abstract Computerized recognition of human emotions enriches human–machine interaction with a new personal and behavioral dimension: first, because the influence of emotions on human performance can be objectively assessed and predicted using standardized stimuli, and second, because an emotion-sensitive computer system can be programmed to react adequately. This paper investigates the influence of emotions on the human scanpath. The proposed method assumes that the emotional state of the observer distracts his or her visual attention and is reliably expressed in eye-movement parameters. We performed visual-task experiments to record scanpaths from volunteers under auditory stress of controlled intensity. We used a calibrated set of stimuli and monitored the volunteers' autonomic nervous system response as expressed by heart rate. For each effectively stimulated participant, we related the scanpath parameters to the accuracy of solving a given task and to the participant's comment. As the principal result, we obtained a significant change of (1) saccade frequency in 90% and (2) maximum eye velocity during the fixation phase in 80% of participants under stress. These results demonstrate the influence of emotions on performance in visual tasks, support the feasibility of eyetracking-based assessment of emotional state, and motivate further investigation.
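To make the two reported measures concrete, the following minimal sketch (not the authors' implementation; the sampling rate, velocity threshold, and data layout are illustrative assumptions) shows how saccade frequency and peak eye velocity during fixations could be estimated from raw gaze samples with a simple velocity-threshold (I-VT) classifier:

```python
# Illustrative sketch: estimating the two scanpath parameters named in the
# abstract -- saccade frequency and maximum eye velocity during fixations --
# from raw gaze samples via a velocity-threshold (I-VT) classifier.
# SAMPLE_RATE_HZ and SACCADE_THRESHOLD_DEG_S are assumed values, not taken
# from the paper.
import numpy as np

SAMPLE_RATE_HZ = 500.0          # assumed eye-tracker sampling rate
SACCADE_THRESHOLD_DEG_S = 30.0  # commonly used I-VT velocity threshold

def scanpath_parameters(gaze_deg: np.ndarray) -> tuple[float, float]:
    """gaze_deg: (N, 2) gaze positions in degrees of visual angle.

    Returns (saccade frequency in Hz, max velocity during fixations in deg/s).
    """
    # Point-to-point angular velocity of the gaze trace.
    velocity = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1) * SAMPLE_RATE_HZ

    # I-VT rule: samples above the threshold belong to saccades,
    # the remaining samples to fixation phases.
    is_saccade = velocity > SACCADE_THRESHOLD_DEG_S

    # Count saccades as onsets (False -> True transitions), including a
    # saccade already in progress at the start of the recording.
    onsets = np.flatnonzero(np.diff(is_saccade.astype(int)) == 1)
    n_saccades = len(onsets) + int(is_saccade[0])

    duration_s = len(gaze_deg) / SAMPLE_RATE_HZ
    saccade_freq = n_saccades / duration_s

    # Peak residual eye velocity within the fixation phases.
    fixation_velocity = velocity[~is_saccade]
    max_fix_velocity = float(fixation_velocity.max()) if fixation_velocity.size else 0.0

    return saccade_freq, max_fix_velocity
```

Comparing these two values per participant between a baseline recording and a recording under auditory stress would yield the kind of within-subject contrast described above.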
