REMoDNaV: robust eye-movement classification for dynamic stimulation.

Tracking of eye movements is an established measurement method across many types of experimental paradigms. As visual stimuli have become more complex and more prolonged, algorithmic approaches to eye-movement event classification have become the most pragmatic option. A recent analysis revealed, however, that many current algorithms perform poorly on data from viewing dynamic stimuli such as video sequences. Here we present an event classification algorithm, built on an existing velocity-based approach, that is suitable for both static and dynamic stimulation and is capable of classifying saccades, post-saccadic oscillations, fixations, and smooth pursuit events. We validated classification performance and robustness on three public datasets: 1) manually annotated, trial-based gaze trajectories for viewing static images, moving dots, and short video sequences; 2) lab-quality gaze recordings for a feature-length movie; and 3) gaze recordings acquired under suboptimal lighting conditions inside the bore of a magnetic resonance imaging (MRI) scanner for the same full-length movie. We found that the proposed algorithm performs on par with or better than state-of-the-art alternatives for static stimulation. Moreover, it yields eye-movement events with biologically plausible characteristics on prolonged dynamic recordings. Lastly, algorithm performance remains robust on data acquired under suboptimal conditions that exhibit a temporally varying noise level. These results indicate that the proposed algorithm is a robust tool with improved classification accuracy across a range of use cases. The algorithm is cross-platform compatible, implemented in the Python programming language, and readily available as free and open-source software from public sources.
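To illustrate the velocity-based principle the abstract refers to, the sketch below labels gaze samples with a fixed velocity threshold. This is a deliberately minimal, hypothetical example and not the REMoDNaV implementation itself, which additionally adapts its thresholds to the (possibly time-varying) noise level and distinguishes post-saccadic oscillations and smooth pursuit; the function name, threshold value, and input conventions (gaze coordinates already in degrees of visual angle) are assumptions for illustration only.

```python
import numpy as np

def classify_velocity(x, y, sampling_rate, saccade_threshold=300.0):
    """Label each gaze sample as 'SACC' or 'FIX' by thresholding
    sample-to-sample velocity (deg/s).

    Illustrative sketch only -- a fixed-threshold simplification of
    the velocity-based approach; not the REMoDNaV algorithm.
    Inputs x, y are assumed to be gaze coordinates in degrees.
    """
    # Euclidean sample-to-sample displacement, scaled to deg/s
    vel = np.hypot(np.diff(x), np.diff(y)) * sampling_rate
    # pad the first sample so output length matches input length
    vel = np.concatenate([[vel[0]], vel])
    return np.where(vel > saccade_threshold, 'SACC', 'FIX')
```

For example, a gaze trace that rests, jumps rapidly, and rests again yields a run of 'FIX' labels interrupted by 'SACC' labels during the jump. Real recordings would first require noise filtering and adaptive thresholding, which is precisely where the proposed algorithm departs from this fixed-threshold simplification.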
