Do you see what I see? Mobile eye-tracker contextual analysis and inter-rater reliability

Mobile eye-trackers are currently used during real-world tasks (e.g. gait) to monitor visual and cognitive processes, particularly in ageing and Parkinson’s disease (PD). However, contextual analysis of fixation locations during such tasks is rarely performed because of its complexity. This study adapted a validated algorithm and developed a classification method to semi-automate contextual analysis of mobile eye-tracking data, and further assessed the inter-rater reliability of the proposed classification method. A mobile eye-tracker recorded eye movements during walking in five healthy older adult controls (HC) and five people with PD. Fixations were identified using a previously validated algorithm, which was adapted to provide still images of fixation locations (n = 116). Two raters (DH, JN) independently identified and classified each fixation location, and Cohen’s kappa coefficients quantified the inter-rater reliability. The algorithm successfully provided a still image for each fixation, allowing manual contextual analysis to be performed. Inter-rater reliability for classifying fixation location was high for both the PD (kappa = 0.80, 95% agreement) and HC (kappa = 0.80, 91% agreement) groups, indicating a reliable classification method. This study developed a reliable semi-automated contextual analysis method for gait studies in HC and PD; future studies could adapt this methodology for other gait-related eye-tracking research.
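To illustrate the reliability statistic used above: Cohen’s kappa corrects the raw percentage agreement between two raters for the agreement expected by chance from each rater’s label frequencies. The sketch below is a minimal, hypothetical implementation (the rater names and category labels are illustrative, not from the study’s data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical labels on the same items.

    kappa = (p_observed - p_expected) / (1 - p_expected), where p_expected
    is the chance agreement implied by each rater's marginal frequencies.
    """
    assert len(rater_a) == len(rater_b) and rater_a, "need paired labels"
    n = len(rater_a)
    # Observed proportion of items on which the two raters agree.
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of per-category marginal proportions, summed.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_exp = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical fixation-location classifications from two raters.
a = ["floor", "door", "floor", "wall", "door", "floor"]
b = ["floor", "door", "wall", "wall", "door", "floor"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```

Values of kappa above 0.80, as reported for both groups in this study, are conventionally read as strong agreement; unlike raw percentage agreement, kappa does not reward raters for simply favouring the same common category.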
