Will You Take This Turn? Gaze-Based Turning Activity Recognition During Navigation

Decision making is an integral part of wayfinding, and people increasingly rely on navigation systems to facilitate this task. The primary decision, which is also the main source of navigation error, concerns the turning activity, i.e., whether to turn left, turn right, or continue straight ahead. Before applying any preventive approaches (e.g., providing more information) or compensatory solutions (e.g., pre-calculating alternative routes), the fundamental step in dealing with this error is to predict and recognize the upcoming turning activity. This paper addresses this step by predicting the turning decision of pedestrian wayfinders, before the actual action takes place, using primarily gaze-based features. Applying machine learning methods to the data of the presented experiment, we achieve an overall accuracy of 91% within three seconds before arrival at a decision point. Beyond the application perspective, our findings also shed light on the cognitive processes of decision making as reflected in the wayfinder's gaze behaviour: incorporating environmental and user-related factors into the model results in a noticeable change in the importance of visual search features for turn activity recognition.

2012 ACM Subject Classification: Computing methodologies → Activity recognition and understanding; Computing methodologies → Supervised learning by classification
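To make the classification setup concrete, the following is a minimal sketch of the kind of pipeline the abstract describes: gaze features aggregated over the window before a decision point, a supervised classifier, and a feature-importance readout. The feature names, the synthetic stand-in data, and the choice of a random forest are illustrative assumptions, not the paper's actual implementation.

    # Minimal sketch (assumed setup, not the paper's implementation):
    # classify the upcoming turn direction from gaze features using scikit-learn.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(42)

    # Hypothetical gaze features aggregated over the three seconds before
    # arriving at a decision point (names are illustrative).
    feature_names = [
        "mean_fixation_duration", "fixation_count",
        "mean_saccade_amplitude", "mean_saccade_velocity",
        "mean_pupil_diameter",
    ]
    X = rng.normal(size=(600, len(feature_names)))  # stand-in feature matrix
    y = rng.integers(0, 3, size=600)                # 0 = left, 1 = straight, 2 = right

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, stratify=y, random_state=0
    )

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)
    print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))

    # Per-feature importances indicate which gaze behaviours drive the
    # prediction, analogous to the abstract's observation that adding
    # environmental and user-related factors shifts the importance of
    # visual search features.
    for name, importance in sorted(
        zip(feature_names, clf.feature_importances_), key=lambda t: -t[1]
    ):
        print(f"{name}: {importance:.3f}")

On real data, the stand-in matrix X would be replaced by per-trial feature vectors computed from eye-tracking recordings, with y labelled from the turn the participant actually took.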
