Towards capturing focal/ambient attention during dynamic wayfinding

This work-in-progress paper reports on an ongoing experiment that uses mobile eye-tracking to evaluate different wayfinding support systems. Specifically, it addresses the problem of detecting and isolating the attentional demands that building layouts and signage systems place on users during wayfinding tasks. The coefficient K has previously been established as a measure of focal/ambient attention in eye-tracking data. Here, we propose a novel method for computing coefficient K from eye-tracking data collected in virtual reality experiments. We detail the challenges of transferring a concept defined for two-dimensional data to three-dimensional data, and discuss whether the concept remains theoretically equivalent after such a transformation. We present a preliminary application of the method to experimental data and explore its potential to yield novel insights in architectural analyses.
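For readers unfamiliar with the measure, the sketch below illustrates one way coefficient K is commonly computed in its original two-dimensional formulation: for each fixation, the z-standardised fixation duration minus the z-standardised amplitude of the following saccade, averaged over the recording, with positive values indicating focal and negative values indicating ambient viewing. This is a minimal, hedged illustration rather than the implementation used in the experiment; in particular, the `angular_amplitude` helper, which derives an amplitude from 3D gaze direction vectors, is only one hypothetical way to obtain a 3D-compatible saccade amplitude.

```python
import numpy as np

def coefficient_k(fixation_durations, saccade_amplitudes):
    """Compute a mean coefficient K from paired fixation durations and the
    amplitudes of the saccades that follow them.

    fixation_durations : 1-D array of fixation durations (e.g. in ms)
    saccade_amplitudes : 1-D array of the subsequent saccade amplitudes
                         (e.g. in degrees of visual angle)
    Returns the mean K; K > 0 suggests focal, K < 0 ambient viewing.
    """
    d = np.asarray(fixation_durations, dtype=float)
    a = np.asarray(saccade_amplitudes, dtype=float)
    assert d.shape == a.shape, "each fixation must be paired with its following saccade"
    # z-standardise durations and amplitudes over the whole recording
    zd = (d - d.mean()) / d.std(ddof=1)
    za = (a - a.mean()) / a.std(ddof=1)
    # K_i = z(duration_i) - z(amplitude_{i+1}); report the mean over fixations
    return float(np.mean(zd - za))

def angular_amplitude(v1, v2):
    """Angle (in degrees) between two 3D gaze direction vectors -- a possible
    substitute for on-screen saccade amplitude in VR data (an assumption of
    this sketch, not part of the original 2D method)."""
    v1 = np.asarray(v1, dtype=float)
    v2 = np.asarray(v2, dtype=float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))
```

A usage example under these assumptions: compute `angular_amplitude` between the mean gaze directions of consecutive fixations to obtain the amplitude series, then pass it together with the fixation durations to `coefficient_k`.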
