Exploring the Effect of Visual Cues on Eye Gaze During AR-Guided Picking and Assembly Tasks

In this paper, we present an analysis of eye gaze patterns elicited by visual cues in augmented reality (AR) on head-mounted displays (HMDs). We conducted an experimental study in which a picking and assembly task was guided by different visual cues. We compare these cues along multiple dimensions (in-view vs. out-of-view, static vs. dynamic, sequential vs. simultaneous) and analyze quantitative metrics such as gaze distribution, gaze duration, and gaze path distance. Our results indicate that visual cues in AR significantly affect eye gaze patterns, and that the effect varies with the type of cue. We discuss these empirical results with respect to visual attention theory.
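To make the three metrics concrete, the following minimal Python sketch computes them from a fixation log. This is not the paper's analysis pipeline; the record format (start time, end time, gaze position, area-of-interest label) and the example values are assumptions for illustration only.

import math
from collections import defaultdict

# Hypothetical fixation log: (start_s, end_s, x_px, y_px, aoi_label).
# Real eye trackers export richer records; this is a minimal stand-in.
fixations = [
    (0.00, 0.35, 120, 340, "cue"),
    (0.40, 0.90, 410, 360, "bin_A"),
    (1.00, 1.25, 430, 350, "bin_A"),
    (1.30, 1.80, 900, 120, "assembly"),
]

# Gaze duration: total fixation time spent on each area of interest (AOI).
duration = defaultdict(float)
for start, end, _, _, aoi in fixations:
    duration[aoi] += end - start

# Gaze distribution: each AOI's share of total fixation time.
total = sum(duration.values())
distribution = {aoi: d / total for aoi, d in duration.items()}

# Gaze path distance: summed Euclidean distance between consecutive
# fixation centers (here in screen pixels).
path_distance = sum(
    math.dist((x1, y1), (x2, y2))
    for (_, _, x1, y1, _), (_, _, x2, y2, _) in zip(fixations, fixations[1:])
)

print(distribution, round(path_distance, 1))

Under these assumptions, a cue that pulls gaze directly to the next bin would show up as a higher distribution share on task-relevant AOIs and a shorter gaze path distance.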
