Towards a Symbiotic Human-Machine Depth Sensor: Exploring 3D Gaze for Object Reconstruction

Eye tracking is expected to become an integral part of future augmented reality (AR) head-mounted displays (HMDs), as it can easily be integrated into existing hardware and provides a versatile interaction modality. To augment objects in the real world, AR HMDs require a three-dimensional understanding of the scene, which is currently obtained with depth cameras. In this work we explore how 3D gaze data can enhance scene understanding for AR HMDs, envisioning a symbiotic human-machine depth camera that fuses depth data with 3D gaze information. We present a first proof of concept, exploring to what extent we are able to recognise what a user is looking at by plotting 3D gaze data. To measure 3D gaze, we implemented a vergence-based algorithm and built an eye tracking setup consisting of a Pupil Labs headset and an OptiTrack motion capture system, allowing us to measure 3D gaze inside a 50×50×50 cm volume. We show first 3D gaze plots of "gazed-at" objects and describe our vision of a symbiotic human-machine depth camera that combines a depth camera and human 3D gaze information.
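The abstract does not spell out the vergence-based algorithm. A common formulation of vergence-based 3D gaze estimation, offered here only as an illustrative sketch and not as the authors' implementation, intersects the left and right eye gaze rays and takes the midpoint of their closest approach; all function names, coordinates, and parameters below are assumptions:

```python
import numpy as np

def vergence_gaze_point(o_l, d_l, o_r, d_r, eps=1e-9):
    """Estimate a 3D gaze point as the midpoint of closest approach
    between the left and right gaze rays.

    o_l, o_r : 3D eyeball centres (e.g. from motion capture).
    d_l, d_r : gaze direction vectors (e.g. from the eye tracker).
    Returns None if the rays are (near-)parallel, i.e. no usable vergence.
    """
    o_l, d_l = np.asarray(o_l, float), np.asarray(d_l, float)
    o_r, d_r = np.asarray(o_r, float), np.asarray(d_r, float)
    w0 = o_l - o_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w0, d_r @ w0
    denom = a * c - b * b          # approaches 0 for parallel rays
    if abs(denom) < eps:
        return None
    t_l = (b * e - c * d) / denom  # parameter along the left ray
    t_r = (a * e - b * d) / denom  # parameter along the right ray
    # Midpoint of the shortest segment connecting the two rays.
    return ((o_l + t_l * d_l) + (o_r + t_r * d_r)) / 2.0
```

For example, with hypothetical eye centres 6 cm apart and both gaze rays aimed at a point 50 cm ahead, the estimate recovers that fixation point; for parallel rays (fixation at infinity) the triangulation is undefined and the sketch returns None.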
