GazeProjector: Accurate Gaze Estimation and Seamless Gaze Interaction Across Multiple Displays

Mobile gaze-based interaction with multiple displays may occur from arbitrary positions and orientations. However, maintaining high gaze estimation accuracy in such situations remains a significant challenge. In this paper, we present GazeProjector, a system that combines (1) natural feature tracking on displays to determine the mobile eye tracker's position relative to a display with (2) accurate point-of-gaze estimation. GazeProjector allows for seamless gaze estimation and interaction on multiple displays of arbitrary sizes, independently of the user's position and orientation relative to the display. In a user study with 12 participants we compare GazeProjector to established methods (here: visual on-screen markers and a state-of-the-art video-based motion capture system). We show that our approach is robust to varying head poses, orientations, and distances to the display, while still providing high gaze estimation accuracy across multiple displays without re-calibration for each variation. Our system represents an important step towards the vision of pervasive gaze-based interfaces.
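The core idea described in the abstract lends itself to a homography-based gaze mapping: natural features are matched between the eye tracker's scene camera view and the known display content, and the resulting transform projects the scene-camera gaze estimate into display coordinates. The Python/OpenCV sketch below is only an illustration under those assumptions; the function name, the choice of ORB features, and all parameters are hypothetical and not taken from the paper, whose actual pipeline may differ in detail.

```python
import cv2
import numpy as np

def project_gaze_to_display(scene_frame, display_image, gaze_px):
    """Hypothetical sketch: map a gaze point from the eye tracker's
    scene camera into display coordinates via natural-feature matching.

    scene_frame   -- grayscale frame from the head-mounted scene camera
    display_image -- grayscale screenshot of the current display content
    gaze_px       -- (x, y) gaze estimate in scene-camera pixels
    """
    # Detect and describe local features in both images.
    orb = cv2.ORB_create(nfeatures=2000)
    kp_s, des_s = orb.detectAndCompute(scene_frame, None)
    kp_d, des_d = orb.detectAndCompute(display_image, None)
    if des_s is None or des_d is None:
        return None

    # Match binary descriptors between scene camera and display content.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_s, des_d)
    if len(matches) < 8:
        return None  # too few correspondences for a stable estimate

    src = np.float32([kp_s[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_d[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Robustly estimate the scene-camera-to-display homography.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    # Project the gaze point through the homography into display space.
    pt = np.float32([[gaze_px]])          # shape (1, 1, 2)
    mapped = cv2.perspectiveTransform(pt, H)
    return tuple(mapped[0, 0])            # (x, y) in display-image pixels
```

Because the homography is re-estimated per frame, such a mapping would hold regardless of the user's position and orientation, which is consistent with the paper's claim that no re-calibration is needed as the head pose or distance to the display changes.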
