EyeMirror: mobile calibration-free gaze approximation using corneal imaging

Gaze is a powerful indicator of a person's visual attention, revealing where we are looking within our current field of view (FOV). Hence, gaze-based interfaces are gaining in importance. However, gaze estimation usually requires extensive hardware and depends on a calibration that has to be renewed regularly. We present EyeMirror, a mobile device for calibration-free gaze approximation on surfaces (e.g., displays). It consists of a head-mounted camera, connected to a wearable mini-computer, that captures the environment reflected on the human cornea. The corneal images are analyzed using natural feature tracking to estimate gaze on surfaces. In two lab studies we compared variations of EyeMirror against established methods for gaze estimation in a display scenario and investigated the effect of display content (i.e., the number of features). EyeMirror achieved a gaze estimation error of 4.03°, and we found no significant effect of display content.
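To illustrate the idea of mapping a corneal reflection onto display coordinates via natural feature tracking, the sketch below uses standard OpenCV building blocks (ORB features, ratio-test matching, and a RANSAC homography). It is a minimal illustration under stated assumptions, not the authors' implementation: the function name, parameters, and the way the gaze point in the corneal image is obtained are placeholders, and the planar homography is a simplification of the curved corneal surface.

```python
# Minimal sketch (assumption, not the EyeMirror implementation): map a point
# in a corneal image to display coordinates via natural feature matching.
import cv2
import numpy as np

def estimate_gaze_on_display(corneal_img, display_img, gaze_point_corneal):
    """Map a gaze point from the corneal image into display coordinates.

    corneal_img        -- grayscale crop of the corneal reflection
    display_img        -- grayscale screenshot of the content shown on the display
    gaze_point_corneal -- (x, y) in the corneal image, assumed to lie on the
                          reflected display (e.g. derived from the pupil centre)
    """
    # Detect and describe natural features in both images (ORB is used here
    # for illustration; the detector/descriptor in the paper may differ).
    orb = cv2.ORB_create(nfeatures=2000)
    kp_c, des_c = orb.detectAndCompute(corneal_img, None)
    kp_d, des_d = orb.detectAndCompute(display_img, None)
    if des_c is None or des_d is None:
        return None  # not enough texture to extract features

    # Match descriptors and keep the better correspondences (Lowe ratio test).
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(des_c, des_d, k=2)
    good = [m for m, n in (p for p in matches if len(p) == 2)
            if m.distance < 0.75 * n.distance]
    if len(good) < 4:
        return None  # too few correspondences for a homography

    src = np.float32([kp_c[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_d[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

    # Homography from corneal-image coordinates to display coordinates,
    # estimated robustly with RANSAC.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    pt = np.float32([[gaze_point_corneal]])
    gaze_on_display = cv2.perspectiveTransform(pt, H)[0][0]
    return tuple(gaze_on_display)  # (x, y) in display pixels
```

The planar mapping keeps the example short; handling corneal curvature, eye pose, and head movement would require the more elaborate geometric model that the full system uses.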
