InvisibleEye: Mobile Eye Tracking Using Multiple Low-Resolution Cameras and Learning-Based Gaze Estimation

Analysis of everyday human gaze behaviour has significant potential for ubiquitous computing, as evidenced by a large body of work in gaze-based human-computer interaction, attentive user interfaces, and eye-based user modelling. However, current mobile eye trackers are still obtrusive, which not only makes them uncomfortable to wear and socially unacceptable in daily life but also prevents them from being widely adopted in the social and behavioural sciences. To address these challenges, we present InvisibleEye, a novel approach for mobile eye tracking that uses millimetre-size RGB cameras that can be fully embedded into normal glasses frames. To compensate for the cameras’ low image resolution of only a few pixels, our approach uses multiple cameras to capture different views of the eye, together with learning-based gaze estimation that directly regresses from eye images to gaze directions. We implement a prototype of our system and characterise its performance on three large-scale, increasingly realistic, and thus challenging datasets: 1) eye images synthesised using a recent computer graphics eye region model, 2) real eye images recorded from 17 participants under controlled lighting, and 3) eye images recorded from four participants over the course of four recording sessions in a mobile setting. We show that InvisibleEye achieves a top person-specific gaze estimation accuracy of 1.79° using four cameras with a resolution of only 5 × 5 pixels. Our evaluations not only demonstrate the feasibility of this novel approach but, more importantly, underline its significant potential for finally realising the vision of invisible mobile eye tracking and pervasive attentive user interfaces.
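The core idea, regressing gaze directly from a handful of low-resolution eye images, can be illustrated with a small neural network. The sketch below is a minimal illustration under stated assumptions, not the paper's actual architecture: the layer sizes, optimiser, loss, and training data are placeholder assumptions; only the input dimensionality (four cameras at 5 × 5 pixels) and the 2D gaze-angle output follow the description above.

```python
# Minimal sketch of learning-based gaze regression from multiple
# low-resolution eye cameras. Hypothetical architecture: layer sizes,
# optimiser, and training loop are illustrative assumptions, not the
# configuration reported in the paper.
import numpy as np
import tensorflow as tf

NUM_CAMERAS = 4  # four eye cameras, as in the reported best result
IMG_SIZE = 5     # 5 x 5 pixel images per camera

# Each camera view is flattened and all views are concatenated into
# a single input vector of 4 * 5 * 5 = 100 values.
inputs = tf.keras.Input(shape=(NUM_CAMERAS * IMG_SIZE * IMG_SIZE,))
x = tf.keras.layers.Dense(128, activation="relu")(inputs)
x = tf.keras.layers.Dense(64, activation="relu")(x)
# Output: a 2D gaze direction (e.g. yaw and pitch angles).
outputs = tf.keras.layers.Dense(2)(x)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")

# Dummy data standing in for recorded eye images and ground-truth
# gaze angles; a real person-specific model would be trained on
# calibration data from the target wearer.
X = np.random.rand(1000, NUM_CAMERAS * IMG_SIZE * IMG_SIZE).astype("float32")
y = np.random.rand(1000, 2).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```

Concatenating all camera views into one input vector is one simple way to let the regressor exploit the complementary views of the eye; because each view is only 25 pixels, even a small fully connected model keeps the parameter count modest.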
