Eye-tracking for human-centered mixed reality: promises and challenges

Eye-tracking hardware and software are rapidly being integrated into mixed reality (MR) technology. Research in cognitive science and human-computer interaction (HCI) demonstrates several ways eye-tracking can gauge user characteristics, intent, and status, as well as provide active and passive input control to MR interfaces. In this paper, we argue that eye-tracking can ground MR technology in the cognitive capacities and intentions of users, and that such human-centered MR is important for MR designers and engineers to consider. We detail relevant and timely research in eye-tracking and MR, with a focus on recent findings, and offer recommendations to accelerate the development of eye-tracking-enabled human-centered MR. We identify several promises that eye-tracking holds for improving MR experiences. In the near term, these include user authentication, gross interface interactions, monitoring of visual attention across real and virtual scene elements, and adaptive graphical rendering driven by relatively coarse eye-tracking metrics. In the far term, hardware and software advances will enable gaze-depth-aware foveated MR displays and attentive MR user interfaces that track user intent and status using fine and dynamic aspects of gaze. Challenges, such as current technological limitations, difficulties in translating lab-based eye-tracking metrics to MR, and heterogeneous MR use cases, are considered alongside cutting-edge research working to address them. With a focused research effort grounded in an understanding of these promises and challenges, human-centered MR can be realized, improving both the efficacy and the user experience of MR.
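To make two of these ideas concrete, the sketch below illustrates, in simplified form, how gaze data might drive the techniques named above: choosing a rendering level of detail from a scene element's angular eccentricity relative to the gaze ray (foveated rendering driven by coarse metrics), and triangulating fixation depth from binocular vergence (gaze-depth estimation). This is a minimal illustration, not the implementation of any particular system; the 5- and 20-degree eccentricity cutoffs, the shading rates, and the 63 mm interpupillary distance are assumed placeholder values.

```python
import math

def _norm(v):
    """Euclidean length of a 3-vector."""
    return math.sqrt(sum(c * c for c in v))

def eccentricity_deg(gaze_dir, target_dir):
    """Angle in degrees between the gaze ray and the ray toward a scene element."""
    cos_a = sum(g * t for g, t in zip(gaze_dir, target_dir)) / (
        _norm(gaze_dir) * _norm(target_dir))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

def shading_rate(ecc_deg):
    """Map eccentricity to a coarse rendering level of detail.
    The 5/20-degree cutoffs are illustrative placeholders only."""
    if ecc_deg < 5.0:
        return 1.0    # foveal: render at full resolution
    if ecc_deg < 20.0:
        return 0.5    # parafoveal: half resolution
    return 0.25       # peripheral: quarter resolution

def vergence_depth_m(vergence_deg, ipd_m=0.063):
    """Fixation depth implied by the angle between the two visual axes,
    assuming symmetric fixation on the midline (simple triangulation)."""
    half = math.radians(vergence_deg) / 2.0
    return float("inf") if half <= 0 else (ipd_m / 2.0) / math.tan(half)

if __name__ == "__main__":
    gaze = (0.0, 0.0, 1.0)  # normalized gaze direction in head coordinates
    for name, direction in [("virtual menu", (0.05, 0.0, 1.0)),
                            ("real-world object", (0.6, 0.2, 1.0))]:
        ecc = eccentricity_deg(gaze, direction)
        print(f"{name}: {ecc:5.1f} deg eccentric -> shading rate {shading_rate(ecc)}")
    print(f"2.0 deg vergence -> fixation depth ~{vergence_depth_m(2.0):.2f} m")
```

A deployed system would additionally have to cope with tracker accuracy and precision limits, latency, and calibration drift, all of which this sketch ignores.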
