EyeFrame: Real-Time Memory Aid Improves Human Multitasking via Domain-General Eye Tracking Procedures

OBJECTIVE: We developed a highly general closed-loop system to improve human interaction with semi-autonomous agents, processes, and robots across a variety of multitasking scenarios. BACKGROUND: Much technology is converging toward semi-independent processes that receive intermittent human supervision distributed over multiple computerized agents. Human operators multitask notoriously poorly, in part due to cognitive load and limited working memory. To multitask optimally, users must remember task order, e.g., which task has been neglected longest, since a longer time spent not monitoring an element indicates a greater probability that it needs user input. This secondary task of monitoring attention history over multiple spatial tasks requires the same cognitive resources as the primary tasks themselves, and humans cannot reliably make more than ~2 decisions/s. METHODS: Participants managed 4-10 semi-autonomous agents performing rescue tasks. To optimize monitoring and control of multiple agents, we created an automated short-term memory aid that provided visual cues from users' gaze history. Cues indicated when and where to look next and were derived from the inverse of eye-fixation recency. RESULTS: Gaze-contingent algorithms drastically improved operator performance, increasing multitasking capacity. The gaze aid reduced attentional biases and reduced cognitive load, as measured by smaller pupil dilation. CONCLUSIONS: Our eye aid likely helped by delegating short-term memory to the computer and by reducing decision-making load. Past studies used eye position for gaze-aware control and interactive updating of displays in application-specific scenarios; ours is the first to successfully implement domain-general algorithms. The procedures should generalize well to process control, factory operations, robot control, surveillance, aviation, air traffic control, driving, military operations, mobile search and rescue, and many other tasks where the probability of utility is predicted by the duration since a task last received attention.
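
To make the cueing rule concrete, the sketch below shows one way an inverse-recency cue could be computed. This is a minimal illustration under stated assumptions, not the paper's implementation: the abstract specifies only that cues derive from the inverse of eye-fixation recency, so the class name, callback, and region-of-interest mapping here are all hypothetical.

```python
import time


class GazeRecencyAid:
    """Hypothetical sketch of an inverse-recency gaze cue.

    Tracks the last time the user's gaze landed on each task's
    screen region and cues the most neglected task, i.e. the one
    with the largest elapsed time since its last fixation.
    """

    def __init__(self, task_ids):
        now = time.monotonic()
        # Last fixation timestamp per task region, initialized to "now".
        self.last_fixation = {tid: now for tid in task_ids}

    def on_fixation(self, task_id):
        # Assumed to be called by the eye tracker whenever a fixation
        # falls inside a task's region of interest.
        self.last_fixation[task_id] = time.monotonic()

    def next_cue(self):
        # Cue priority is the inverse of recency: the task whose last
        # fixation is oldest (largest elapsed time) gets the visual cue,
        # indicating where the operator should look next.
        now = time.monotonic()
        return max(self.last_fixation,
                   key=lambda tid: now - self.last_fixation[tid])


# Example use (hypothetical): cue the least recently viewed agent.
aid = GazeRecencyAid(["agent1", "agent2", "agent3", "agent4"])
aid.on_fixation("agent2")
print(aid.next_cue())  # one of the agents not fixated since startup
```

Keying the cue to elapsed time since the last fixation offloads the "which task have I neglected longest?" bookkeeping from the operator's working memory to the computer, which matches the paper's proposed explanation for the performance gain.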
