Human Strategies for Multitasking, Search, and Control Improved via Real-Time Memory Aid for Gaze Location

OBJECTIVE: We aimed to elucidate how our domain-general cuing algorithm improved multitasking performance and changed the behavioral strategies of human operators. BACKGROUND: Although many gaze-control systems have been designed, previous real-time gaze-aware assistance systems have not been both successful and domain general. It is largely unknown what constitutes optimal search efficiency with the eyes or ideal control with the mouse, and it is unclear how these two modalities are best coordinated. Our previously developed closed-loop multitasking aid drastically improved multitasking performance, but the behavioral mechanisms through which it acted were unknown. METHODS: We performed in-depth analyses and generated novel eye-tracking and mouse-movement measures to explore the complex effects of the assistance system on gaze and motor behavior. RESULTS: Our overlay cuing algorithm improved control efficiency and reduced well-known biases in search patterns. The system also reduced micromanaging behavior, with operators rationally relying more on imperfect automation in the assistance-cue conditions. The mouse and gaze were more independently specialized in the helpful cuing condition than in control conditions: with the aid, gaze performed more global movement while the mouse performed more local, clustered movement. Gaze behavior also shifted toward search and away from processing under the cuing system. We further illustrated a relationship between mouse and gaze such that, in these studies, "the hand was quicker than the eye." CONCLUSIONS: Overall, the results suggest that the cuing system improved performance and reduced operators' short-term working memory load by delegating that load to the computer in real time, cutting the number of required repeated decisions by an estimated one per second. It also enabled the gaze to specialize for improved visual search and the mouse to specialize for improved control.
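
The cuing aid itself is described in detail in the original report; as a purely illustrative reading aid, the minimal sketch below shows, under assumed region names and a simple staleness rule, the general kind of closed-loop gaze-memory delegation the abstract refers to: the computer tracks which task regions the operator has recently fixated and cues the most neglected one, so the operator need not hold that bookkeeping in working memory. The class, region layout, and timing logic are hypothetical and are not taken from the authors' system.

```python
# Illustrative sketch (not the authors' implementation) of a closed-loop
# gaze-memory cuing aid: the machine remembers which task regions were
# last inspected and highlights the one neglected longest.
import time
from dataclasses import dataclass, field


@dataclass
class GazeMemoryAid:
    regions: dict                                  # region name -> (x, y, w, h) screen rectangle
    last_visit: dict = field(default_factory=dict)  # region name -> time of last fixation

    def update(self, fixation_x: float, fixation_y: float) -> str:
        """Record which region the current fixation falls in, then return the
        name of the region that has gone longest without a visit (the cue)."""
        now = time.monotonic()
        for name, (x, y, w, h) in self.regions.items():
            if x <= fixation_x <= x + w and y <= fixation_y <= y + h:
                self.last_visit[name] = now  # the computer, not the operator, remembers this
        # Cue the stalest region; never-visited regions are treated as infinitely stale.
        return min(self.regions, key=lambda r: self.last_visit.get(r, float("-inf")))


# Usage sketch: feed each new fixation from the eye tracker and overlay a cue on the result.
aid = GazeMemoryAid(regions={"radar": (0, 0, 400, 300), "comms": (400, 0, 400, 300)})
cue = aid.update(fixation_x=120.0, fixation_y=80.0)  # fixation lands in "radar"; cue -> "comms"
```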
