Sensing cognitive multitasking for a brain-based adaptive user interface

Multitasking has become an integral part of work environments, even though people are not well-equipped cognitively to handle numerous concurrent tasks effectively. Systems that support such multitasking may produce better performance and less frustration. However, without understanding the user's internal processes, it is difficult to determine optimal strategies for adapting interfaces, since not all multitasking activity is identical. We describe two experiments leading toward a system that detects cognitive multitasking processes and uses this information as input to an adaptive interface. Using functional near-infrared spectroscopy sensors, we differentiate four cognitive multitasking processes. These states cannot readily be distinguished using behavioral measures such as response time, accuracy, keystrokes, or screen contents. We then present our human-robot system as a proof of concept that uses real-time cognitive state information as input and adapts in response. This prototype system serves as a platform to study interfaces that enable better task switching, interruption management, and multitasking.
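
To make the closed loop concrete, the sketch below illustrates the general pipeline the abstract describes: take a window of fNIRS-derived features, classify it into one of four cognitive multitasking states, and feed the result to an adaptation policy. This is a minimal, hypothetical illustration only; all names here (read_fnirs_window, MultitaskingClassifier, adapt_interface), the feature layout, the state labels, and the logistic-regression classifier are assumptions for the sake of example, not the authors' actual implementation.

```python
# Hypothetical sketch of a sense -> classify -> adapt loop for fNIRS-based
# multitasking detection. All names and parameters are illustrative placeholders.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Placeholder labels for the four cognitive multitasking processes.
STATES = ["state_a", "state_b", "state_c", "state_d"]
N_FEATURES = 16  # e.g., mean and slope of oxygenated hemoglobin per channel (assumed)


def read_fnirs_window():
    """Placeholder for one preprocessed window of fNIRS features."""
    return np.random.randn(N_FEATURES)


def adapt_interface(state):
    """Placeholder adaptation policy, e.g., deferring robot interruptions."""
    print(f"Detected {state}; adjusting interruption timing / robot autonomy.")


class MultitaskingClassifier:
    """Simple classifier trained offline on labeled fNIRS feature windows."""

    def __init__(self):
        self.model = LogisticRegression(max_iter=1000)

    def fit(self, windows, labels):
        self.model.fit(windows, labels)

    def predict_state(self, window):
        idx = int(self.model.predict(window.reshape(1, -1))[0])
        return STATES[idx]


if __name__ == "__main__":
    # Train on synthetic data standing in for a per-user calibration session,
    # then run a few cycles of the real-time classify-and-adapt loop.
    rng = np.random.default_rng(0)
    train_x = rng.normal(size=(200, N_FEATURES))
    train_y = rng.integers(0, len(STATES), size=200)

    clf = MultitaskingClassifier()
    clf.fit(train_x, train_y)

    for _ in range(5):
        adapt_interface(clf.predict_state(read_fnirs_window()))
```

In a real system of this kind, the classifier would be calibrated per user on labeled task data and the adaptation policy would do more than print, for example delaying interruptions or shifting robot autonomy while a demanding state is detected.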
