Locating user attention using eye tracking and EEG for spatio-temporal event selection

In expert video analysis, selecting certain events in a continuous video stream is a frequently occurring operation, e.g., in surveillance applications. Due to the dynamic and rich visual input, the constantly required high attention, and the hand-eye coordination needed for mouse interaction, this is a very demanding and exhausting task. Hence, relevant events might be missed. We propose to use eye tracking and electroencephalography (EEG) as additional input modalities for event selection. From eye tracking, we derive the spatial location of a perceived event, and from patterns in the EEG signal we derive its temporal location within the video stream. This reduces the amount of active user input required in the selection process and thus has the potential to reduce the user's workload. In this paper, we describe the methods employed for the localization processes and introduce the scenario developed to investigate the feasibility of this approach. Finally, we present and discuss results on the accuracy and speed of the method and investigate how the modalities interact.
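The fusion described above can be sketched in a few lines: the EEG stream supplies *when* an event was perceived (via a detected event-related potential), and the gaze stream supplies *where* the user was looking at that moment. The sketch below is a minimal illustration, not the paper's implementation; the function name, the data layout, and the fixed 300 ms ERP latency offset are all assumptions made for the example.

```python
from bisect import bisect_left

def select_event(gaze_samples, eeg_event_time, erp_latency=0.3):
    """Fuse modalities: EEG gives the temporal location, gaze the spatial one.

    gaze_samples:   list of (timestamp_s, x, y) tuples, sorted by timestamp.
    eeg_event_time: time at which an event-related pattern was detected in
                    the EEG signal.
    erp_latency:    assumed delay between the perceived event and the EEG
                    response (hypothetical fixed 300 ms for this sketch).

    Returns the (x, y) gaze position closest to the inferred event time.
    """
    # Shift the EEG detection back by the assumed response latency to
    # estimate when the event actually appeared on screen.
    stimulus_time = eeg_event_time - erp_latency
    times = [t for t, _, _ in gaze_samples]
    i = bisect_left(times, stimulus_time)
    # Compare the two neighboring samples and keep the nearer one in time.
    candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
    j = min(candidates, key=lambda k: abs(times[k] - stimulus_time))
    return gaze_samples[j][1:]

# Usage: gaze jumps to a region at t = 0.2 s; an EEG pattern is detected
# at t = 0.5 s, so the inferred event location is the fixation at 0.2 s.
gaze = [(0.0, 10, 10), (0.1, 12, 11), (0.2, 400, 300), (0.3, 402, 298)]
print(select_event(gaze, 0.5))
```

A real system would replace the fixed latency with a classifier-derived estimate and cluster gaze samples into fixations before the lookup, but the principle of anchoring a spatial gaze estimate to an EEG-derived timestamp is the same.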
