Quantitative Evaluation of a Low-Cost Noninvasive Hybrid Interface Based on EEG and Eye Movement

This paper describes a low-cost noninvasive brain-computer interface (BCI) hybridized with eye tracking and assesses its feasibility through a Fitts' law-based quantitative evaluation. Noninvasive BCI has recently received considerable attention, but bringing BCI applications into real life requires user-friendly and easily portable devices. In this work, as a step toward a real-world BCI, an electroencephalography (EEG)-based BCI combined with eye tracking is investigated; the two modalities are complementary and together can attain improved performance. To address public availability, low-cost devices are deliberately used: a low-cost commercial EEG recording device is integrated with an inexpensive custom-built eye tracker. The developed hybrid interface is evaluated through target pointing and selection experiments, in which eye movement drives the cursor and the noninvasive BCI confirms selection of the cursor position using two confirmation schemes. Using Fitts' law, the proposed interface is compared with other interface schemes such as a mouse, eye tracking with dwell time, and eye tracking with a keyboard. The proposed hybrid BCI system is also discussed with respect to its practicality as an interface scheme. Although further advancement is required, the proposed hybrid BCI system has the potential to be practically useful in a natural and intuitive manner.
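For context on the evaluation method, the equations below give the Shannon formulation of Fitts' law commonly used in HCI pointing studies to compare input devices; this is a minimal sketch of the standard formulation, and whether the authors used this exact variant (rather than, e.g., an effective-width correction) is an assumption, since the abstract does not specify it.

% Index of difficulty of acquiring a target of width W at distance D (Shannon formulation)
\[ ID = \log_2\!\left(\frac{D}{W} + 1\right) \quad [\text{bits}] \]
% Linear model of movement time; a and b are fitted empirically for each interface scheme
\[ MT = a + b \cdot ID \]
% Throughput, used as a device-independent performance measure across interfaces
\[ TP = \frac{ID}{MT} \quad [\text{bits/s}] \]

Under this kind of model, throughput computed from the pointing-and-selection trials provides a common scale on which the hybrid EEG/eye-tracking scheme can be compared against the mouse, dwell-time eye tracking, and eye tracking with a keyboard.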
