Asynchronous gaze-independent event-related potential-based brain-computer interface

OBJECTIVE In this study, a gaze-independent event-related potential (ERP)-based brain-computer interface (BCI) for communication purposes was combined with an asynchronous classifier endowed with a dynamic stopping feature. The aim was to evaluate if and how the performance of such an asynchronous system could be negatively affected in terms of communication efficiency and robustness to false positives during intentional no-control states.

MATERIAL AND METHODS The proposed system was validated with the participation of nine healthy subjects. Asynchronous and synchronous classification outputs were compared while users controlled the same gaze-independent BCI interface. The performance of both classification techniques was assessed both off-line and on-line by means of the efficiency metric introduced by Bianchi et al. (2007); this metric allows different misclassification costs to be assigned to wrong classifications and abstentions. Robustness was evaluated as the rate of false positives occurring during voluntary no-control states.

RESULTS The asynchronous classifier did not exhibit a significantly higher accuracy or a lower error rate with respect to the synchronous classifier (accuracy: 74.66% versus 87.96%; error rate: 7.11% versus 12.04%, respectively). However, both the on-line and off-line analyses revealed that communication efficiency was significantly improved (p < .05) with the asynchronous classification modality as compared with the synchronous one. Furthermore, the asynchronous classifier proved to be robust to false positives during intentional no-control states occurring during ongoing visual stimulation (fewer than one false positive every 6 min).

CONCLUSION The proposed ERP-BCI system, which combines an asynchronous classifier with a gaze-independent interface, is therefore a promising solution to be further explored in order to increase the general usability of ERP-based BCI systems designed for severely disabled people with impaired voluntary control of eye movements. In fact, the asynchronous classifier can improve communication efficiency by automatically adapting the number of stimulus repetitions to the user's current state and by suspending control when he/she does not intend to select an item.
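
For illustration only (the abstract gives no implementation details), the Python sketch below shows how a dynamic-stopping asynchronous classifier of the kind described here might accumulate evidence across stimulus repetitions, select an item once a confidence threshold is exceeded, or abstain during a no-control state, together with a hypothetical cost-weighted score in the spirit of the efficiency metric of Bianchi et al. (2007). All names, thresholds, and cost values are assumptions, not values or formulas from the study.

```python
# Minimal sketch (assumptions, not the authors' implementation): a dynamic-stopping
# asynchronous ERP classifier and a hypothetical cost-weighted efficiency score.
from dataclasses import dataclass
from typing import Optional, Sequence


@dataclass
class Decision:
    item: Optional[int]    # index of the selected item, or None for an abstention
    repetitions_used: int  # stimulus repetitions consumed before stopping


def classify_asynchronously(repetition_scores: Sequence[Sequence[float]],
                            confidence_threshold: float = 0.9,
                            max_repetitions: int = 8) -> Decision:
    """Accumulate per-item classifier scores over successive stimulus repetitions.

    Dynamic stopping: stop as soon as the normalized evidence for the best item
    exceeds `confidence_threshold`; if the threshold is never reached within
    `max_repetitions`, abstain (interpreted as a no-control state).
    """
    n_items = len(repetition_scores[0])
    cumulative = [0.0] * n_items
    for rep, scores in enumerate(repetition_scores[:max_repetitions], start=1):
        cumulative = [c + s for c, s in zip(cumulative, scores)]
        best = max(range(n_items), key=lambda i: cumulative[i])
        total = sum(abs(c) for c in cumulative) or 1.0
        if cumulative[best] / total >= confidence_threshold:
            return Decision(item=best, repetitions_used=rep)
    return Decision(item=None, repetitions_used=max_repetitions)


def cost_weighted_efficiency(n_correct: int, n_wrong: int, n_abstentions: int,
                             wrong_cost: float = 2.0,
                             abstention_cost: float = 1.0) -> float:
    """Illustrative score assigning different costs to wrong selections and
    abstentions, the key property of the efficiency metric cited in the text
    (the exact formulation is given in Bianchi et al., 2007)."""
    effort = n_correct + wrong_cost * n_wrong + abstention_cost * n_abstentions
    return n_correct / effort if effort else 0.0
```

As a usage note for the illustrative score: with a wrong selection weighted twice as heavily as an abstention, five correct selections, one error, and two abstentions would give 5 / (5 + 2·1 + 1·2) ≈ 0.56.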

[1]  D. Mattia,et al.  Evaluation of the performances of different P300 based brain–computer interfaces by means of the efficiency metric , 2012, Journal of Neuroscience Methods.

[2]  G. Cardarilli,et al.  Performances Evaluation and Optimization of Brain Computer Interface Systems in a Copy Spelling Task , 2007, IEEE Transactions on Neural Systems and Rehabilitation Engineering.

[3]  F. Babiloni,et al.  A covert attention P300-based brain–computer interface: Geospell , 2012, Ergonomics.

[4]  John J. Foxe,et al.  Visual spatial attention tracking using high-density SSVEP data for independent brain-computer communication , 2005, IEEE Transactions on Neural Systems and Rehabilitation Engineering.

[5]  S. Hart,et al.  Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research , 1988 .

[6]  Michael Tangermann,et al.  Listen, You are Writing! Speeding up Online Spelling with a Dynamic Auditory BCI , 2011, Front. Neurosci..

[7]  Shangkai Gao,et al.  An N200 speller integrating the spatial profile for the detection of the non-control state , 2012, Journal of neural engineering.

[8]  G.E. Birch,et al.  Brain interface research for asynchronous control applications , 2006, IEEE Transactions on Neural Systems and Rehabilitation Engineering.

[9]  M. Nuttin,et al.  A brain-actuated wheelchair: Asynchronous and non-invasive Brain–computer interfaces for continuous control of robots , 2008, Clinical Neurophysiology.

[10]  Cuntai Guan,et al.  Asynchronous P300-Based Brain--Computer Interfaces: A Computational Approach With Statistical Models , 2008, IEEE Transactions on Biomedical Engineering.

[11]  Febo Cincotti,et al.  Out of the frying pan into the fire--the P300-based BCI faces real-world challenges , 2011, Progress in brain research.

[12]  Xingyu Wang,et al.  An adaptive P300-based control system , 2011, Journal of neural engineering.

[13]  L. R. Quitadamo,et al.  Which Physiological Components are More Suitable for Visual ERP Based Brain–Computer Interface? A Preliminary MEG/EEG Study , 2010, Brain Topography.

[14]  F Babiloni,et al.  P300-based brain–computer interface for environmental control: an asynchronous approach , 2011, Journal of neural engineering.

[15]  C. Rorden,et al.  Covert orienting of attention and overt eye movements activate identical brain regions , 2008, Brain Research.

[16]  N. Birbaumer,et al.  BCI2000: a general-purpose brain-computer interface (BCI) system , 2004, IEEE Transactions on Biomedical Engineering.

[17]  J. Wolpaw,et al.  Brain-Computer Interfaces: Principles and Practice , 2012 .

[18]  M S Treder,et al.  Gaze-independent brain–computer interfaces based on covert attention and feature attention , 2011, Journal of neural engineering.

[19]  Dean J Krusienski,et al.  A comparison of classification techniques for the P300 Speller , 2006, Journal of neural engineering.

[20]  J. Mourino,et al.  Asynchronous BCI and local neural classifiers: an overview of the adaptive brain interface project , 2003, IEEE Transactions on Neural Systems and Rehabilitation Engineering.

[21]  Pablo F. Diez,et al.  Asynchronous BCI control using high-frequency SSVEP , 2011, Journal of NeuroEngineering and Rehabilitation.

[22]  Benjamin Blankertz,et al.  A novel brain-computer interface based on the rapid serial visual presentation paradigm , 2010, 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology.

[23]  J. Wolpaw,et al.  Towards an independent brain–computer interface using steady state visual evoked potentials , 2008, Clinical Neurophysiology.

[24]  Ying Sun,et al.  Asynchronous P300 BCI: SSVEP-based control state detection , 2010, 2010 18th European Signal Processing Conference.

[25]  N. Birbaumer,et al.  An auditory oddball (P300) spelling system for brain-computer interfaces , 2009, Psychophysiology.

[26]  Gernot R. Müller-Putz,et al.  Self-Paced (Asynchronous) BCI Control of a Wheelchair in Virtual Environments: A Case Study with a Tetraplegic , 2007, Comput. Intell. Neurosci..

[27]  A. Lenhardt,et al.  An Adaptive P300-Based Online Brain–Computer Interface , 2008, IEEE Transactions on Neural Systems and Rehabilitation Engineering.

[28]  Donatella Mattia,et al.  A Brain-Computer Interface as Input Channel for a Standard Assistive Technology Software , 2011, Clinical EEG and neuroscience.

[29]  Clemens Brunner,et al.  An adaptive P300-based control system , 2011 .

[30]  B. Blankertz,et al.  (C)overt attention and visual speller design in an ERP-based brain-computer interface , 2010, Behavioral and Brain Functions.

[31]  Febo Cincotti,et al.  Control or no-control? reducing the gap between Brain-Computer Interface and classical input devices , 2012, 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society.

[32]  John Paulin Hansen,et al.  Evaluation of a low-cost open-source gaze tracker , 2010, ETRA.

[33]  G. Pfurtscheller,et al.  Brain-Computer Interfaces for Communication and Control , 2011, Communications of the ACM.

[34]  F. Cincotti,et al.  Eye-gaze independent EEG-based brain–computer interfaces for communication , 2012, Journal of neural engineering.

[35]  Febo Cincotti,et al.  Asynchronous P300-Based Brain-Computer Interface to Control a Virtual Environment: Initial Tests on End Users , 2011, Clinical EEG and neuroscience.

[36]  J.P. Donoghue,et al.  BCI meeting 2005-workshop on clinical issues and applications , 2006, IEEE Transactions on Neural Systems and Rehabilitation Engineering.

[37]  F Babiloni,et al.  A comparison of classification techniques for a gaze-independent P300-based brain-computer interface. , 2012, Journal of neural engineering.