Rapid P300 brain-computer interface communication with a head-mounted display

Visual ERP (P300)-based brain-computer interfaces (BCIs) allow for fast and reliable spelling and are intended as a muscle-independent communication channel for people with severe paralysis. However, they require visual stimuli to be presented in the user's field of view. A head-mounted display could allow convenient presentation of visual stimuli in situations where mounting a conventional monitor is difficult or not feasible (e.g., at a patient's bedside). To explore whether a virtual reality (VR) headset can achieve accuracies similar to those of a conventional flat-screen monitor, we conducted an experiment with 18 healthy participants. We also evaluated the headset with a person in the locked-in state (LIS) to verify that a severely paralyzed person can use it. Healthy participants performed online spelling with three different display methods. In one condition, a 5 × 5 letter matrix was presented on a conventional 22-inch TFT monitor. Two configurations of the VR headset were tested: in the first (glasses A), the same 5 × 5 matrix filled the user's field of view; in the second (glasses B), single letters of the matrix filled the user's field of view. The participant in the LIS tested the VR headset on three occasions (glasses A condition only). For healthy participants, average online spelling accuracies with three flash sequences were 94% (15.5 bits/min) for both the monitor and glasses A, and 96% (16.2 bits/min) for glasses B. In one session, the participant in the LIS reached an online spelling accuracy of 100% (10 bits/min) with the glasses A condition. We also demonstrated that spelling with a single flash sequence is possible with the VR headset for healthy users (mean: 32.1 bits/min; maximum reached by one user: 71.89 bits/min at 100% accuracy). We conclude that the VR headset allows rapid P300 BCI communication in healthy users and may be a suitable display option for severely paralyzed persons.

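The bits/min values above are information transfer rates (ITR). For P300 spellers these are conventionally computed with the Wolpaw formula, which combines the number of selectable symbols, the selection accuracy, and the time needed per selection. The sketch below (Python) illustrates that calculation for a 5 × 5 matrix; the selection time in the example is a hypothetical value chosen only to make the arithmetic concrete, since the abstract does not state the actual timing, and the authors' exact computation may differ.

```python
# Minimal sketch (not the authors' code): the Wolpaw information transfer rate
# (ITR), commonly reported as "bits/min" for P300 spellers.
import math


def bits_per_selection(n_symbols: int, accuracy: float) -> float:
    """Wolpaw bits per selection for n_symbols choices at a given accuracy."""
    if accuracy >= 1.0:
        return math.log2(n_symbols)
    if accuracy <= 0.0:
        return 0.0
    p = accuracy
    return (math.log2(n_symbols)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n_symbols - 1)))


def itr_bits_per_minute(n_symbols: int, accuracy: float,
                        seconds_per_selection: float) -> float:
    """Scale bits per selection to bits/min for a given selection duration."""
    return bits_per_selection(n_symbols, accuracy) * (60.0 / seconds_per_selection)


# Example: a 5 x 5 matrix (25 symbols) at 100% accuracy gives log2(25) ~ 4.64
# bits per selection; with a hypothetical 28 s per selection this is ~10 bits/min.
print(round(itr_bits_per_minute(25, 1.00, 28.0), 1))  # ~10.0
```

At 100% accuracy the formula reduces to log2(25) ≈ 4.64 bits per selection, so the reported 10 bits/min for the participant in the LIS corresponds to roughly two selections per minute.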