Analysis of Prefrontal Single-Channel EEG Data for Portable Auditory ERP-Based Brain–Computer Interfaces

An electroencephalogram (EEG)-based brain–computer interface (BCI) is a tool that non-invasively controls computers by translating the electrical activity of the brain. This technology has the potential to provide patients with severe motor disabilities, such as those suffering from amyotrophic lateral sclerosis (ALS), with the ability to communicate. Recently, auditory oddball paradigms have been developed to implement more practical event-related potential (ERP)-based BCIs because they can be operated without ocular activity. These paradigms generally rely on clinical-grade (16-channel or more) EEG devices and natural sound stimuli to maintain the user's motivation during BCI operation; however, most ALS patients who have taken part in auditory ERP-based BCI studies complain about two factors: (i) total device cost and (ii) setup time. A portable auditory ERP-based BCI could overcome major obstacles that currently prevent this technology from being used for everyday communication. To address this issue, we analyzed prefrontal single-channel EEG data acquired from a consumer-grade single-channel EEG device using a natural sound-based auditory oddball paradigm. In our experiments, EEG data were gathered from nine healthy subjects and one ALS patient. The performance of the auditory ERP-based BCI was quantified under one offline condition and two online conditions. The offline analysis, based on a cross-validation procedure, indicated that our paradigm maintained high detection accuracy (%) and information transfer rate (ITR, bits/min) across all subjects (for five commands: 70.0 ± 16.1 and 1.29 ± 0.93; for four commands: 73.8 ± 14.2 and 1.16 ± 0.78; for three commands: 78.7 ± 11.8 and 0.95 ± 0.61; and for two commands: 85.7 ± 8.6 and 0.63 ± 0.38). Furthermore, the first online analysis demonstrated that our paradigm also achieved high performance on new data in an online acquisition stream (for three commands: 80.0 ± 19.4 and 1.16 ± 0.83). The second online analysis measured online performance on a different day from the offline and first online analyses (for three commands: 62.5 ± 14.3 and 0.43 ± 0.36). These results indicate that prefrontal single-channel EEG has the potential to contribute to the development of a user-friendly, portable auditory ERP-based BCI.
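
For readers unfamiliar with how the ITR (bits/min) figures relate to the accuracy figures above, the sketch below implements the standard Wolpaw ITR definition in Python. The selection rate used in the example is an assumed value for illustration, since the abstract does not report the exact trial timing, and this is not necessarily the exact computation used in the study.

    import math

    def wolpaw_bits_per_selection(n_commands: int, accuracy: float) -> float:
        """Bits conveyed per selection under the standard Wolpaw ITR definition."""
        if accuracy >= 1.0:
            return math.log2(n_commands)
        if accuracy <= 1.0 / n_commands:
            return 0.0  # at or below chance accuracy, clamp to zero by convention
        return (math.log2(n_commands)
                + accuracy * math.log2(accuracy)
                + (1.0 - accuracy) * math.log2((1.0 - accuracy) / (n_commands - 1)))

    def wolpaw_itr_bits_per_min(n_commands: int, accuracy: float,
                                selections_per_min: float) -> float:
        """ITR in bits/min for a given selection rate (selections per minute)."""
        return selections_per_min * wolpaw_bits_per_selection(n_commands, accuracy)

    # Example: three commands at 80% accuracy; the rate of 2 selections/min is an
    # assumed value for illustration, not a figure reported in the abstract.
    print(round(wolpaw_itr_bits_per_min(3, 0.80, 2.0), 2))  # -> 1.33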
