A 36-Class Bimodal ERP Brain-Computer Interface Using Location-Congruent Auditory-Tactile Stimuli

To date, traditional visual event-related potential brain-computer interfaces (ERP-BCIs) continue to dominate mainstream BCI research. However, these conventional BCIs are unsuitable for individuals who have partially or completely lost their vision. Given the poor performance of existing gaze-independent ERP-BCIs, techniques that improve these systems are needed. In this paper, we developed a novel 36-class bimodal ERP-BCI system based on tactile and auditory stimuli: audio cues rendered from six virtual directions via head-related transfer functions (HRTFs) were delivered through headphones, while location-congruent electro-tactile stimuli were simultaneously delivered to the corresponding positions through electrodes placed on the abdomen and waist. We selected the eight best channels, trained a Bayesian linear discriminant analysis (BLDA) classifier, and determined the optimal number of trials for target selection in the online procedure. The average online information transfer rate (ITR) of the bimodal ERP-BCI reached 11.66 bit/min, improvements of 35.11% and 36.69% over the auditory (8.63 bit/min) and tactile (8.53 bit/min) approaches, respectively. These results demonstrate that the bimodal system outperforms each unimodal system and indicate that the proposed bimodal system has potential utility as a gaze-independent BCI in future real-world applications.
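ITRs such as those reported above are conventionally computed with the Wolpaw formula, which combines the number of classes N, the online selection accuracy P, and the average time per selection. Below is a minimal Python sketch of that computation; the function name and the example accuracy and selection time are illustrative assumptions, since the abstract reports only the final bit rates.

```python
import math

def wolpaw_itr(n_classes: int, accuracy: float, seconds_per_selection: float) -> float:
    """Wolpaw ITR in bit/min for an N-class speller.

    accuracy is the online selection accuracy P; seconds_per_selection is
    the average time for one target selection, including all stimulus
    repetitions and inter-trial pauses.
    """
    n, p = n_classes, accuracy
    if p <= 1.0 / n:
        # At or below chance level; by convention, report zero information.
        return 0.0
    bits_per_selection = math.log2(n)
    if p < 1.0:
        bits_per_selection += p * math.log2(p) \
            + (1.0 - p) * math.log2((1.0 - p) / (n - 1))
    return bits_per_selection * (60.0 / seconds_per_selection)

# Illustrative values only (not reported in the abstract): an 85%-accurate
# 36-class selection every 20 s yields roughly 11.4 bit/min, in the range
# of the bimodal system's reported 11.66 bit/min.
print(wolpaw_itr(n_classes=36, accuracy=0.85, seconds_per_selection=20.0))
```

Note that under this formula, increasing the trial count per selection raises P but also lengthens each selection, which is why finding the optimal number of trials, as described above, directly trades accuracy against speed.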
