Tactile and bone-conduction auditory brain-computer interface for vision- and hearing-impaired users

BACKGROUND: The paper reports on a recently developed BCI alternative for users with impaired vision (inability to focus the gaze or to control eye movements) or with the so-called "ear-blocking syndrome" (limited hearing). We report on our recent studies of the extent to which vibrotactile stimuli delivered to the head of a user can serve as a platform for a brain-computer interface (BCI) paradigm.

NEW METHOD: In the proposed tactile and bone-conduction auditory BCI, novel multiple head positions are stimulated to evoke combined somatosensory and auditory (via the bone-conduction effect) P300 brain responses, defining a multimodal tactile and bone-conduction auditory brain-computer interface (tbcaBCI). To further remove EEG interference and improve P300 response classification, the synchrosqueezing transform (SST) is applied. SST outperforms classical time-frequency analysis methods for non-linear and non-stationary signals such as EEG, and it is computationally more efficient than empirical mode decomposition. SST filtering allows for online EEG preprocessing, which is essential for BCI.

RESULTS: Experimental results with healthy, BCI-naive users performing online tbcaBCI validate the paradigm, while the feasibility of the concept is illustrated through information transfer rate case studies.

COMPARISON WITH EXISTING METHOD(S): We compare the proposed SST-based preprocessing method, combined with a logistic regression (LR) classifier, against classical preprocessing combined with linear discriminant analysis (LDA) based classification.

CONCLUSIONS: The proposed tbcaBCI paradigm, together with the data-driven preprocessing methods, is a step forward in robust BCI applications research.
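To make the NEW METHOD's preprocessing step concrete, below is a minimal sketch of SST-based band-pass filtering of one EEG channel. It assumes the third-party ssqueezepy package; the sampling rate, pass band, and synthetic input are illustrative assumptions, not the paper's exact settings.

```python
# Minimal sketch of SST band-pass filtering for one EEG channel.
# Assumptions (not from the paper): the ssqueezepy package, a 512 Hz
# sampling rate, and a 0.1-15 Hz pass band around the P300 response.
import numpy as np
from ssqueezepy import ssq_cwt, issq_cwt

fs = 512.0                          # assumed EEG sampling rate [Hz]
eeg = np.random.randn(int(fs))      # placeholder for a real 1 s EEG epoch

# Forward synchrosqueezed CWT: Tx is the sharpened time-frequency plane.
Tx, Wx, ssq_freqs, scales, *_ = ssq_cwt(eeg, fs=fs)

# Zero the synchrosqueezed coefficients outside the assumed P300 band.
keep = (ssq_freqs >= 0.1) & (ssq_freqs <= 15.0)
Tx_filtered = np.where(keep[:, None], Tx, 0)

# Invert the transform to recover the band-limited EEG epoch.
eeg_filtered = issq_cwt(Tx_filtered)
```

Filtering in the synchrosqueezed plane rather than the raw wavelet plane is what gives the sharper frequency localization the abstract attributes to SST.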

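The RESULTS section mentions information transfer rate (ITR) case studies; for a P300 BCI the ITR is conventionally computed with the Wolpaw formula, sketched below with assumed example numbers (8 commands, 90% accuracy, 10 selections per minute) rather than the paper's measured values.

```python
import math

def wolpaw_itr(n_classes: int, accuracy: float, selections_per_min: float) -> float:
    """Wolpaw information transfer rate in bits/min:
    bits/selection = log2(N) + P*log2(P) + (1 - P)*log2((1 - P)/(N - 1)).
    """
    n, p = n_classes, accuracy
    bits = math.log2(n)
    if p > 0.0:                      # P*log2(P) -> 0 as P -> 0
        bits += p * math.log2(p)
    if p < 1.0:                      # (1-P) term -> 0 as P -> 1
        bits += (1.0 - p) * math.log2((1.0 - p) / (n - 1))
    return bits * selections_per_min

# Assumed example values, not results reported in the paper:
print(f"{wolpaw_itr(8, 0.90, 10):.1f} bits/min")   # approx. 22.5 bits/min
```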
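For the classifier comparison, a scikit-learn sketch along the abstract's lines (logistic regression versus LDA on P300 feature vectors) might look as follows; the synthetic features, trial counts, and cross-validation setup are assumptions for illustration, not the experimental pipeline itself.

```python
# Illustrative comparison of logistic regression (LR) and linear
# discriminant analysis (LDA) on P300-style feature vectors.
# The synthetic data below stands in for real decimated ERP epochs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_features = 600, 128          # assumed epoch count and feature length
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 2, size=n_trials)    # 1 = target (P300 present), 0 = non-target
X[y == 1] += 0.3                         # inject a weak class difference

for name, clf in [("LR", LogisticRegression(max_iter=1000)),
                  ("LDA", LinearDiscriminantAnalysis())]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```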