Classifying P300 responses to vowel stimuli for auditory brain-computer interface

A brain-computer interface (BCI) is a technology for operating computerized devices from brain activity alone, without muscle movement. BCI technology is expected to become a communication solution for patients with amyotrophic lateral sclerosis (ALS). The BCI2000 software package is now widely used by BCI researchers; its P300 speller application trains the classifier that a user needs to spell letters or sentences in the BCI-speller paradigm. The classical BCI-speller relies on visual cues and therefore requires muscle activity such as eye movements, which patients in the totally locked-in state (TLS), the terminal stage of ALS, cannot execute. Our project aims to solve this problem by developing an auditory BCI. However, contemporary auditory BCI-spellers perform considerably worse than their visual counterparts, so improvement is needed before practical application. In this paper, we focus on the differences among the responses evoked by the various acoustic stimulus types used in an auditory BCI-speller. Although these stimuli elicit event-related potentials with distinct waveform shapes, a typical BCI-speller classifier discriminates only between targets and non-targets, and hence ignores valuable and potentially discriminative features. We therefore expect that classification accuracy can be improved by using an independent classifier for each stimulus cue category. We propose two classifier training methods: the first trains on the data of the five stimulus cues independently, while the second incorporates a weighting of each stimulus cue's features relative to all of them. The reported experimental results show the effectiveness of the second method for improving classification.
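The first proposed method, training an independent target/non-target classifier per stimulus cue instead of one pooled classifier, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the synthetic feature vectors, the cue-specific ERP shapes, and the choice of a minimal two-class LDA are all assumptions made for the example.

```python
# Sketch of per-cue classifier training (assumed setup, not the paper's code).
import numpy as np
from numpy.random import default_rng

rng = default_rng(0)
n_cues = 5        # five auditory stimulus cues, as in the paper
n_trials = 200    # synthetic epochs per cue (target/non-target mixed)
n_features = 16   # e.g. decimated EEG amplitudes around the P300 window

def fit_lda(X, y):
    """Minimal two-class LDA: returns a weight vector and bias."""
    mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)     # within-class scatter
    w = np.linalg.solve(Sw + 1e-6 * np.eye(X.shape[1]), mu1 - mu0)
    b = -0.5 * w @ (mu0 + mu1)
    return w, b

# Synthetic data: each cue gets a slightly different target ERP deflection,
# mimicking the waveform differences between stimulus types.
Xs, ys = [], []
for c in range(n_cues):
    y = rng.integers(0, 2, n_trials)
    shift = np.zeros(n_features)
    shift[c:c + 3] = 1.0                  # cue-specific P300-like response
    X = rng.standard_normal((n_trials, n_features)) + np.outer(y, shift)
    Xs.append(X)
    ys.append(y)

# Method 1: one independent classifier per stimulus cue.
per_cue = [fit_lda(X, y) for X, y in zip(Xs, ys)]

# Baseline: a single pooled classifier over all cues (the usual P300 setup).
w_all, b_all = fit_lda(np.vstack(Xs), np.concatenate(ys))

# Compare training accuracy per cue.
for c, (X, y) in enumerate(zip(Xs, ys)):
    w, b = per_cue[c]
    acc_cue = np.mean((X @ w + b > 0) == y)
    acc_all = np.mean((X @ w_all + b_all > 0) == y)
    print(f"cue {c}: per-cue {acc_cue:.2f}  pooled {acc_all:.2f}")
```

The second method would instead keep a single classifier but reweight each cue's features against the others; the per-cue fit above marks the other end of that design spectrum.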
