Influence of attention on speech-rhythm evoked potentials: first steps towards an auditory brain-computer interface driven by speech

A brain-computer interface (BCI) uses neuronal responses to control external systems. The majority of BCI systems are based on visual stimuli; only a few use auditory input. Because auditory BCIs rely neither on vision nor on bodily mobility, they could be an alternative for visually or physically disabled people. This study investigates the performance of an auditory paradigm using two competing streams of repeatedly presented speech syllables, with repetition rates of 2.3 Hz and 3.1 Hz. Our auditory BCI approach uses the auditory steady-state response (ASSR) to detect automatically which stream a listener selectively attends to. In single-trial classification, ten healthy volunteers achieved an average accuracy of 61%, significantly above chance, and an information transfer rate (ITR) of 0.2 bit min−1. Averaging over six randomly selected trials improved the mean classification accuracy to 79% while keeping the ITR comparable. In conclusion, it is possible to classify ASSRs evoked by streams of spoken syllables. For real-life applications the performance of this auditory BCI still needs to improve, but it is a step towards the long-term goal of driving BCIs with natural speech features and, eventually, controlling the processing of hearing devices.
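The decoding idea summarised above can be illustrated with a minimal sketch: estimate narrow-band EEG power at each stream's syllable repetition rate (2.3 Hz and 3.1 Hz), label the trial with the stream showing the stronger ASSR, and score the system with Wolpaw's standard ITR formula. Everything in the sketch (the function names, the 0.2 Hz analysis band, the single-channel input) is an illustrative assumption, not the authors' actual pipeline, which would typically use multi-channel features and a trained classifier.

```python
import numpy as np

def assr_power(eeg, fs, freq, bandwidth=0.2):
    """Mean spectral power of a single-channel EEG trial in a narrow
    band around a stream's syllable repetition rate (illustrative)."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    band = (freqs >= freq - bandwidth) & (freqs <= freq + bandwidth)
    return spectrum[band].mean()

def classify_attended_stream(eeg, fs, rates=(2.3, 3.1)):
    """Label the trial with the stream whose repetition rate shows the
    stronger ASSR: 0 -> 2.3 Hz stream, 1 -> 3.1 Hz stream."""
    powers = [assr_power(eeg, fs, r) for r in rates]
    return int(np.argmax(powers))

def wolpaw_itr(accuracy, n_classes, decisions_per_minute):
    """Information transfer rate in bit/min using Wolpaw's formula."""
    p, n = accuracy, n_classes
    if p >= 1.0:
        bits = np.log2(n)
    elif p <= 0.0:
        bits = 0.0
    else:
        bits = (np.log2(n) + p * np.log2(p)
                + (1 - p) * np.log2((1 - p) / (n - 1)))
    return bits * decisions_per_minute
```

For example, wolpaw_itr(0.61, 2, 6) gives roughly 0.21 bit min−1, in line with the single-trial figure reported in the abstract if one assumes about six classification decisions per minute (an assumed rate, not stated in the abstract). The band-power estimate also presupposes trials long enough for the FFT to resolve a 0.2 Hz band around 2.3 Hz and 3.1 Hz.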
