A study of a three-target brain-computer interface based on the auditory steady-state response

Auditory brain-computer interface (BCI) technology is needed to provide a communication channel for patients with visual impairments. To this end, this paper constructs a three-target auditory BCI based on the auditory steady-state response (ASSR). The ASSR is evoked by click sounds at different repetition rates: 38 Hz presented to the left ear, 40 Hz presented to both ears, and 42 Hz presented to the right ear. During the experiment, subjects attend to the stimulus indicated by a prompt sound. Data from 11 subjects are analyzed by spectral analysis and canonical correlation analysis (CCA). The results show that each of the three stimuli evokes a stable ASSR, and the response is strongest at 40 Hz. Using CCA for three-class classification, the information transfer rate reaches a peak of 4.74 bits/min and the average classification accuracy is 80.79% at a data length of 3 s. This paradigm demonstrates that the ASSR can be used to construct an auditory BCI.
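The classification step described in the abstract is CCA-based frequency recognition over the three stimulation rates, scored with the information transfer rate (ITR). The sketch below shows how such a pipeline could look; the sampling rate, number of reference harmonics, and the use of scikit-learn's CCA are illustrative assumptions rather than details taken from the paper, and the ITR helper uses the standard Wolpaw formula, which the abstract does not name.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

FS = 250            # sampling rate in Hz (assumed; not stated in the abstract)
STIM_FREQS = [38.0, 40.0, 42.0]  # left-ear, both-ear, and right-ear click rates
N_HARMONICS = 2     # harmonics per reference set (assumed)

def make_references(freq, n_samples, fs=FS, n_harmonics=N_HARMONICS):
    """Sine/cosine reference signals for one stimulation frequency."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)          # shape: (n_samples, 2 * n_harmonics)

def classify_segment(eeg, fs=FS, freqs=STIM_FREQS):
    """Assign an EEG segment of shape (n_samples, n_channels) to the stimulus
    whose reference set gives the largest first canonical correlation."""
    scores = []
    for f in freqs:
        refs = make_references(f, eeg.shape[0], fs)
        cca = CCA(n_components=1)
        u, v = cca.fit_transform(eeg, refs)
        scores.append(abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1]))
    return int(np.argmax(scores)), scores

def wolpaw_itr(accuracy, n_targets, trial_seconds):
    """Wolpaw ITR in bits/min for a given accuracy, target count, and trial time."""
    bits = np.log2(n_targets)
    if 0.0 < accuracy < 1.0:
        bits += accuracy * np.log2(accuracy) \
              + (1 - accuracy) * np.log2((1 - accuracy) / (n_targets - 1))
    return bits * 60.0 / trial_seconds

# Operating point from the abstract: 3 targets, 80.79% accuracy.
# The ITR depends on the full selection time; using only the 3 s analysis
# window gives an upper bound, since cue/rest time is not reported.
print(wolpaw_itr(0.8079, 3, 3.0))
```

Under the Wolpaw formula, a 3 s window alone would give a higher ITR than the reported 4.74 bits/min, which suggests the effective selection time in the study includes additional cue or rest intervals, or that a different ITR convention was used; the abstract does not state that duration.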
