Brain-computer interface (BCI) with EEG signals for automatic vowel recognition based on articulation mode

One of the most promising methods for assisting amputee or paralyzed patients in controlling prosthetic devices is the brain-computer interface (BCI). A BCI enables communication between the brain and the prosthetic device through signal processing protocols. However, because brain signals are inherently noisy, available signal processing protocols cannot reliably interpret brain commands and have not been usable beyond the laboratory setting. To address this challenge, this work presents a novel automatic brain signal recognition protocol based on vowel articulation mode. The approach identifies the mental state of imagining open-mid and closed vowels, without imagined movement of the oral cavity, for application in prosthetic device control. The method uses brain signals recorded over the language area (21 electrodes) while the subject performs the specific task of imagining the respective vowel. In the processing stage, the power spectral density (PSD) was computed for each brain signal, and classification was carried out with a Support Vector Machine (SVM). Accuracies between 84% and 94% were achieved in recognizing vowels according to articulation mode. The proposed method is promising for use by amputee or paraplegic patients.
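The pipeline described above (per-channel PSD features from multi-channel EEG, classified with an SVM) can be sketched as follows. This is a toy illustration on synthetic signals, not the authors' implementation: the sampling rate, trial length, Welch parameters, and SVM settings are assumptions, and the two "imagined vowel" classes are simulated as rhythms at different frequencies so the example is self-contained.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
fs = 256            # sampling rate in Hz (assumed, not stated in the abstract)
n_channels = 21     # electrodes over the language area, as in the paper
n_trials = 80
n_samples = 2 * fs  # 2-second trials (assumed)

def make_trial(freq):
    # Synthetic trial: a sinusoid at `freq` Hz buried in noise on every channel,
    # standing in for the class-specific spectral content of imagined vowels.
    t = np.arange(n_samples) / fs
    return np.sin(2 * np.pi * freq * t) + rng.normal(0.0, 1.0, (n_channels, n_samples))

# Two toy classes with different dominant rhythms (e.g. open-mid vs. closed vowel)
X_raw = [make_trial(10.0) for _ in range(n_trials // 2)] + \
        [make_trial(22.0) for _ in range(n_trials // 2)]
y = np.array([0] * (n_trials // 2) + [1] * (n_trials // 2))

def psd_features(trial):
    # Feature extraction: Welch PSD per channel, log-scaled and concatenated
    # into a single feature vector (21 channels x frequency bins).
    _, pxx = welch(trial, fs=fs, nperseg=128, axis=-1)
    return np.log(pxx).ravel()

X = np.array([psd_features(tr) for tr in X_raw])

# Classification stage: SVM on the PSD feature vectors
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

On this easy synthetic problem the SVM separates the two spectral classes almost perfectly; real imagined-speech EEG is far noisier, which is consistent with the 84%–94% range reported above.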
