Coordinated control of an intelligent wheelchair based on a brain-computer interface and speech recognition

An intelligent wheelchair is devised that is controlled by a coordinated mechanism combining a brain-computer interface (BCI) and speech recognition. By performing the appropriate control activities, users can navigate the wheelchair with four steering behaviors (start, stop, turn left, and turn right). Five healthy subjects participated in an indoor experiment. The results demonstrate the efficiency of the coordinated control mechanism, with satisfactory path and time optimality ratios, and show that speech recognition is a fast and accurate supplement to BCI-based control systems. The proposed intelligent wheelchair is particularly suitable for patients with paralysis (including those with aphasia) who can learn to pronounce only a single sound (e.g., ‘ah’).
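
The abstract does not detail the coordination logic, so the following is only a minimal Python sketch of how such an arbitration layer might look. The Command names, the WheelchairBase class, and the rule that a recognized speech command overrides the concurrent BCI output are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a coordinated control loop for a BCI + speech wheelchair.
# Hypothetical: command names, precedence rule, and the actuator interface are
# assumptions for illustration, not the method described in the paper.

from enum import Enum
from typing import Optional


class Command(Enum):
    START = "start"
    STOP = "stop"
    TURN_LEFT = "turn_left"
    TURN_RIGHT = "turn_right"


class WheelchairBase:
    """Hypothetical actuator interface for the four steering behaviors."""

    def execute(self, command: Command) -> None:
        print(f"Executing steering behavior: {command.value}")


def coordinate(speech_cmd: Optional[Command],
               bci_cmd: Optional[Command]) -> Optional[Command]:
    """Arbitrate between the two input channels.

    Assumption: speech, reported in the experiment as the faster and more
    accurate channel, overrides the BCI output when both are present.
    """
    if speech_cmd is not None:
        return speech_cmd
    return bci_cmd


if __name__ == "__main__":
    base = WheelchairBase()
    # Simulated channel outputs for a few control cycles.
    cycles = [
        (None, Command.START),               # BCI alone starts the wheelchair
        (Command.TURN_LEFT, None),           # speech issues a turn
        (Command.STOP, Command.TURN_RIGHT),  # conflict: speech wins
    ]
    for speech_cmd, bci_cmd in cycles:
        cmd = coordinate(speech_cmd, bci_cmd)
        if cmd is not None:
            base.execute(cmd)
```

In this hypothetical scheme, speech is given precedence because the experiment reports it as a fast and accurate supplement to the BCI; a real controller would additionally need onset detection, confidence thresholds, and safety interlocks.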
