Towards an EEG-based intelligent wheelchair driving system with vibro-tactile stimuli

The electroencephalography (EEG)-based wheelchair driving system is one of the major applications of brain-computer interfaces (BCIs); it allows individuals with mobility impairments to perform activities of daily living independently. In this context, several research groups have developed methods for identifying a user's intention for wheelchair driving using various paradigms. In this study, we use a steady-state somatosensory evoked potential (SSSEP) paradigm, which elicits brain responses to vibro-tactile stimulation at specific frequencies, to identify a user's intention for driving a wheelchair. The main focus of this study is to validate the effectiveness of our SSSEP-based wheelchair driving system through an online experiment with more challenging tasks than in our recent study. In our system, a subject selectively concentrated on one of the vibro-tactile stimuli (attached to the left hand, right hand, and foot) to drive the wheelchair (corresponding to turn-left, turn-right, and move-forward, respectively). Five healthy subjects participated in the online experiment, and the results show that our SSSEP paradigm is suitable for an EEG-based intelligent wheelchair driving system.
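
The abstract describes selective attention to one of three vibro-tactile stimuli being translated into the commands turn-left, turn-right, and move-forward. The following is a minimal, hypothetical sketch of such a pipeline, assuming band-power features at the stimulation frequencies and an LDA classifier; the stimulation frequencies, channel count, epoch length, and classifier choice are illustrative assumptions and not the authors' actual implementation.

```python
# Hedged sketch (not the authors' system): classify which vibro-tactile
# stimulus is attended from SSSEP band power, then map it to a wheelchair
# command. All numeric settings below are illustrative assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250                              # sampling rate in Hz (assumed)
STIM_FREQS = [21.0, 26.0, 31.0]       # hypothetical stimulation frequencies (Hz)
COMMANDS = ["turn-left", "turn-right", "move-forward"]  # left hand, right hand, foot

def band_power_features(epoch, fs=FS, freqs=STIM_FREQS, bw=1.0):
    """Power in a narrow band around each stimulation frequency.

    epoch: array of shape (n_channels, n_samples).
    Returns one averaged band-power value per stimulation frequency.
    """
    f, psd = welch(epoch, fs=fs, nperseg=fs)      # PSD per channel
    feats = []
    for f0 in freqs:
        band = (f >= f0 - bw) & (f <= f0 + bw)    # bins near the target frequency
        feats.append(psd[:, band].mean())         # average over channels and band
    return np.array(feats)

# Toy usage with synthetic epochs standing in for recorded EEG.
rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 60, 8, 3 * FS   # 3-second epochs
X, y = [], []
for trial in range(n_trials):
    target = trial % 3                            # index of the attended stimulus
    t = np.arange(n_samples) / FS
    eeg = rng.normal(size=(n_channels, n_samples))                # background activity
    eeg += 0.5 * np.sin(2 * np.pi * STIM_FREQS[target] * t)       # SSSEP-like component
    X.append(band_power_features(eeg))
    y.append(target)

clf = LinearDiscriminantAnalysis().fit(X[:45], y[:45])
for feats, label in zip(X[45:], y[45:]):
    pred = clf.predict([feats])[0]
    print(f"true={COMMANDS[label]:>12s}  predicted={COMMANDS[pred]}")
```

In an online setting, the predicted class for each incoming epoch would be forwarded to the wheelchair controller as the corresponding steering command; this sketch only illustrates the feature-extraction and classification step under the stated assumptions.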
