EEG-Based Brain-Computer Interface for Controlling a Robot Arm's Movement Through Thought

Abstract

Background: Brain-Computer Interfaces (BCIs) are devices that allow direct communication between a user's brain and a machine. This technology can be used by disabled people to improve their independence and maximize their capabilities, for example when locating an object in the environment. Such devices can be realized through non-invasive measurement of cortical activity by electroencephalography (EEG).

Methods: Our work proposes a novel BCI system for controlling a robot arm based on the user's thought. Four subjects (1 female and 3 males) aged between 20 and 29 years participated in our experiment. They were instructed to imagine the execution of movements of the right hand, the left hand, both hands, or the feet, depending on the established protocol. An EMOTIV EPOC headset was used to record neuronal electrical activity from the subject's scalp; this activity was then sent to the computer for analysis. Feature extraction was performed using Principal Component Analysis (PCA) combined with the fast Fourier transform (FFT) spectrum within the frequency band responsible for sensorimotor rhythms (8 Hz–22 Hz). These features were then fed into a Support Vector Machine (SVM) classifier with a Radial Basis Function (RBF) kernel, whose outputs were translated into commands to control the robot arm.

Results: The proposed BCI enabled control of the robot arm in four directions (right, left, up and down), achieving an average accuracy of 85.45% across all subjects.

Conclusion: The results obtained would encourage, with further development, the use of the proposed BCI to perform more complex tasks, such as executing successive movements or stopping execution once a searched object is detected. This would provide a useful means of assistance for people with motor impairment.
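The pipeline described in the Methods (FFT magnitudes restricted to the 8–22 Hz sensorimotor band, PCA for dimensionality reduction, and an RBF-kernel SVM whose predictions map to arm commands) can be sketched as below. This is a minimal illustration, not the authors' implementation: the epoch array shapes, the 128 Hz sampling rate, the number of PCA components, the SVM hyperparameters, and the class-to-command mapping are all assumptions introduced for the example.

```python
# Minimal sketch of the described feature-extraction and classification pipeline.
# Assumptions (not specified in the abstract): 128 Hz sampling rate, epoch layout
# (n_trials, n_channels, n_samples), 20 PCA components, default SVM settings,
# and the label-to-command mapping.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 128            # assumed EEG sampling rate (Hz)
BAND = (8.0, 22.0)  # sensorimotor-rhythm band used for the features

def fft_band_features(epochs):
    """FFT magnitude spectrum restricted to 8-22 Hz for each channel.

    epochs: array of shape (n_trials, n_channels, n_samples)
    returns: array of shape (n_trials, n_channels * n_band_bins)
    """
    spectrum = np.abs(np.fft.rfft(epochs, axis=-1))
    freqs = np.fft.rfftfreq(epochs.shape[-1], d=1.0 / FS)
    mask = (freqs >= BAND[0]) & (freqs <= BAND[1])
    return spectrum[..., mask].reshape(len(epochs), -1)

# PCA followed by an RBF-kernel SVM, as in the abstract; the component count
# and C/gamma values are placeholders.
clf = make_pipeline(StandardScaler(),
                    PCA(n_components=20),
                    SVC(kernel="rbf", C=1.0, gamma="scale"))

# Hypothetical mapping from predicted class labels to robot arm commands.
COMMANDS = {0: "right", 1: "left", 2: "up", 3: "down"}

def train_and_predict(train_epochs, train_labels, test_epochs):
    clf.fit(fft_band_features(train_epochs), train_labels)
    predictions = clf.predict(fft_band_features(test_epochs))
    return [COMMANDS[p] for p in predictions]
```

In such a setup each predicted label would be forwarded to the robot arm controller as a discrete directional command, one per classified trial.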
