A P300 brain-computer interface for controlling a mobile robot by issuing a motion command

In this paper, we propose a new P300 brain-computer interface (BCI) for controlling a mobile robot by issuing motion commands, namely turning left, turning right, and going forward. We first designed three kinds of P300 visual stimuli. We then developed the corresponding BCI systems based on the three stimuli and compared their performance using a linear discriminant analysis (LDA) classifier, with features selected from the EEG potentials via principal component analysis (PCA). Experimental results from three participants suggest that the BCIs based on the last two kinds of stimuli can select a desired command with over 80% accuracy in about four seconds of selection time, showing that using the proposed BCI to control a robot is feasible.
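The PCA-plus-LDA pipeline described above can be sketched as follows. This is only an illustrative reconstruction, not the authors' code: the epoch data are simulated, and the feature dimensions, component count, and the synthetic "P300" offset are all hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated single-trial EEG feature vectors (hypothetical data):
# "target" epochs carry a P300-like deflection, "non-target" epochs do not.
n_per_class, n_features = 100, 40
non_target = rng.normal(0.0, 1.0, (n_per_class, n_features))
target = rng.normal(0.0, 1.0, (n_per_class, n_features))
target[:, 10:20] += 1.5  # crude stand-in for the P300 component

X = np.vstack([non_target, target])
y = np.array([0] * n_per_class + [1] * n_per_class)

# --- PCA: project onto the leading principal components ---
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
n_components = 8                      # assumed dimensionality, for illustration
Z = Xc @ Vt[:n_components].T          # reduced feature vectors

# --- Fisher LDA on the reduced features ---
mu0, mu1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
Sw = np.cov(Z[y == 0], rowvar=False) + np.cov(Z[y == 1], rowvar=False)
w = np.linalg.solve(Sw, mu1 - mu0)    # discriminant direction
threshold = w @ (mu0 + mu1) / 2.0     # midpoint decision boundary

pred = (Z @ w > threshold).astype(int)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

In a P300 speller-style BCI, a score like `Z @ w` would be computed for each flashed option, and the option whose epochs score highest across repetitions would be selected as the intended command.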
