Brain-controlled telepresence robot by motor-disabled people

In this paper we present the first results of users with motor disabilities mentally controlling a telepresence robot, a rather complex task since the robot moves continuously and the user must control it for a long period of time (over 6 minutes) to complete the whole path. The two users drove the telepresence robot from their clinic, located more than 100 km away from the robot. Remarkably, although the patients had never visited the location where the telepresence robot was operating, they achieved performance similar to that of a group of four healthy users who were familiar with the environment. In particular, the experimental results reported in this paper demonstrate the benefits of shared control for brain-controlled telepresence robots: it allows all subjects (including novice BMI subjects, such as our users with disabilities) to complete a complex task in a time and with a number of commands similar to those required by manual control.
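To make the shared-control idea concrete, the following is a minimal sketch of how sparse mental commands could be blended with a reactive obstacle-avoidance layer so that the robot keeps moving even when the user issues no command. This is not the authors' implementation; the function names, gains, sensor convention, and blending weights are all illustrative assumptions.

```python
import math
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Velocity:
    linear: float   # forward speed, m/s
    angular: float  # turn rate, rad/s (positive = left)

def obstacle_avoidance(ranges: List[float]) -> float:
    """Return a repulsive turn rate from range readings spanning
    the robot's right (-90 deg) to left (+90 deg). Closer obstacles
    on one side push the robot toward the other side."""
    n = len(ranges)
    turn = 0.0
    for i, r in enumerate(ranges):
        bearing = -math.pi / 2 + math.pi * i / (n - 1)   # sensor bearing
        weight = max(0.0, 1.0 - r / 2.0)                 # ignore obstacles beyond 2 m
        turn -= weight * math.copysign(1.0, bearing) * 0.5
    return turn

def shared_control(bmi_command: Optional[str], ranges: List[float]) -> Velocity:
    """Blend an intermittent mental command with reactive avoidance."""
    user_turn = {"left": 0.6, "right": -0.6}.get(bmi_command, 0.0)
    reactive_turn = obstacle_avoidance(ranges)
    # The user's command dominates when present; otherwise the robot
    # keeps cruising forward and steers itself around obstacles.
    angular = 0.7 * user_turn + 0.3 * reactive_turn
    linear = 0.15 * max(0.0, 1.0 - abs(angular))  # slow down while turning
    return Velocity(linear, angular)

# Example: no mental command, obstacle close on the robot's right side,
# so the blend steers gently to the left while maintaining forward motion.
print(shared_control(None, [0.6, 1.0, 2.0, 2.5, 2.5]))
```

The design point this sketch tries to capture is that the BMI only needs to deliver occasional high-level steering decisions, while the low-level loop handles continuous motion and safety, which is what allows novice users to match the command counts of manual control.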
