An adaptive brain-computer interface for humanoid robot control

Recent advances in neuroscience and humanoid robotics have allowed initial demonstrations of brain-computer interfaces (BCIs) for controlling humanoid robots. However, previous BCIs have relied on high-level control over fixed, pre-wired behaviors, while low-level control can be tedious and imposes a high cognitive load on the BCI user. To address these problems, we previously proposed an adaptive hierarchical approach to brain-computer interfacing: users teach the BCI system new skills on-the-fly; these skills can later be invoked directly as high-level commands, relieving the user of tedious low-level control. In this paper, we explore the application of hierarchical BCIs to the task of controlling a PR2 humanoid robot and teaching it new skills. We further explore the use of explicitly defined sequences of commands as a way for the user to define a more complex task involving multiple state spaces. We report results from three subjects who used a hierarchical electroencephalogram (EEG)-based BCI to successfully train and control the PR2 humanoid robot in a simulated household task: maneuvering the robot's arm to pour milk over a bowl of cereal. We present the first demonstration of training a hierarchical BCI for a non-navigational task, and the first demonstration of using a hierarchical BCI to train a more complex task involving multiple state spaces.
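The hierarchical idea described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation; all class and command names (`HierarchicalController`, `start_teaching`, `pour_milk`, etc.) are hypothetical. Low-level commands issued during a teaching phase are recorded into a named skill, which afterwards can be invoked as a single high-level command:

```python
class HierarchicalController:
    """Sketch of a hierarchical command interface: record low-level
    commands into named skills, then replay them as one command."""

    def __init__(self):
        self.skills = {}        # skill name -> recorded command sequence
        self.log = []           # low-level commands actually executed
        self._recording = None  # (name, buffer) while teaching a skill

    def start_teaching(self, name):
        self._recording = (name, [])

    def stop_teaching(self):
        name, buffer = self._recording
        self.skills[name] = buffer
        self._recording = None

    def execute(self, command):
        # A skill name expands into its recorded low-level sequence.
        if command in self.skills:
            for sub in self.skills[command]:
                self.execute(sub)
            return
        self.log.append(command)  # stand-in for sending a robot command
        if self._recording is not None:
            self._recording[1].append(command)

ctrl = HierarchicalController()
ctrl.start_teaching("pour_milk")
for cmd in ["lower_arm", "tilt_wrist", "raise_arm"]:
    ctrl.execute(cmd)
ctrl.stop_teaching()

ctrl.log.clear()
ctrl.execute("pour_milk")  # one high-level command replays the sequence
print(ctrl.log)
```

In the paper's setting, the "execute" step would correspond to dispatching motion commands to the PR2 (e.g. via ROS), and command selection would be driven by EEG classification rather than direct calls; the sketch only shows how taught skills collapse a tedious sequence into one invocable unit.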
