Automatic extraction of command hierarchies for adaptive brain-robot interfacing

Recent advances in neuroscience and robotics have enabled initial demonstrations of brain-computer interfaces (BCIs) for controlling wheeled and humanoid robots. Further progress, however, has proved challenging due to the low throughput of the interfaces and the many degrees of freedom (DOF) of the robots. In this paper, we build on our previous work on hierarchical BCIs (HBCIs), which seek to mitigate this problem. We extend HBCIs to allow training of arbitrarily complex tasks, with training no longer restricted to a particular robot state space (such as Cartesian space for a navigation task). We present two algorithms for learning command hierarchies by automatically extracting patterns from a user's command history. The first builds a hierarchical structure of arbitrary depth (a "control grammar") whose elements can represent skills, whole tasks, collections of tasks, and so on; the user "executes" single symbols from this grammar, each of which expands into a sequence of lower-level commands. The second, a probabilistic algorithm, also learns sequences that can be executed as high-level commands but does not build an explicit hierarchical structure. Both algorithms provide a de facto form of dictionary compression, which increases the effective throughput of the BCI. We present results from two human subjects who successfully used the hierarchical BCI to control a simulated PR2 robot via brain signals recorded non-invasively through electroencephalography (EEG).
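The abstract does not spell out the extraction procedure, but the core idea of building high-level symbols by dictionary compression of a command history can be illustrated with a minimal sketch. The function name `extract_hierarchy`, the greedy most-frequent-subsequence replacement, and the example command names are all assumptions for illustration, not the authors' algorithm:

```python
def extract_hierarchy(commands, min_count=2, max_len=4):
    """Greedily replace the most frequent repeated subsequence of
    low-level commands with a new symbol, iterating until nothing
    repeats. The learned symbols act as high-level commands: issuing
    one symbol stands in for its whole expansion, which is the sense
    in which compression raises the BCI's effective throughput."""
    grammar = {}          # symbol -> expansion into lower-level commands
    seq = list(commands)  # working copy of the command history
    next_id = 0
    while True:
        # Count every subsequence of length 2..max_len.
        counts = {}
        for n in range(2, max_len + 1):
            for i in range(len(seq) - n + 1):
                pat = tuple(seq[i:i + n])
                counts[pat] = counts.get(pat, 0) + 1
        # Pick the most frequent pattern, preferring longer ones on ties.
        best = max(counts, key=lambda p: (counts[p], len(p)), default=None)
        if best is None or counts[best] < min_count:
            break
        symbol = f"G{next_id}"
        next_id += 1
        grammar[symbol] = list(best)
        # Rewrite the history, replacing occurrences with the new symbol.
        out, i = [], 0
        while i < len(seq):
            if tuple(seq[i:i + len(best)]) == best:
                out.append(symbol)
                i += len(best)
            else:
                out.append(seq[i])
                i += 1
        seq = out
    return seq, grammar

# Hypothetical navigation/manipulation history:
history = ["forward", "forward", "turn_left", "grasp",
           "forward", "forward", "turn_left", "release"]
seq, grammar = extract_hierarchy(history)
# seq     -> ["G0", "grasp", "G0", "release"]
# grammar -> {"G0": ["forward", "forward", "turn_left"]}
```

Because replacement can be applied to sequences that already contain learned symbols, repeated runs over a growing history yield symbols whose expansions contain other symbols, i.e. an arbitrary-level hierarchy of the kind the control grammar describes.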
