An integrated framework for humanoid embodiment with a BCI

This paper presents a framework for embodying a user (e.g., a person with disabilities) in a humanoid robot controlled through a brain-computer interface (BCI). Within this framework, the robot can interact with the environment or assist its user. The low frequency and limited accuracy of BCI commands are compensated by vision tools, such as object recognition and mapping techniques, as well as by shared-control approaches. As a result, the proposed framework offers intuitive, safe, and accurate robot navigation towards an object or a person. The generic nature of the framework is demonstrated in two complex experiments, in which the user controls the robot to serve him a drink and to raise his own arm.