Grasping with Your Brain: A Brain-Computer Interface for Fast Grasp Selection

Brain-Computer Interfaces (BCIs) are promising technologies for improving Human-Robot Interaction, especially for disabled and impaired individuals. Non-invasive BCIs, which are highly desirable from a medical and therapeutic perspective, deliver only noisy, low-bandwidth signals, making their use in complex tasks difficult. To address this, we present a shared-control online grasp planning framework using an advanced EEG-based interface. Unlike commonly used paradigms, the EEG interface we incorporate allows online generation of a flexible number of options. The framework lets the user steer the planner toward grasps that reflect their intent for using the grasped object, by successively selecting grasps that converge on the desired approach direction of the hand. The planner divides the grasping task into phases and, at each phase, generates images reflecting the choices it can make. The EEG interface recognizes the user's preference among the options the planner presents. The EEG signal classifier is fast and simple to train, and the system as a whole requires almost no learning on the part of the subject. Three subjects successfully used the system to grasp and pick up a number of objects in a cluttered scene.
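The phased selection loop described above can be sketched as follows. This is a minimal, self-contained illustration under assumed names (`refine`, `shared_control_select`, and a simulated preference function standing in for the EEG classifier); it is not the authors' actual implementation.

```python
# Minimal sketch of a phased shared-control grasp selection loop.
# Candidates are represented as hand approach angles (degrees); the EEG
# classifier is simulated by a function that picks the user's preferred option.

def refine(candidates, chosen, keep=4):
    """Keep the candidates whose approach angle is closest to the chosen one,
    biasing the planner toward the user's selection."""
    return sorted(candidates, key=lambda a: abs(a - chosen))[:keep]

def shared_control_select(candidates, pick_option, n_phases=3):
    """Iteratively narrow the candidate set over several phases.
    pick_option stands in for the EEG interface recognizing the user's
    preference among the presented options."""
    for _ in range(n_phases):
        if len(candidates) == 1:
            break
        chosen = pick_option(candidates)         # EEG-decoded user choice
        candidates = refine(candidates, chosen)  # narrow toward that choice
    return candidates[0]

# Simulated user who prefers approach directions near 90 degrees.
prefer_90 = lambda opts: min(opts, key=lambda a: abs(a - 90))
angles = [0, 30, 60, 90, 120, 150, 180]
print(shared_control_select(angles, prefer_90))  # -> 90
```

The loop mirrors the shared-control idea in the abstract: the planner proposes a set of options, the user's decoded preference selects one, and the planner refines its candidate set around that selection before the next phase.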
