Assistive grasping with an augmented reality user interface

Assisting impaired individuals with robotic devices is an emerging and potentially transformative technology. This paper describes the design of an assistive robotic grasping system that impaired individuals operate in a human-in-the-loop manner, including through a novel cranio-facial electromyography input device. An augmented reality interface lets users plan grasps online that match their task-oriented intents, and grasp quality measures produce more robust grasps by accounting for the local geometry of the object and for uncertainty during grasp acquisition. The interface is validated in tests with real users, both healthy and impaired. This work forms the foundation for a flexible, fully featured human-in-the-loop system in which users grasp known and unknown objects in cluttered spaces through novel, practical human–robot interaction paradigms, with the potential to bring human-in-the-loop assistive devices out of the research environment and into the lives of those who need them.
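To make the uncertainty-aware grasp scoring concrete, the following is a minimal sketch, not the paper's implementation: it estimates a grasp's expected quality by averaging a nominal quality metric over sampled object-pose perturbations. The `quality_fn` callable, the grasp and pose representations, and the noise parameters are all assumptions introduced for illustration.

```python
import numpy as np


def exp_so3(omega):
    """Rotation matrix from an axis-angle vector via Rodrigues' formula."""
    theta = np.linalg.norm(omega)
    if theta < 1e-12:
        return np.eye(3)
    k = omega / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)


def robust_grasp_quality(grasp, obj_pose, quality_fn,
                         trans_sigma=0.005, rot_sigma=0.02, n_samples=100):
    """Monte Carlo estimate of expected grasp quality under pose uncertainty.

    grasp      -- candidate grasp (hand pose + joint configuration); assumed type
    obj_pose   -- nominal object pose as a 4x4 homogeneous transform (numpy array)
    quality_fn -- callable(grasp, pose) -> scalar quality score; hypothetical hook
                  for a planner's metric evaluated at a perturbed object pose
    """
    rng = np.random.default_rng(0)
    scores = []
    for _ in range(n_samples):
        # Perturb the nominal pose: Gaussian noise on translation (meters)
        # and a small random axis-angle rotation (radians).
        noisy = obj_pose.copy()
        noisy[:3, 3] += rng.normal(0.0, trans_sigma, size=3)
        noisy[:3, :3] = noisy[:3, :3] @ exp_so3(rng.normal(0.0, rot_sigma, size=3))
        scores.append(quality_fn(grasp, noisy))
    return float(np.mean(scores))
```

In this sketch, `quality_fn` would wrap whatever nominal metric the planner provides, and candidate grasps would be ranked by this expected score rather than by their quality at the nominal pose alone, favoring grasps that remain stable when the object is not exactly where it was perceived to be.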
