Non-invasive Cognitive-level Human Interfacing for the Robotic Restoration of Reaching & Grasping

Assistive and wearable robotics have the potential to support humans with different types of motor impairments in becoming independent and successfully fulfilling their activities of daily living. The success of these robotic systems, however, relies on the ability to meaningfully decode human action intentions and carry them out appropriately. Neural interfaces have been explored for use in such systems with several successes; however, they tend to be invasive and require training periods on the order of months. We present a robotic system for human augmentation, capable of actuating the user's arm and fingers for them, effectively restoring the capability of reaching, grasping and manipulating objects, controlled solely through the user's eye movements. We combine wearable eye tracking, the visual context of the environment and the structural grammar of human actions to create a cognitive-level assistive robotic setup that enables users to fulfil activities of daily living while preserving interpretability and the user's agency. The interface is worn, calibrated and ready to use within 5 minutes. Users learn to control and make successful use of the system with an additional 5 minutes of interaction. Tested with 5 healthy participants across 6 tasks, the system achieved an average first-attempt success rate of 96.6%.
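
To make the described pipeline concrete, below is a minimal sketch of one way such a gaze-driven, grammar-based decoder could be structured, assuming a dwell-time criterion for intent detection and an object-to-primitive lookup standing in for the action grammar. All names here (Fixation, ACTION_GRAMMAR, send_to_robot), the 0.5 s dwell threshold and the grammar entries are illustrative assumptions, not the paper's implementation.

# Illustrative sketch (not the authors' implementation): map gaze fixations
# onto objects and expand them into robot action sequences via a simple
# action grammar.

from dataclasses import dataclass

@dataclass
class Fixation:
    object_label: str   # label of the gazed-at object (e.g. from an object detector)
    duration_s: float   # how long the user dwelt on the object

# Hypothetical action grammar: each object class maps to the ordered
# sequence of motor primitives needed to act on it.
ACTION_GRAMMAR = {
    "cup":    ["reach", "grasp", "lift"],
    "bottle": ["reach", "grasp", "pour"],
    "table":  ["place", "release"],
}

DWELL_THRESHOLD_S = 0.5  # assumed dwell time treated as an intentional selection

def decode_intent(fixation: Fixation) -> list:
    """Turn a sufficiently long fixation into a primitive action sequence."""
    if fixation.duration_s < DWELL_THRESHOLD_S:
        return []  # brief glance: treat as visual exploration, not intent
    return ACTION_GRAMMAR.get(fixation.object_label, [])

def send_to_robot(primitive: str, target: str) -> None:
    # Placeholder for the robot command interface; in a real system this
    # would dispatch to the arm and finger actuation controllers.
    print(f"executing '{primitive}' on '{target}'")

if __name__ == "__main__":
    gaze_stream = [Fixation("cup", 0.2), Fixation("cup", 0.9)]
    for fix in gaze_stream:
        for primitive in decode_intent(fix):
            send_to_robot(primitive, fix.object_label)

In the actual system the object label would come from the egocentric scene camera and the primitives would drive the robot's arm and fingers; the sketch collapses those stages into a single lookup and a print statement to show the cognitive-level structure of the control loop.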
