Gaze-based, Context-aware Robotic System for Assisted Reaching and Grasping

Assistive robotic systems endeavour to support those with movement disabilities, enabling them to move again and regain functionality. The main issue with these systems is the complexity of their low-level control, and how to translate it into simpler, higher-level commands that are easy and intuitive for a human user to interact with. We have created a multi-modal system, consisting of different sensing, decision-making and actuating modalities, leading to intuitive, human-in-the-loop assistive robotics. The system takes its cue from the user's gaze to decode their intentions and execute low-level motion actions that achieve high-level tasks. As a result, the user simply has to look at the objects of interest for the robotic system to assist them in reaching for those objects, grasping them, and using them to interact with other objects. We present our method for 3D gaze estimation, and our grammar-based implementation of action sequences with the robotic system. The 3D gaze estimation is evaluated with 8 subjects, showing an overall accuracy of 4.68 ± 0.14 cm. The full system is tested with 5 subjects, showing successful completion of 100% of reach-to-gaze-point actions, 96% of pick-and-place tasks, and 76% of pick-and-pour tasks. Finally, we discuss our results and the future work needed to improve the system.
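To make the grammar-based sequencing concrete, below is a minimal sketch, assuming a simple dictionary grammar in Python. All names here (TASK_GRAMMAR, GazeFixation, execute_task) and the primitive vocabulary are hypothetical illustrations, not the paper's actual grammar or robot interface. The idea shown is that a high-level task expands into an ordered sequence of low-level motion primitives, with each reach consuming the next 3D gaze fixation as its target.

```python
# Hypothetical sketch of grammar-based action sequencing driven by gaze.
# Names and primitives are assumptions for illustration, not the paper's API.

from dataclasses import dataclass

# Each high-level task expands into an ordered sequence of low-level
# motion primitives executed by the robot arm.
TASK_GRAMMAR = {
    "reach":          ["reach_to_gaze"],
    "pick_and_place": ["reach_to_gaze", "grasp", "reach_to_gaze", "release"],
    "pick_and_pour":  ["reach_to_gaze", "grasp", "reach_to_gaze", "pour"],
}

@dataclass
class GazeFixation:
    """A 3D gaze estimate in the robot's frame (x, y, z in metres)."""
    x: float
    y: float
    z: float

def execute_task(task: str, fixations: list[GazeFixation]) -> None:
    """Expand a task via the grammar; every 'reach_to_gaze' primitive
    consumes the next 3D gaze fixation as its motion target."""
    fixation_iter = iter(fixations)
    for primitive in TASK_GRAMMAR[task]:
        if primitive == "reach_to_gaze":
            target = next(fixation_iter)
            print(f"reach -> ({target.x:.2f}, {target.y:.2f}, {target.z:.2f})")
        else:
            print(f"execute primitive: {primitive}")

# Example: the user fixates first on a bottle, then on a glass; the
# system reaches, grasps, reaches to the second fixation, and pours.
execute_task("pick_and_pour", [GazeFixation(0.42, -0.10, 0.05),
                               GazeFixation(0.55, 0.12, 0.05)])
```

In a sketch like this, the grammar cleanly separates intention decoding (which task, which fixations) from actuation (the primitive sequence), which is the human-in-the-loop structure the abstract describes.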
