A Shared-Autonomy Approach to Goal Detection and Navigation Control of Mobile Collaborative Robots

Autonomous goal detection and navigation control of mobile robots in remote environments can relieve human operators of simple, monotonous tasks, allowing them to focus on more cognitively stimulating actions. This can improve task performance while yielding user interfaces that are understandable by non-experts. However, full autonomy in unpredictable and dynamically changing environments is still far from becoming a reality. Hence, teleoperated systems that integrate the supervisory role and instantaneous decision-making capacity of humans are still required for fast and reliable robotic operations. This work presents a novel shared-autonomy framework for goal detection and navigation control of mobile manipulators. The controller exploits human-gaze information to estimate the desired goal, which is combined with control-pad data to predict user intention and to activate the autonomous controller that executes the target task. Through the control pad, the user can react to unexpected disturbances and halt the autonomous mode at any time. By releasing the control pad (e.g., after avoiding a sudden obstacle), the controller smoothly switches back to the autonomous mode and navigates the robot towards the target. Experiments on reaching a target goal in the presence of unknown obstacles were carried out with seven subjects to evaluate the performance of the proposed framework. The results demonstrate the accuracy, time efficiency, and ease of use of the presented shared-autonomy control framework.
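The arbitration logic summarized above can be illustrated with a short sketch. The snippet below is a minimal, assumption-laden illustration rather than the authors' implementation: the candidate goal set, the `pad_deadband` threshold, the `SharedAutonomyController` class, and the capped proportional velocity law are all made up for the example, and the paper's smooth hand-back to autonomy is abbreviated to an instantaneous switch on pad release.

```python
import math
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical candidate goals in the robot's 2-D workspace (metres).
CANDIDATE_GOALS = {
    "table": (2.0, 1.5),
    "shelf": (-1.0, 3.0),
    "door": (4.0, -0.5),
}


@dataclass
class SharedAutonomyController:
    """Minimal arbitration loop: gaze selects the goal, while any deliberate
    control-pad input overrides autonomy until the pad is released."""
    pad_deadband: float = 0.05  # pad magnitudes below this count as "released"
    goal: Optional[Tuple[float, float]] = None

    def estimate_goal(self, gaze_point):
        """Pick the candidate goal closest to the gaze point projected into
        the workspace (a stand-in for gaze-based goal detection)."""
        gx, gy = gaze_point
        self.goal = min(
            CANDIDATE_GOALS.values(),
            key=lambda g: math.hypot(g[0] - gx, g[1] - gy),
        )
        return self.goal

    def step(self, robot_pose, gaze_point, pad_cmd):
        """Return a planar (vx, vy) velocity command for one control cycle."""
        # Manual mode: any deliberate pad input halts autonomy immediately.
        if math.hypot(pad_cmd[0], pad_cmd[1]) > self.pad_deadband:
            return pad_cmd
        # Autonomous mode: (re-)estimate the goal from gaze and steer toward
        # it with a simple capped proportional law.
        gx, gy = self.estimate_goal(gaze_point)
        dx, dy = gx - robot_pose[0], gy - robot_pose[1]
        dist = math.hypot(dx, dy) or 1e-6
        speed = min(0.5, dist)  # cap linear speed at 0.5 m/s
        return (speed * dx / dist, speed * dy / dist)


# Example cycle: the user gazes near the "table" goal with the pad released,
# so the controller drives autonomously toward (2.0, 1.5).
ctrl = SharedAutonomyController()
print(ctrl.step(robot_pose=(0.0, 0.0), gaze_point=(1.8, 1.4), pad_cmd=(0.0, 0.0)))
```

In a full system, step() would run at the control rate, and the hard switch on pad release would be replaced by a blending term to reproduce the smooth manual-to-autonomous transition described in the paper.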
