TeMoto: Intuitive Multi-Range Telerobotic System with Natural Gestural and Verbal Instruction Interface

Teleoperated mobile robots equipped with object manipulation capabilities provide a safe means of executing dangerous tasks in hazardous environments without putting humans at risk. However, mainly due to communication delays, complex operator interfaces and insufficient Situational Awareness (SA), the task productivity of telerobots remains inferior to that of human workers. This paper addresses these shortcomings through a combined approach of (i) a scalable and intuitive operator interface with gestural and verbal input, (ii) improved SA through sensor fusion according to documented best practices, (iii) integrated virtual fixtures that simplify tasks and reduce the operator's cognitive burden and (iv) integrated semiautonomous behaviors that further reduce cognitive burden and negate the impact of communication delays, execution latency and/or failures. The proposed teleoperation system, TeMoto, is implemented using ROS (Robot Operating System) to ensure hardware agnosticism, extensibility and community access. The operator's command interface consists of a Leap Motion Controller for hand tracking, a Griffin PowerMate USB knob for scaling and a microphone for speech input. TeMoto is evaluated on multiple robots, including two mobile manipulator platforms. In addition to standard, task-specific evaluation techniques (completion time, user studies, number of steps, etc.), which are platform and task dependent and thus difficult to scale, this paper presents additional metrics for evaluating the user interface, including task-independent criteria for measuring generalized (i) task completion efficiency and (ii) operator context switching.
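To illustrate the kind of scalable gestural command path the abstract describes, the following is a minimal ROS (rospy) sketch, not the authors' implementation: it combines a tracked hand pose with a turn-knob scaling factor and forwards a scaled end-effector command. The topic names ("hand_pose", "knob_scale", "ee_command"), message types and clamping scheme are assumptions for illustration only.

#!/usr/bin/env python
# Hypothetical sketch of a scaled gestural command node (not TeMoto source code).
import rospy
from geometry_msgs.msg import PoseStamped
from std_msgs.msg import Float32


class ScaledGestureCommander(object):
    def __init__(self):
        self.scale = 1.0  # updated from the knob; 1.0 = full workspace motion
        self.cmd_pub = rospy.Publisher("ee_command", PoseStamped, queue_size=1)
        rospy.Subscriber("knob_scale", Float32, self.on_scale)
        rospy.Subscriber("hand_pose", PoseStamped, self.on_hand_pose)

    def on_scale(self, msg):
        # Clamp the knob value so the operator cannot command zero or runaway motion.
        self.scale = max(0.01, min(1.0, msg.data))

    def on_hand_pose(self, msg):
        # Scale the hand displacement (expressed in the operator frame)
        # before publishing it as an end-effector command.
        cmd = PoseStamped()
        cmd.header.stamp = rospy.Time.now()
        cmd.header.frame_id = msg.header.frame_id
        cmd.pose.orientation = msg.pose.orientation
        cmd.pose.position.x = msg.pose.position.x * self.scale
        cmd.pose.position.y = msg.pose.position.y * self.scale
        cmd.pose.position.z = msg.pose.position.z * self.scale
        self.cmd_pub.publish(cmd)


if __name__ == "__main__":
    rospy.init_node("scaled_gesture_commander")
    ScaledGestureCommander()
    rospy.spin()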
