Intuitive Robot Teleoperation Through Multi-Sensor Informed Mixed Reality Visual Aids

Mobile robotic systems now carry sensors that can describe the robot's status and operating environment more accurately and reliably than ever before. Exploiting this sensor data effectively is challenging, however, because the sheer volume of data and its time-dependency constraints place a high cognitive load on the operator. This paper addresses this challenge in remote-vehicle teleoperation by presenting sensor data to users intuitively, using mixed reality and visual aids within the user interface. We propose a method for organizing information presentation, together with a set of visual aids that facilitate visual communication of data in teleoperation control panels. The resulting sensor-information presentation is coherent and intuitive, making it easier for an operator to grasp and comprehend the information, which increases situational awareness and speeds up decision-making. Our method is implemented on a real mobile robotic system operating outdoors, equipped with on-board internal and external sensors, GPS, and a reconstructed 3D graphical model provided by an assistant drone. Experiments verified feasibility, and a user assessment confirmed that the visual communication is intuitive and comprehensive, encouraging further development.
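To make the idea of organizing sensor information by urgency concrete, the following is a minimal, hypothetical sketch of one way a teleoperation panel could prioritize and color-code visual aids; the names, thresholds, and traffic-light scheme are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: rank sensor readings by urgency and assign each a
# traffic-light color for its visual aid. Thresholds and sensor names are
# invented for illustration only.
from dataclasses import dataclass

@dataclass
class SensorReading:
    name: str        # e.g. "battery", "obstacle_range" (illustrative)
    value: float
    warn: float      # value at or below this needs monitoring
    critical: float  # value at or below this demands immediate attention

def aid_color(reading: SensorReading) -> str:
    """Map a reading to a color encoding its urgency on screen."""
    if reading.value <= reading.critical:
        return "red"    # immediate operator attention
    if reading.value <= reading.warn:
        return "amber"  # worth monitoring
    return "green"      # nominal; can be rendered unobtrusively

def layout_panel(readings: list[SensorReading]) -> list[tuple[str, str]]:
    """Order aids so the most urgent appear first (top of the panel),
    reducing the visual search the operator must perform."""
    rank = {"red": 0, "amber": 1, "green": 2}
    colored = [(r.name, aid_color(r)) for r in readings]
    return sorted(colored, key=lambda item: rank[item[1]])

panel = layout_panel([
    SensorReading("battery", value=78.0, warn=30.0, critical=10.0),
    SensorReading("obstacle_range", value=0.4, warn=1.0, critical=0.5),
    SensorReading("gps_satellites", value=6.0, warn=5.0, critical=3.0),
])
print(panel)  # obstacle_range is critical, so its aid is listed first
```

The sort is stable, so aids of equal urgency keep their original order, which helps operators build a consistent mental map of the panel.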
