Enabling Intuitive Human-Robot Teaming Using Augmented Reality and Gesture Control

Human-robot teaming offers great potential because it combines the complementary strengths of heterogeneous agents. However, one of the critical challenges in realizing an effective human-robot team is efficient information exchange, both from the human to the robot and from the robot to the human. In this work, we present and analyze an augmented reality-enabled, gesture-based system that supports intuitive human-robot teaming through improved information exchange. Our proposed system requires no external instrumentation beyond human-wearable devices and shows promise for real-world applicability in service-oriented missions. Additionally, we present preliminary results from a pilot study with human participants and highlight lessons learned and open research questions that may help direct future development, fielding, and experimentation of autonomous human-robot interaction (HRI) systems.
