Come See This! Augmented Reality to Enable Human-Robot Cooperative Search

Abstract—Robots operating alongside humans in field environments have the potential to greatly increase the situational awareness of their human teammates. A significant challenge, however, is efficiently conveying what the robot perceives to the human in order to achieve improved situational awareness. We believe augmented reality (AR), which allows a human to simultaneously perceive the real world and digital information situated virtually in the real world, has the potential to address this issue. Motivated by the emerging prevalence of practical human-wearable AR devices, we present a system that enables a robot to perform cooperative search with a human teammate, in which the robot both shares search results and assists the human teammate in navigating to the search target. We demonstrate this capability in a search task in an uninstrumented environment, where the robot identifies and localizes targets and provides navigation directions via AR to bring the human to the correct target.
