Enabling Situational Awareness via Augmented Reality of Autonomous Robot-Based Environmental Change Detection

Accurately detecting changes in one's environment is important in many application domains but can be challenging for humans. Autonomous robots can readily be made to detect metric changes in the environment, yet, unlike humans, they struggle to understand the context of those changes. We present a novel system in which an autonomous robot performs point cloud-based change detection to facilitate information-gathering tasks and provide enhanced situational awareness. The robot communicates detected changes to a human teammate via augmented reality for evaluation. We present results from a fielded system using two differently equipped robots to examine how point cloud density affects the visualization of changes. Our results reveal trade-offs between implementations that we believe will be instructive for similar systems in the future.
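The abstract does not include implementation details. As a rough illustration of what point cloud-based change detection between two scans can look like, the following C++ sketch uses the octree change detector from the Point Cloud Library; PCL is an assumption here, not something the abstract specifies, and the scan file names and the resolution value are hypothetical. The octree resolution acts as a stand-in for the point cloud density trade-off the abstract discusses: a coarser resolution yields fewer, larger change regions to visualize.

// Minimal sketch (assumed PCL-based pipeline, not the authors' code):
// compare a reference scan against a revisit scan and report points that
// fall in voxels not occupied before, i.e. candidate changes to surface
// to a human teammate.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/io/pcd_io.h>
#include <pcl/octree/octree_pointcloud_changedetector.h>
#include <vector>
#include <iostream>

int main()
{
  // Octree voxel edge length in meters; plays the role of the
  // density/visualization trade-off discussed above (assumed value).
  const float resolution = 0.05f;

  pcl::PointCloud<pcl::PointXYZ>::Ptr before(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::PointCloud<pcl::PointXYZ>::Ptr after(new pcl::PointCloud<pcl::PointXYZ>);

  // Hypothetical file names for the reference scan and the revisit scan.
  pcl::io::loadPCDFile("scan_before.pcd", *before);
  pcl::io::loadPCDFile("scan_after.pcd", *after);

  pcl::octree::OctreePointCloudChangeDetector<pcl::PointXYZ> octree(resolution);

  // Build the octree from the reference scan, then switch buffers so the
  // revisit scan is compared against it.
  octree.setInputCloud(before);
  octree.addPointsFromInputCloud();
  octree.switchBuffers();
  octree.setInputCloud(after);
  octree.addPointsFromInputCloud();

  // Indices of points in the revisit scan occupying previously empty voxels.
  std::vector<int> changed_indices;
  octree.getPointIndicesFromNewVoxels(changed_indices);

  std::cout << "Detected " << changed_indices.size()
            << " changed points at resolution " << resolution << " m\n";
  return 0;
}

In a fielded system, the detected indices would then be clustered into change regions and rendered as augmented reality overlays for the human teammate; that step is outside the scope of this sketch.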
