MIMOSA: A Multi-Modal SLAM Framework for Resilient Autonomy against Sensor Degradation

This paper presents MIMOSA, a Multi-Modal SLAM framework that uses a nonlinear factor graph as its underlying representation to provide loosely coupled fusion of an arbitrary number of sensing modalities. Designed to enable resilient robotic autonomy in GPS-denied and perceptually degraded environments, MIMOSA currently includes modules for point cloud registration, fusion of multiple odometry estimates derived from visible-light and thermal cameras, and inertial measurement propagation. A flexible back-end incorporates the estimates from the various modalities as relative transformation factors. The method is made robust to degeneracy by maintaining and tracking modality-specific health metrics, and it is inherently tolerant to sensor failure. We detail the framework alongside our implementation for handling high-rate asynchronous sensor measurements, and we evaluate its performance on data from autonomous subterranean exploration missions with legged and aerial robots.
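The core idea of the back-end (relative transformation factors from several odometry sources, down-weighted by per-modality health metrics before being composed into the pose estimate) can be sketched as follows. This is a minimal, self-contained illustration, not MIMOSA's actual implementation: the function names, the SE(2) simplification, and the health-weighted averaging scheme are assumptions made for clarity; the paper's back-end operates on a full nonlinear factor graph.

```python
import math

def compose(pose, delta):
    """Compose an SE(2) pose (x, y, theta) with a relative motion (dx, dy, dtheta)."""
    x, y, th = pose
    dx, dy, dth = delta
    return (x + math.cos(th) * dx - math.sin(th) * dy,
            y + math.sin(th) * dx + math.cos(th) * dy,
            th + dth)

def fuse_relative_factors(deltas, healths):
    """Health-weighted fusion of per-modality relative transforms.

    deltas  -- list of (dx, dy, dtheta) estimates, one per modality
    healths -- list of scalar health metrics in [0, 1]; a failed or
               degenerate modality contributes weight ~0, so the fused
               estimate degrades gracefully instead of diverging.
    """
    total = sum(healths)
    if total == 0.0:
        raise ValueError("all modalities report zero health")
    fused = [0.0, 0.0, 0.0]
    for delta, w in zip(deltas, healths):
        for i in range(3):
            fused[i] += w * delta[i]
    return tuple(c / total for c in fused)

# Illustrative step: LiDAR odometry healthy, thermal-visual odometry degraded
# (e.g. low-contrast thermal imagery), so its estimate is largely discounted.
lidar_delta   = (1.00, 0.00, 0.00)
thermal_delta = (1.40, 0.20, 0.10)
fused = fuse_relative_factors([lidar_delta, thermal_delta], [0.9, 0.1])
pose = compose((0.0, 0.0, 0.0), fused)
```

In the actual framework, each fused relative transform would instead become a between-factor in the factor graph, with the health metric shaping the factor's noise model rather than a simple scalar weight.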
