Efficient on-board Stereo SLAM through constrained-covisibility strategies

Abstract Visual SLAM is a computationally expensive task, with a complexity that grows unbounded as the size of the explored area increases. This becomes an issue when targeting embedded applications such as on-board localization on Micro Aerial Vehicles (MAVs), where real-time execution is mandatory and computational resources are a limiting factor. The method proposed herein introduces a covisibility-graph-based map representation that allows a visual SLAM system to execute with a complexity independent of the size of the map. The proposed structure enables efficient selection of locally relevant portions of the map to be optimized, in such a way that the results resemble performing a full optimization over the whole trajectory. We build on S-PTAM (Stereo Parallel Tracking and Mapping), yielding an accurate and robust stereo SLAM system capable of working in real-time under limited hardware constraints such as those present in MAVs. The developed SLAM system is assessed using the EuRoC dataset. Results show that covisibility-graph-based map culling allows the SLAM system to run in real-time even on a low-resource embedded computer. The impact of each SLAM task on the overall system performance is analyzed in detail, and the SLAM system is compared with state-of-the-art methods to validate the presented approach.
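The local-map selection described above can be sketched as follows. This is a minimal, hypothetical illustration of the general covisibility-graph idea, not the authors' implementation: keyframes are graph nodes, edge weights count map points observed by both keyframes, and optimization is restricted to keyframes strongly covisible with the current one, so the cost does not grow with total map size. All names and the `min_shared` threshold are assumptions for illustration only.

```python
from collections import defaultdict

class CovisibilityGraph:
    """Toy covisibility graph: nodes are keyframe ids, edge weights
    count the map points jointly observed by a pair of keyframes."""

    def __init__(self):
        self.weights = defaultdict(int)    # (kf_a, kf_b) -> shared-point count
        self.neighbors = defaultdict(set)  # kf -> covisible keyframes

    def add_observation_overlap(self, kf_a, kf_b, shared_points):
        # Record that kf_a and kf_b observe `shared_points` common map points.
        if kf_a == kf_b:
            return
        edge = (min(kf_a, kf_b), max(kf_a, kf_b))
        self.weights[edge] += shared_points
        self.neighbors[kf_a].add(kf_b)
        self.neighbors[kf_b].add(kf_a)

    def local_map(self, current_kf, min_shared=15):
        """Select the keyframes sharing at least `min_shared` map points
        with the current keyframe; only these would enter optimization."""
        selected = {current_kf}
        for kf in self.neighbors[current_kf]:
            edge = (min(current_kf, kf), max(current_kf, kf))
            if self.weights[edge] >= min_shared:
                selected.add(kf)
        return selected
```

Because `local_map` touches only the current keyframe's covisible neighborhood, its cost depends on local scene overlap rather than on how far the vehicle has traveled, which is the property the paper exploits for bounded-complexity on-board operation.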
