Dense visual-inertial navigation system for mobile robots

Real-time dense mapping and pose estimation are essential for a wide range of navigation tasks in mobile robotic applications. We propose an odometry and mapping system that leverages the full photometric information from a stereo-vision system as well as inertial measurements in a probabilistic framework, while running in real-time on a single low-power Intel CPU core. Instead of performing mapping and localization on a set of sparse image features, we use the complete dense image intensity information in our navigation system. By incorporating a probabilistic model of the stereo sensor and the IMU, we can robustly estimate the ego-motion as well as a dense 3D model of the environment in real-time. The probabilistic formulation of the joint odometry estimation and mapping process enables efficient rejection of temporal outliers in ego-motion estimation as well as spatial outliers in the mapping process. To underline the versatility of the proposed navigation system, we evaluate it in a set of experiments on a multi-rotor system as well as on a quadrupedal walking robot. We tightly integrate our framework into the stabilization loop of the UAV and the mapping framework of the walking robot. The dense framework exhibits good tracking and mapping performance, in terms of both accuracy and robustness, in scenarios with highly dynamic motion patterns, while retaining a relatively small computational footprint. This makes it an ideal candidate for control and navigation tasks in unstructured GPS-denied environments, for a wide range of robotic platforms with power and weight constraints. The proposed framework is released as an open-source ROS package.
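The core of a dense direct odometry method of this kind is a photometric residual: every reference pixel is back-projected with its depth, transformed by a candidate pose, reprojected into the current image, and compared in intensity. The sketch below illustrates that residual computation only; it is a minimal illustration with assumed names (`photometric_residuals`) and nearest-neighbour sampling, not the paper's actual implementation, which additionally fuses IMU measurements and models sensor uncertainty.

```python
import numpy as np

def photometric_residuals(I_ref, depth, I_cur, K, T):
    """Per-pixel photometric residuals for dense direct image alignment.

    I_ref, I_cur : (h, w) grayscale images
    depth        : (h, w) depth map for the reference frame (e.g. from stereo)
    K            : (3, 3) camera intrinsics
    T            : (4, 4) candidate rigid-body pose (reference -> current)

    Returns the residual image (NaN where the warp leaves the image)
    and a validity mask.
    """
    h, w = I_ref.shape
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    u, v = np.meshgrid(np.arange(w), np.arange(h))

    # Back-project every reference pixel to a 3D point.
    z = depth
    X = (u - cx) / fx * z
    Y = (v - cy) / fy * z
    P = np.stack([X, Y, z, np.ones_like(z)], axis=-1).reshape(-1, 4)

    # Transform into the current frame and reproject.
    Pc = (T @ P.T).T
    zc = Pc[:, 2]
    uc = np.round(fx * Pc[:, 0] / zc + cx).astype(int)
    vc = np.round(fy * Pc[:, 1] / zc + cy).astype(int)

    valid = (zc > 0) & (uc >= 0) & (uc < w) & (vc >= 0) & (vc < h)
    r = np.full(h * w, np.nan)
    r[valid] = I_cur[vc[valid], uc[valid]] - I_ref.reshape(-1)[valid]
    return r.reshape(h, w), valid.reshape(h, w)
```

An ego-motion estimator would minimize a robust norm of these residuals over the pose `T` (iteratively, with a coarse-to-fine pyramid); a probabilistic weighting of the residuals is what allows temporal outliers to be down-weighted rather than corrupting the estimate.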
