eVO: A realtime embedded stereo odometry for MAV applications

The navigation of a miniature aerial vehicle (MAV) in GPS-denied environments requires a robust embedded visual localization method. In this paper, we describe a simple but efficient stereo visual odometry algorithm, called eVO, running onboard our quadrotor MAV at video rate. The proposed eVO algorithm relies on a keyframe scheme that reduces both the estimation drift and the computational cost. We quantitatively study the influence of the main parameters of the algorithm and tune them for optimal performance on various datasets. The eVO algorithm has been submitted to the KITTI odometry benchmark [1], where it ranked first at the date of submission, with an average translational drift of 1.93% and an average angular drift of less than 0.076 degrees/m. In addition, we have performed several experiments on our MAV with ego-localization provided by eVO, for instance for autonomous 3D environment modeling.
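The abstract does not detail the motion-estimation step, but stereo odometry pipelines of this kind typically recover the inter-frame pose from matched 3D landmarks via the closed-form least-squares alignment of Umeyama [23]. The sketch below (function name, synthetic landmarks, and motion values are our own illustration, not the paper's code) shows that alignment without the scale factor, as appropriate for calibrated stereo:

```python
import numpy as np

def rigid_transform_3d(src, dst):
    """Closed-form least-squares rigid transform (R, t) mapping src onto dst,
    in the style of Umeyama (1991), without the scale factor."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_d - R @ mu_s
    return R, t

# Synthetic check: recover a known camera motion from 3D landmarks.
rng = np.random.default_rng(0)
pts = rng.uniform(-5.0, 5.0, size=(50, 3))
theta = np.deg2rad(10.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.3, -0.1, 1.2])
R_est, t_est = rigid_transform_3d(pts, pts @ R_true.T + t_true)
```

In a real pipeline this solve would be wrapped in a RANSAC loop [12] over the triangulated correspondences to reject outlier matches before the final least-squares refinement.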

[1] Hans P. Moravec. Obstacle avoidance and navigation in the real world by a seeing robot rover, 1980.

[2] Jennifer A. Scott, et al. Algorithm 891: A Fortran virtual memory system, 2009, TOMS.

[3] G. Gerhart, et al. Stereo vision and laser odometry for autonomous helicopters in GPS-denied indoor environments, 2009.

[4] Michel Dhome, et al. Real Time Localization and 3D Reconstruction, 2006, 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06).

[5] Kurt Konolige, et al. Large-Scale Visual Odometry for Rough Terrain, 2007, ISRR.

[6] Tom Drummond, et al. Machine Learning for High-Speed Corner Detection, 2006, ECCV.

[7] P. Rousseeuw. Least Median of Squares Regression, 1984.

[8] Simon Lacroix, et al. Position estimation in outdoor environments using pixel tracking and stereovision, 2000, Proceedings 2000 ICRA. Millennium Conference. IEEE International Conference on Robotics and Automation. Symposia Proceedings (Cat. No.00CH37065).

[9] Ian D. Reid, et al. Adaptive relative bundle adjustment, 2009, Robotics: Science and Systems.

[10] Saied Moezzi, et al. Dynamic stereo vision, 1992.

[11] Gaurav S. Sukhatme, et al. An Experimental Study of Aerial Stereo Visual Odometry, 2007.

[12] Robert C. Bolles, et al. Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography, 1981, CACM.

[13] Andreas Geiger, et al. Are we ready for autonomous driving? The KITTI vision benchmark suite, 2012, 2012 IEEE Conference on Computer Vision and Pattern Recognition.

[14] Gaurav S. Sukhatme, et al. Combined Visual and Inertial Navigation for an Unmanned Aerial Vehicle, 2008, FSR.

[15] Edwin Olson, et al. AprilTag: A robust and flexible visual fiducial system, 2011, 2011 IEEE International Conference on Robotics and Automation.

[16] Andrew Howard, et al. Real-time stereo visual odometry for autonomous ground vehicles, 2008, 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[17] Marc Pollefeys, et al. Vision-based autonomous mapping and exploration using a quadrotor MAV, 2012, 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[18] James R. Bergen, et al. Visual odometry for ground vehicle applications, 2006, J. Field Robotics.

[19] Friedrich Fraundorfer, et al. Visual Odometry Part I: The First 30 Years and Fundamentals, 2011, IEEE Robotics & Automation Magazine.

[20] Carlo Tomasi, et al. Good features to track, 1994, 1994 Proceedings of IEEE Conference on Computer Vision and Pattern Recognition.

[21] Andreas Geiger, et al. Visual odometry based on stereo image sequences with RANSAC-based outlier rejection scheme, 2010, 2010 IEEE Intelligent Vehicles Symposium.

[22] Andrew W. Fitzgibbon, et al. KinectFusion: real-time 3D reconstruction and interaction using a moving depth camera, 2011, UIST.

[23] S. Umeyama. Least-Squares Estimation of Transformation Parameters Between Two Point Patterns, 1991, IEEE Trans. Pattern Anal. Mach. Intell.

[24] Wolfram Burgard, et al. G2o: A general framework for graph optimization, 2011, 2011 IEEE International Conference on Robotics and Automation.

[25] Patrick Rives, et al. Accurate Quadrifocal Tracking for Robust 3D Visual Odometry, 2007, Proceedings 2007 IEEE International Conference on Robotics and Automation.

[26] G. Klein, et al. Parallel Tracking and Mapping for Small AR Workspaces, 2007, 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality.

[27] F. Fraundorfer, et al. Visual Odometry Part II: Matching, Robustness, Optimization, and Applications, 2012, IEEE Robotics & Automation Magazine.

[28] Olivier Stasse, et al. MonoSLAM: Real-Time Single Camera SLAM, 2007, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[29] Ian D. Reid, et al. RSLAM: A System for Large-Scale Mapping in Constant-Time Using Stereo, 2011, International Journal of Computer Vision.

[30] Manolis I. A. Lourakis, et al. SBA: A software package for generic sparse bundle adjustment, 2009, TOMS.

[31] Frank Dellaert, et al. Visual odometry priors for robust EKF-SLAM, 2010, 2010 IEEE International Conference on Robotics and Automation.

[32] Frank Dellaert, et al. Flow separation for fast and robust stereo odometry, 2009, 2009 IEEE International Conference on Robotics and Automation.