Semi-independent Stereo Visual Odometry for Different Field of View Cameras

This paper presents a pipeline for stereo visual odometry using cameras with different fields of view. It provides a proof of concept of how constraining the respective field of view of each camera can lead to both accurate 3D reconstruction and robust pose estimation. At a fixed resolution, a narrow field of view yields a higher angular resolution and preserves image texture details, whereas a wide field of view allows features to be tracked over longer periods because the overlap between successive frames is larger. We propose a semi-independent stereo system in which each camera individually performs temporal multi-view optimization, while their initial parameters are jointly optimized in an iterative framework. Furthermore, the concept of lead and follow cameras is introduced to adaptively propagate information between the cameras. We evaluate the method qualitatively on two indoor datasets, and quantitatively on a synthetic dataset that allows comparison across different fields of view. A minimal sketch of the lead/follow idea is given below.
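To make the semi-independent lead/follow concept concrete, the sketch below runs one tracker per camera and, at each frame, seeds the less reliably tracking camera (the follower) with the lead camera's pose mapped through the fixed extrinsics. This is only an illustrative sketch under assumptions: the names (`MonoTracker`, `track`, `estimate_quality`, `run_semi_independent`), the field-of-view values, and the scoring rule are hypothetical and are not taken from the paper's implementation.

```python
import numpy as np

class MonoTracker:
    """Placeholder per-camera visual odometry front end (hypothetical)."""
    def __init__(self, fov_deg):
        self.fov_deg = fov_deg
        self.pose = np.eye(4)  # camera-to-world transform

    def track(self, frame, pose_prior=None):
        # A real tracker would run photometric/feature alignment followed by
        # windowed multi-view optimization; here we only apply the prior.
        if pose_prior is not None:
            self.pose = pose_prior  # seed from the lead camera's estimate
        return self.pose, self.estimate_quality(frame)

    def estimate_quality(self, frame):
        # Stand-in for a tracking-quality score (e.g. inlier count
        # or photometric residual); stubbed with a random value.
        return float(np.random.rand())

def run_semi_independent(frames_narrow, frames_wide, T_narrow_from_wide):
    """Run both trackers; the better-scoring camera leads each frame."""
    narrow = MonoTracker(fov_deg=60)    # illustrative FOV values
    wide = MonoTracker(fov_deg=150)
    prior_n = prior_w = None
    for f_n, f_w in zip(frames_narrow, frames_wide):
        pose_n, s_n = narrow.track(f_n, pose_prior=prior_n)
        pose_w, s_w = wide.track(f_w, pose_prior=prior_w)
        # The camera that tracked more reliably this frame is the lead;
        # its pose, mapped through the fixed extrinsics, seeds the follower.
        if s_n >= s_w:
            prior_n, prior_w = None, pose_n @ T_narrow_from_wide
        else:
            prior_n, prior_w = pose_w @ np.linalg.inv(T_narrow_from_wide), None
    return narrow.pose, wide.pose
```

In this sketch the extrinsic transform `T_narrow_from_wide` maps points from the wide camera frame to the narrow camera frame, so composing it with a camera-to-world pose converts the lead camera's estimate into a prior for the follower; the joint optimization of the initial parameters described in the abstract is not shown here.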
