Fusion of optical flow and inertial measurements for robust egomotion estimation

In this paper, we present a method for fusing optical flow and inertial measurements. To this end, we derive a novel visual error term that is better suited than the standard continuous epipolar constraint for extracting the information contained in optical flow measurements. By means of an unscented Kalman filter (UKF), this information is tightly coupled with the inertial measurements in order to estimate the egomotion of the sensor setup. The individual visual landmark positions are no longer part of the filter state, which significantly reduces the dimensionality of the state space and allows for a fast online implementation. A nonlinear observability analysis supports the proposed method from a theoretical standpoint. The filter is evaluated on real data against ground truth from a motion capture system.
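For context, the standard continuous (differential) epipolar constraint that the proposed error term is compared against can be sketched as follows. This formulation comes from the classical structure-from-motion literature, not from the paper itself; here $\mathbf{x}$ denotes a homogeneous image point in calibrated coordinates, $\dot{\mathbf{x}}$ its optical flow, $\mathbf{v}$ and $\boldsymbol{\omega}$ the camera's linear and angular velocity, and $\widehat{\cdot}$ the skew-symmetric (cross-product) matrix:

\[
\dot{\mathbf{x}}^{\top}\,\widehat{\mathbf{v}}\,\mathbf{x} \;+\; \mathbf{x}^{\top}\,\widehat{\boldsymbol{\omega}}\,\widehat{\mathbf{v}}\,\mathbf{x} \;=\; 0 .
\]

Each flow measurement thus constrains the sensor's velocities directly, without reference to the 3-D position of the observed point; this is what makes it possible, as the abstract notes, to drop the individual landmark positions from the filter state.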
