ACK-MSCKF: Tightly-Coupled Ackermann Multi-State Constraint Kalman Filter for Autonomous Vehicle Localization

Visual-Inertial Odometry (VIO) is subject to additional unobservable directions under the special motions of ground vehicles, which leads to larger pose estimation errors. To address this problem, we propose ACK-MSCKF, a tightly-coupled Ackermann visual-inertial odometry that fuses Ackermann error state measurements into the Stereo Multi-State Constraint Kalman Filter (S-MSCKF) through a tightly-coupled, filter-based mechanism. In contrast with S-MSCKF, in which the inertial measurement unit (IMU) propagates the vehicle motion and stereo visual measurements then correct the propagation, we successively update the propagated state with Ackermann error state measurements and visual measurements after the process model and state augmentation. In this way, additional constraints from the Ackermann measurements are exploited to improve pose estimation accuracy. Qualitative and quantitative experiments on real-world datasets from an Ackermann steering vehicle demonstrate that ACK-MSCKF significantly improves the pose estimation accuracy of S-MSCKF under the special motions of autonomous vehicles, and maintains accurate and robust pose estimation across different vehicle driving cycles and environmental conditions. The source code accompanying this paper is released for the robotics community.
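To make the update order concrete, the sketch below shows a generic error-state Kalman filter cycle in which IMU propagation and state augmentation are followed first by an Ackermann kinematic update and then by a visual update. This is a minimal illustration of the successive-update structure the abstract describes, not the paper's actual implementation: the names `propagate`, `augment`, `H_ack`, `H_vis`, the wheelbase value, and the state layout are all assumptions introduced here.

```python
# Minimal sketch of the ACK-MSCKF-style successive update cycle described in
# the abstract. All symbols and models below are illustrative assumptions.
import numpy as np

WHEELBASE = 2.7  # [m] assumed wheelbase of the Ackermann steering vehicle


def ekf_update(x, P, residual, H, R):
    """One standard EKF correction step on an error-state residual."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ residual                     # correct the state estimate
    I_KH = np.eye(len(x)) - K @ H
    P = I_KH @ P @ I_KH.T + K @ R @ K.T      # Joseph form keeps P symmetric PSD
    return x, P


def ackermann_measurement(v_rear, steer_angle):
    """Body-frame forward velocity and yaw rate from rear-wheel speed and
    steering angle, via the standard Ackermann (bicycle) kinematic model."""
    yaw_rate = v_rear * np.tan(steer_angle) / WHEELBASE
    return np.array([v_rear, yaw_rate])


def step(x, P, imu_batch, v_rear, steer_angle, vis_residual,
         H_ack, H_vis, R_ack, R_vis, propagate, augment):
    """One filter cycle: propagate, augment, then two successive updates."""
    x, P = propagate(x, P, imu_batch)        # IMU process model
    x, P = augment(x, P)                     # clone camera pose into the state
    # 1) Ackermann update: residual between measured and predicted (v, yaw rate)
    z_ack = ackermann_measurement(v_rear, steer_angle)
    x, P = ekf_update(x, P, z_ack - H_ack @ x, H_ack, R_ack)
    # 2) Visual update: reprojection residuals from tracked stereo features
    x, P = ekf_update(x, P, vis_residual, H_vis, R_vis)
    return x, P
```

When the two measurement noises are independent, applying the updates successively is algebraically equivalent to stacking both residuals into a single update; ordering the Ackermann block first simply lets the visual update operate on an estimate already constrained by the wheel kinematics.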
