Robust Visual-Inertial Integrated Navigation System Aided by Online Sensor Model Adaption for Autonomous Ground Vehicles in Urban Areas

The visual-inertial integrated navigation system (VINS) has been extensively studied over the past decades to provide accurate and low-cost positioning for autonomous systems. Satisfactory performance can be obtained in ideal scenarios with sufficient static environmental features. In deep urban areas, however, numerous dynamic objects can severely distort the feature-tracking process on which feature-based VINS relies. One well-known mitigation is to detect surrounding vehicles with deep neural networks and remove the features belonging to them. However, excessive feature exclusion can severely distort the geometry of the feature distribution, leaving only limited visual measurements. Instead of directly eliminating the features on dynamic objects, this study proposes to adapt the visual measurement model based on the quality of feature tracking. First, a self-tuning covariance estimation approach is proposed to model the uncertainty of each feature measurement by integrating two factors: (1) the geometry of the feature distribution (GFD) and (2) the quality of feature tracking. Second, an adaptive M-estimator is proposed to correct the measurement residual model and further mitigate the effects of outlier measurements, such as dynamic features. Unlike the conventional M-estimator, the proposed method effectively alleviates the reliance on excessive parameter tuning. Experiments were conducted in typical urban areas of Hong Kong with numerous dynamic objects. The results show that the proposed method effectively mitigates the effects of dynamic objects and achieves improved accuracy compared with the conventional VINS method.
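The two ideas in the abstract can be illustrated with a minimal sketch. The function names, the specific covariance-inflation formula, and the use of a MAD-based Huber threshold are assumptions for illustration only, not the paper's actual method: the paper's self-tuning covariance combines GFD and tracking quality, and its adaptive M-estimator tunes the robust kernel from the data rather than from a hand-set parameter.

```python
import numpy as np

def feature_covariance(base_sigma, track_length, reproj_error, max_track=10):
    """Hypothetical self-tuning measurement covariance.

    Inflates the noise of a feature measurement when its track is short
    or its reprojection error is large (a stand-in for the paper's
    tracking-quality term; the GFD term is omitted for brevity).
    """
    track_factor = max_track / max(track_length, 1)  # short track -> larger noise
    sigma = base_sigma * track_factor * (1.0 + reproj_error)
    return sigma ** 2

def adaptive_huber_weights(residuals):
    """Adaptive M-estimator sketch.

    The Huber threshold is derived from the residuals themselves via the
    median absolute deviation (MAD), so no kernel parameter has to be
    tuned by hand; outliers (e.g. features on moving vehicles) receive
    down-weighted influence instead of being hard-excluded.
    """
    r = np.asarray(residuals, dtype=float)
    mad = np.median(np.abs(r - np.median(r)))
    scale = 1.4826 * mad if mad > 0 else 1.0  # robust estimate of sigma
    k = 1.345 * scale                         # 95%-efficiency Huber threshold
    a = np.abs(r)
    return np.where(a <= k, 1.0, k / np.maximum(a, 1e-12))
```

With residuals `[0.1, -0.2, 0.15, 5.0]`, the three small residuals keep full weight 1.0 while the 5.0 outlier is strongly down-weighted, which is the qualitative behavior the abstract describes: dynamic features are suppressed without distorting the feature-distribution geometry through outright removal.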
