Informational Framework for Minimalistic Visual Odometry on Outdoor Robot

In an unknown environment, estimating the robot's trajectory in real time is one of the key issues for a successful robotic mission. In such an environment, absolute measurements, such as GPS data, may be unavailable. Moreover, estimating the position using only proprioceptive sensors, such as encoders and inertial measurement units (IMUs), generates errors that grow over time. This paper presents a multisensor fusion approach between an IMU and ground optical flow, used to estimate the position of a mobile robot while ensuring high-integrity localization. The data fusion is performed through the informational form of the Kalman filter, namely the information filter (IF). A fault detection and exclusion (FDE) step is added to exclude erroneous measurements from the fusion procedure, making it fault tolerant and ensuring high localization performance. The approach relies on the IF for state estimation and on tools from information theory for the FDE. It evaluates the quality of a measurement by the amount of information it provides, using informational metrics such as the Kullback–Leibler divergence. The approach is validated on data from experiments performed in outdoor environments under various conditions, including high-dynamic-range lighting and different ground textures.
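The abstract does not give the exact formulation, so the following Python sketch is only illustrative of the general scheme it describes: a Kalman filter in information form whose sensor updates are gated by a Kullback–Leibler divergence test. The `InformationFilter` class, the linear motion and measurement models, and the scalar `kl_threshold` are all assumptions introduced here for illustration; the paper's actual residual tests and models may differ.

```python
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """KL divergence D(N0 || N1) between two multivariate Gaussians."""
    k = mu0.shape[0]
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0)
                  + diff @ inv1 @ diff
                  - k
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

class InformationFilter:
    """Kalman filter in information form: Y = P^-1, y = P^-1 x."""

    def __init__(self, x0, P0):
        self.Y = np.linalg.inv(P0)   # information matrix
        self.y = self.Y @ x0         # information vector

    def predict(self, F, Q):
        # Prediction is easiest in covariance form; convert, propagate, convert back.
        P = np.linalg.inv(self.Y)
        x = F @ (P @ self.y)
        P = F @ P @ F.T + Q
        self.Y = np.linalg.inv(P)
        self.y = self.Y @ x

    def correct(self, z, H, R, kl_threshold=1.0):
        """Fuse measurement z = Hx + v, v ~ N(0, R), unless the FDE test
        flags it: a faulty measurement inconsistent with the prior belief
        produces a large KL divergence between prior and candidate posterior."""
        Rinv = np.linalg.inv(R)
        # Additive information contribution of this sensor (linear model).
        Y_new = self.Y + H.T @ Rinv @ H
        y_new = self.y + H.T @ Rinv @ z
        # Recover moments of prior and candidate posterior for the KL test.
        P_old = np.linalg.inv(self.Y); x_old = P_old @ self.y
        P_new = np.linalg.inv(Y_new);  x_new = P_new @ y_new
        d = kl_gaussian(x_new, P_new, x_old, P_old)  # D(posterior || prior)
        if d > kl_threshold:
            return False               # measurement excluded as faulty
        self.Y, self.y = Y_new, y_new  # measurement accepted and fused
        return True

# Hypothetical usage: 2-D position state, identity motion model.
iflt = InformationFilter(x0=np.zeros(2), P0=np.eye(2))
iflt.predict(F=np.eye(2), Q=0.01 * np.eye(2))
accepted = iflt.correct(z=np.array([0.05, -0.02]), H=np.eye(2), R=0.1 * np.eye(2))
```

The appeal of the information form for this kind of FDE is that each sensor's contribution enters additively (Y += HᵀR⁻¹H, y += HᵀR⁻¹z), so an individual measurement can be tested and excluded before fusion without reworking the rest of the update.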
