Positioning and navigation of mobile robot with asynchronous fusion of binocular vision system and inertial navigation system

Binocular stereovision-based positioning and inertial measurement-based positioning each have their own limitations. Asynchronous fusion of a binocular vision system and an inertial navigation system (INS) is therefore introduced for global positioning system (GPS)-denied environments with a fuzzy map. It aims to provide sequential, progressive updates for mobile robot positioning and navigation. The system consists of two off-the-shelf cameras and a low-cost inertial measurement unit (IMU). The localization procedure fuses the inertial data from the IMU with the absolute position data from the binocular vision system, which is based on corner features. The main contribution of this article is a novel fusion method that adapts to the different data rates at which the two modules operate. An asynchronous Kalman filter is proposed to fuse the results from the two modules, achieving intermittent correction of the INS localization. Experiments were carried out in an indoor laboratory environment, where dynamic tests validated the reliability and effectiveness of the proposed asynchronous fusion algorithm.
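The core idea of the asynchronous scheme can be sketched as a Kalman filter that propagates the state at the high IMU rate and applies a correction only when a lower-rate vision fix arrives. The sketch below is a minimal one-axis illustration under assumed noise parameters and a simple position/velocity state; it is not the paper's actual filter formulation, and all names and values (`q_accel`, `r_vision`, the 100 Hz / 10 Hz rates) are illustrative assumptions.

```python
import numpy as np

class AsyncKalmanFusion:
    """One-axis sketch of asynchronous IMU/vision fusion.

    State x = [position, velocity]^T. IMU acceleration drives the
    high-rate prediction; a vision position fix, whenever it arrives,
    triggers a low-rate Kalman correction (intermittent INS correction).
    Noise parameters are illustrative assumptions.
    """

    def __init__(self, q_accel=0.5, r_vision=0.05):
        self.x = np.zeros(2)              # [pos, vel] estimate
        self.P = np.eye(2)                # state covariance
        self.q_accel = q_accel            # assumed accel noise density
        self.R = np.array([[r_vision]])   # assumed vision position variance

    def predict(self, accel, dt):
        """High-rate INS step: integrate measured acceleration over dt."""
        F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity model
        B = np.array([0.5 * dt**2, dt])         # acceleration input map
        self.x = F @ self.x + B * accel
        Q = self.q_accel * np.outer(B, B)       # process noise from accel noise
        self.P = F @ self.P @ F.T + Q

    def update(self, z_pos):
        """Low-rate vision correction, run only when a fix arrives."""
        H = np.array([[1.0, 0.0]])              # vision observes position only
        y = z_pos - H @ self.x                  # innovation
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P

# Toy run: IMU at 100 Hz, vision at 10 Hz, true motion at 1 m/s.
kf = AsyncKalmanFusion()
dt = 0.01
for k in range(1, 501):
    t = k * dt
    kf.predict(accel=0.0, dt=dt)       # true acceleration is zero
    if k % 10 == 0:                    # vision fix every 10th IMU sample
        kf.update(z_pos=1.0 * t)       # exact position from vision

pos_err = abs(kf.x[0] - 5.0)           # truth at t = 5 s: pos 5 m, vel 1 m/s
vel_err = abs(kf.x[1] - 1.0)
```

The predict/update split is what makes the fusion rate-adaptive: the filter never waits for the camera, and each vision fix simply corrects whatever the INS has accumulated since the last one.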
