Real-time localization of a mobile robotic platform via fusion of Inertial and Visual Navigation Systems

An Inertial Navigation System (INS) is one of the most important components of a mobile robotic platform, whether ground- or air-based. It localizes the platform in the real world, identifying its location in terms of latitude and longitude or other related coordinate systems. A highly accurate and precise INS is expensive and is therefore unsuitable for general-purpose applications. It is consequently standard practice in mobile robotics to couple a low-grade commercial INS with another navigation device to obtain more accurate localization. Typically, the INS and a Global Positioning System (GPS) receiver are integrated using a Kalman Filter to provide accurate localization information for mobile robots. In certain scenarios, however, the mobile robot cannot acquire a GPS fix for long periods, especially when navigating indoors or in areas with inadequate GPS satellite coverage. In such cases, an additional source of position fixes is required. This paper describes an accurate and stable data fusion filter that integrates the position of a mobile robot from a Visual Navigation System (VNS) with the position from an INS to accurately localize the robot in the absence of GPS data. The research proposes a seven-error-state model and uses it in a Kalman Filter for data fusion. The filter is tuned and tested using dynamic and static data from the INS and the VNS. Simulation and experimental results show that the seven-error-state Kalman Filter provides a good balance between accuracy, robustness, and processing efficiency for a real-time implementation. Experiments also show that, in the absence of GPS data, only a couple of VNS fixes are sufficient to quickly correct the position of the mobile robotic platform, and three fixes at different times are sufficient to correct the INS velocity.
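The error-state fusion described above can be sketched as follows. This is a minimal, single-axis illustration with only two error states (position error and velocity error), not the paper's seven-error-state model; the noise parameters and the constant-velocity-bias INS simulation are assumed values chosen purely for illustration.

```python
# Minimal single-axis error-state Kalman Filter sketch for INS/VNS fusion.
# Illustration only: the paper's filter tracks seven error states; here we
# track just [position error, velocity error] along one axis.

class ErrorStateKF1D:
    def __init__(self, dt, q_accel, r_pos):
        self.dt = dt
        self.x = [0.0, 0.0]                    # [delta_p, delta_v]
        self.P = [[100.0, 0.0], [0.0, 10.0]]   # initial error covariance
        self.q = q_accel                       # process noise (accel variance)
        self.r = r_pos                         # VNS position noise variance

    def predict(self):
        """Propagate the error state with F = [[1, dt], [0, 1]]."""
        dt, q = self.dt, self.q
        self.x = [self.x[0] + dt * self.x[1], self.x[1]]
        p00, p01 = self.P[0]
        p10, p11 = self.P[1]
        # P <- F P F^T + Q, with Q for white-acceleration noise
        self.P = [
            [p00 + dt * (p01 + p10) + dt * dt * p11 + 0.25 * dt ** 4 * q,
             p01 + dt * p11 + 0.5 * dt ** 3 * q],
            [p10 + dt * p11 + 0.5 * dt ** 3 * q,
             p11 + dt * dt * q],
        ]

    def update(self, ins_pos, vns_pos):
        """Fuse a VNS position fix: z = p_INS - p_VNS, H = [1, 0]."""
        y = (ins_pos - vns_pos) - self.x[0]    # innovation
        s = self.P[0][0] + self.r              # innovation variance
        k0 = self.P[0][0] / s                  # Kalman gain, position row
        k1 = self.P[1][0] / s                  # Kalman gain, velocity row
        self.x = [self.x[0] + k0 * y, self.x[1] + k1 * y]
        p00, p01 = self.P[0]
        p10, p11 = self.P[1]
        # P <- (I - K H) P
        self.P = [[(1.0 - k0) * p00, (1.0 - k0) * p01],
                  [p10 - k1 * p00, p11 - k1 * p01]]


# A stationary robot whose INS drifts due to a 0.5 m/s velocity bias;
# noiseless VNS fixes of the true position (0) arrive once per second.
kf = ErrorStateKF1D(dt=1.0, q_accel=0.01, r_pos=0.01)
ins_pos = 0.0
for t in (1, 2, 3):
    kf.predict()
    ins_pos = 0.5 * t                          # drifting INS position
    kf.update(ins_pos, vns_pos=0.0)

corrected_pos = ins_pos - kf.x[0]              # INS position minus estimated error
estimated_vel_error = kf.x[1]
```

Consistent with the observation in the abstract, the first fix largely corrects the position error, while the velocity error only becomes observable once a second fix arrives at a different time; by the third fix both estimates have converged.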