A high-performance tracking system based on camera and IMU

We consider an indoor tracking system consisting of an inertial measurement unit (IMU) and a camera that detects markers in the environment. Many camera-based tracking systems are described in the literature and available commercially, and a few of them are also aided by an IMU. These systems follow a best-effort principle, so their performance varies with the situation. In contrast, we start from a specification of the required system performance, and the design follows an information-theoretic approach in which specific user scenarios are defined. Precise models of the camera and the IMU are derived for a fusion filter, and the theoretical Cramér-Rao lower bound and the Kalman filter performance are evaluated. In this study, we focus on the trade-off between camera quality and the marker density needed to achieve a tracking accuracy of one millimeter and one degree or better.
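
As a sketch of how such an evaluation typically proceeds (assuming linearized dynamics with additive Gaussian noise; the symbols F_k, H_k, Q_k, and R_k below are generic notation for the state-transition matrix, measurement Jacobian, and process and measurement noise covariances, not taken from the paper), the posterior Cramér-Rao bound can be propagated with the standard information-form recursion:

\[
J_{k+1} \;=\; \bigl(F_k\,J_k^{-1}\,F_k^{\top} + Q_k\bigr)^{-1} \;+\; H_{k+1}^{\top} R_{k+1}^{-1} H_{k+1},
\qquad
\operatorname{cov}\!\left(\hat{x}_k\right) \;\succeq\; J_k^{-1},
\]

initialized with \(J_0 = P_0^{-1}\). In this reading, each detected marker contributes an \(H^{\top} R^{-1} H\) term to the information matrix \(J\), so better camera quality (smaller \(R\)) and higher marker density (more measurement rows in \(H\)) both tighten the bound, which is the trade-off examined against the one-millimeter and one-degree target.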
