A Flexible Software Architecture for Hybrid Tracking

Fusion of vision-based and inertial pose estimation has many high-potential applications in navigation, robotics, and augmented reality. Our research aims at developing a fully mobile, completely self-contained tracking system that is able to estimate sensor motion from known 3D scene structure. This requires a highly modular and scalable software architecture for algorithm design and testing. As the main contribution of this paper, we discuss the design of our hybrid tracker and emphasize its key features: scalability, code reusability, and testing facilities. In addition, we present a mobile augmented reality application and first experiments with a fully mobile vision-inertial sensor head. Our hybrid tracking system is not only capable of real-time performance, but can also be used for offline analysis of tracker performance, comparison with ground truth, and evaluation of several pose estimation and information fusion algorithms. © 2004 Wiley Periodicals, Inc.
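The abstract describes a modular architecture in which vision and inertial pose estimators can be exchanged and evaluated both online and offline, but gives no code. The sketch below is a minimal C++ illustration, under our own assumptions, of what such an architecture could look like: interchangeable pose sources behind a common interface and a swappable fusion stage, so the same processing loop serves live tracking and replayed sensor logs. All class and function names (PoseEstimator, VisionEstimator, InertialEstimator, FusionFilter) are hypothetical and not the authors' API.

```cpp
#include <iostream>
#include <memory>
#include <vector>

// Minimal 6-DOF pose: position, orientation (quaternion), and timestamp.
struct Pose {
    double position[3]{0.0, 0.0, 0.0};
    double orientation[4]{1.0, 0.0, 0.0, 0.0};  // w, x, y, z
    double timestamp{0.0};
};

// Common interface for any pose source: a vision tracker, an inertial
// integrator, or a recorded log replayed for offline evaluation.
class PoseEstimator {
public:
    virtual ~PoseEstimator() = default;
    virtual bool estimate(double timestamp, Pose& out) = 0;
};

// Placeholder vision-based estimator (e.g. model-based pose from known
// 3D scene structure); a real implementation would grab a frame and solve.
class VisionEstimator : public PoseEstimator {
public:
    bool estimate(double timestamp, Pose& out) override {
        out.timestamp = timestamp;
        return true;
    }
};

// Placeholder inertial estimator; a real implementation would integrate
// gyro and accelerometer samples up to the requested timestamp.
class InertialEstimator : public PoseEstimator {
public:
    bool estimate(double timestamp, Pose& out) override {
        out.timestamp = timestamp;
        return true;
    }
};

// Fusion stage combining the individual estimates; the concrete filter
// (e.g. a Kalman filter) can be swapped without touching the sensors.
class FusionFilter {
public:
    Pose fuse(const std::vector<Pose>& estimates) {
        return estimates.empty() ? Pose{} : estimates.front();  // trivial stand-in
    }
};

int main() {
    std::vector<std::unique_ptr<PoseEstimator>> sensors;
    sensors.emplace_back(std::make_unique<VisionEstimator>());
    sensors.emplace_back(std::make_unique<InertialEstimator>());
    FusionFilter filter;

    // The same loop serves real-time tracking and offline replay: only the
    // estimators behind the interface change.
    for (double t = 0.0; t < 0.1; t += 0.02) {
        std::vector<Pose> estimates;
        for (auto& s : sensors) {
            Pose p;
            if (s->estimate(t, p)) estimates.push_back(p);
        }
        Pose fused = filter.fuse(estimates);
        std::cout << "t=" << fused.timestamp << " fused pose ready\n";
    }
    return 0;
}
```

In this kind of design, offline analysis against ground truth reduces to registering a log-replay estimator and a recorded reference trajectory in place of the live sensors, which matches the reusability and testing goals the abstract emphasizes.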
