Predicting and estimating the accuracy of n-ocular optical tracking systems

Marker-based optical tracking systems are widely used in augmented reality, medical navigation, and industrial applications. We propose a model for predicting the target registration error (TRE) in these tracking systems by estimating the fiducial location error (FLE) from two-dimensional errors on the image plane and propagating that error to a given point of interest. We design a set of experiments to estimate the model's parameters for any given tracking system, and present a study demonstrating the effect of different sources of error. The method is applied to real applications to show its usefulness for augmented reality systems in general. We also present a set of tools for visualizing the accuracy at design time.
