Real-time data fusion on tracking camera pose for direct visual guidance

To properly align objects between the real and virtual worlds in an augmented reality (AR) space, it is essential to continuously track the camera's exact 3D position and orientation, a task well known as the registration problem. Traditional vision-based or inertial-sensor-based solutions are mostly designed for well-structured environments, which are not available in outdoor, uncontrolled road navigation applications. This paper proposes a hybrid camera pose tracking system that combines vision, GPS, and 3D inertial gyroscope technologies. The fusion approach is based on our PMM (parameterized model matching) algorithm, in which a road shape model is derived from a digital map using the absolute road position given by GPS and is matched against road features extracted from the real image. Inertial data provide an initial estimate of the possible motion and also serve as a relative tolerance to stabilize the output. The proposed algorithms are validated with experimental results from real road tests under different road types and conditions.
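
The fusion loop described above can be illustrated with a minimal sketch, assuming a simple pinhole projection, a point-sampled road shape model, and a crude direct-search refinement of the pose. The function names (`project_road_model`, `matching_error`, `fuse_pose`), the focal length, and the coordinate-descent scheme are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of a vision/GPS/gyro fusion step: the gyro predicts the
# initial pose, vision refines it by matching a projected road model against
# extracted road features, and the correction is clamped to an inertial
# tolerance to stabilize the output. All names and constants are assumptions.
import numpy as np

def project_road_model(pose, road_model_points):
    """Project 3D road-model points (Nx3) into the image with a pinhole model."""
    yaw, pitch, roll, tx, ty, tz = pose
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    # Rotation built from yaw-pitch-roll (Z-Y-X convention).
    R = (np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]]) @
         np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]) @
         np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]]))
    cam = road_model_points @ R.T + np.array([tx, ty, tz])
    f = 800.0  # assumed focal length in pixels
    return f * cam[:, :2] / cam[:, 2:3]

def matching_error(pose, road_model_points, image_road_features):
    """Sum of squared distances between each projected model point and its
    nearest extracted road feature (a stand-in for the PMM matching cost)."""
    proj = project_road_model(pose, road_model_points)
    d = np.linalg.norm(proj[:, None, :] - image_road_features[None, :, :], axis=2)
    return float(np.sum(d.min(axis=1) ** 2))

def fuse_pose(prev_pose, gyro_delta, road_model_points, image_road_features,
              tolerance, step=1e-3, iters=50):
    """One fusion step; all pose arguments are length-6 numpy arrays."""
    pose = prev_pose + gyro_delta            # inertial prediction of the pose
    for _ in range(iters):                   # crude coordinate-descent refinement
        for i in range(len(pose)):
            for delta in (step, -step):
                trial = pose.copy()
                trial[i] += delta
                if (matching_error(trial, road_model_points, image_road_features) <
                        matching_error(pose, road_model_points, image_road_features)):
                    pose = trial
    # Clamp the vision correction so it stays within the inertial tolerance.
    return np.clip(pose,
                   prev_pose + gyro_delta - tolerance,
                   prev_pose + gyro_delta + tolerance)

# Example usage with placeholder data:
# new_pose = fuse_pose(np.zeros(6), np.zeros(6), model_pts, features, tolerance=0.05)
```

The clamping step reflects the abstract's idea that inertial data bound how far the vision-based correction may move the pose estimate in a single frame; the actual matching and optimization strategy in the paper may differ.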
