WEAR++: 3D model driven camera tracking on board the International Space Station

We present WEAR++, a wearable augmented reality system consisting of a head-mounted display, a camera, and an inertial measurement unit. This paper focuses on the visual camera tracking system developed for WEAR++. Using a 3D model of the scene, we first build a database of 3D-2D correspondences in an offline mapping procedure. During online operation, we match features from each new image against this database and track the camera pose with an Extended Kalman Filter using the recovered 3D-2D correspondences. By combining robust local features (SURF) with a frustum culling algorithm, we demonstrate that the system tracks the pose reliably even under jerky motion and image blur. Furthermore, we explain how the system was utilised by astronaut Frank De Winne on board the International Space Station for performing maintenance tasks.
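The frustum culling step mentioned above can be sketched as follows: before matching, map points are projected with the predicted camera pose, and only points that land inside the image and within a depth range are kept as match candidates. This is an illustrative sketch, not the authors' implementation; the function name, the near/far defaults, and the world-to-camera pose convention (`R`, `t`) are assumptions.

```python
import numpy as np

def in_frustum(points_3d, R, t, K, img_w, img_h, near=0.05, far=20.0):
    """Boolean mask of 3D map points inside the camera's view frustum.

    points_3d: (N, 3) world coordinates of map points.
    R, t:      world-to-camera rotation (3x3) and translation (3,).
    K:         3x3 pinhole intrinsics.
    near, far: depth clipping range in metres (illustrative defaults).
    """
    cam = points_3d @ R.T + t          # transform points into the camera frame
    z = cam[:, 2]                      # depth along the optical axis
    proj = cam @ K.T                   # apply intrinsics before the division
    with np.errstate(divide="ignore", invalid="ignore"):
        u = proj[:, 0] / z             # pixel coordinates by perspective division
        v = proj[:, 1] / z
    return ((z > near) & (z < far) &   # inside the depth range
            (u >= 0) & (u < img_w) &   # inside the image horizontally
            (v >= 0) & (v < img_h))    # inside the image vertically
```

Culling this way keeps the per-frame matching cost proportional to the visible part of the map rather than the whole station module, which matters when the database of 3D-2D correspondences is large.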
