Natural feature tracking on a mobile handheld tablet

This paper presents a natural feature tracking system for object recognition in real-life environments. The system builds on a local keypoint descriptor method, optimized and adapted to extract salient regions of the image. Each gallery object is characterized by its keypoints and their corresponding local descriptors. The method first recognizes gallery object features in new images using nearest neighbor classification, then estimates the camera pose and augments the image with registered synthetic graphics. We describe the optimizations required to achieve real-time performance on a mobile tablet. An experimental evaluation in real environments demonstrates that the method is accurate and robust.
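
The abstract outlines a pipeline of keypoint extraction, nearest-neighbor descriptor matching, and camera pose estimation for registered overlay. The sketch below shows one way such a pipeline can be assembled with OpenCV; the ORB detector, the 0.75 ratio-test threshold, the intrinsic matrix K, the MM_PER_PIXEL scale, the file name "gallery_object.png", and the planar-object assumption are illustrative stand-ins, not the configuration used in the paper.

    import cv2
    import numpy as np

    # Assumed physical scale of the reference image and assumed camera intrinsics;
    # in practice these would come from the gallery data and offline calibration.
    MM_PER_PIXEL = 0.25
    K = np.array([[800.0,   0.0, 320.0],
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    DIST = np.zeros(5)  # assume negligible lens distortion

    detector = cv2.ORB_create(nfeatures=1000)   # illustrative detector choice
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)

    # Offline stage: characterize the gallery object by keypoints and descriptors.
    ref = cv2.imread("gallery_object.png", cv2.IMREAD_GRAYSCALE)
    ref_kp, ref_des = detector.detectAndCompute(ref, None)

    def estimate_pose(frame_gray):
        """Detect the gallery object in a live frame and recover the camera pose."""
        kp, des = detector.detectAndCompute(frame_gray, None)
        if des is None:
            return None
        # Nearest-neighbor matching with a ratio test to discard ambiguous matches.
        good = []
        for pair in matcher.knnMatch(ref_des, des, k=2):
            if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
                good.append(pair[0])
        if len(good) < 10:
            return None
        # Treat the gallery object as planar: its keypoints lie at z = 0 in
        # object coordinates, scaled to metric units.
        obj_pts = np.array([[*ref_kp[m.queryIdx].pt, 0.0] for m in good],
                           dtype=np.float32) * MM_PER_PIXEL
        img_pts = np.array([kp[m.trainIdx].pt for m in good], dtype=np.float32)
        ok, rvec, tvec, _ = cv2.solvePnPRansac(obj_pts, img_pts, K, DIST)
        return (rvec, tvec) if ok else None

    def draw_overlay(frame, rvec, tvec):
        """Project the object's outline back into the frame as a registered overlay."""
        h, w = ref.shape
        corners = np.float32([[0, 0, 0], [w, 0, 0], [w, h, 0], [0, h, 0]]) * MM_PER_PIXEL
        proj, _ = cv2.projectPoints(corners, rvec, tvec, K, DIST)
        cv2.polylines(frame, [np.int32(proj).reshape(-1, 2)], True, (0, 255, 0), 2)
        return frame

Combining a ratio test with RANSAC-based PnP is a common way to make nearest-neighbor matching robust to outliers before the recovered pose is used for registration.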
