3D Object Surface Tracking Using Partial Shape Templates Trained from a Depth Camera for Spatial Augmented Reality Environments

We present a 3D object tracking method that uses a single depth camera for Spatial Augmented Reality (SAR). The drastic illumination changes in a SAR environment make object tracking difficult. Our method uses the depth camera both to train a partial shape template of the 3D physical object and to track it. The training enables marker-less tracking of the moving object under changing illumination. Tracking combines feature-based matching with frame-sequential matching of point clouds. Our method allows users to bring 3D objects of their choice into a dynamic SAR environment.
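
The abstract does not give implementation details, but the pipeline it describes (a feature-based match to establish the template's pose, followed by frame-sequential point-cloud registration) can be sketched with off-the-shelf tools. The snippet below is a minimal illustration under stated assumptions, not the authors' implementation: it assumes the open-source Open3D library, FPFH features for the coarse feature-based match, and point-to-plane ICP for the per-frame refinement; the function names and the VOXEL_SIZE parameter are choices made only for this sketch.

```python
import open3d as o3d

VOXEL_SIZE = 0.01  # meters; tuning value assumed for this sketch


def preprocess(pcd, voxel_size=VOXEL_SIZE):
    """Downsample a point cloud, estimate normals, and compute FPFH features."""
    down = pcd.voxel_down_sample(voxel_size)
    down.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=2 * voxel_size, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        down,
        o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel_size, max_nn=100))
    return down, fpfh


def feature_based_match(template_down, template_fpfh, frame_down, frame_fpfh,
                        voxel_size=VOXEL_SIZE):
    """Coarse 6-DoF pose of the template in the current frame via RANSAC on FPFH matches."""
    dist = 1.5 * voxel_size
    return o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        template_down, frame_down, template_fpfh, frame_fpfh,
        mutual_filter=True,
        max_correspondence_distance=dist,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
        ransac_n=3,
        checkers=[
            o3d.pipelines.registration.CorrespondenceCheckerBasedOnEdgeLength(0.9),
            o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(dist),
        ],
        criteria=o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))


def frame_sequential_refine(template_down, frame_down, init_pose,
                            voxel_size=VOXEL_SIZE):
    """Refine a pose estimate with point-to-plane ICP against the current depth frame."""
    return o3d.pipelines.registration.registration_icp(
        template_down, frame_down, 0.5 * voxel_size, init_pose,
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
```

In a tracking loop under these assumptions, the feature-based match would run once (or whenever tracking is lost) to obtain an initial pose, and the ICP refinement would run every frame, seeded with the previous frame's result (the `.transformation` field of the returned registration result).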
