Sticky projections — A new approach to interactive shader lamp tracking

Shader lamps can augment physical objects with projected virtual replications using a camera-projector system, provided that the physical and virtual objects are well registered. Precise registration and tracking have been a cumbersome and intrusive process in the past. In this paper, we present a new method for tracking arbitrarily shaped physical objects interactively. In contrast to previous approaches, our system is mobile and relies solely on the projection of the virtual replication to track the physical object and “stick” the projection to it. Our method consists of two stages: a fast pose initialization based on structured light patterns, and non-intrusive frame-by-frame tracking based on features detected in the projection. In the initialization phase, a dense point cloud of the physical object is reconstructed and precisely matched to the virtual model to achieve a perfect overlay of the projection. During the tracking phase, a radiometrically corrected virtual camera view based on the current pose prediction is rendered and compared to the captured image. Matched features are triangulated, providing a sparse set of surface points that is robustly aligned to the virtual model. The alignment transformation serves as the input for the new pose prediction. Quantitative experiments show that our approach can robustly track complex objects at interactive rates.
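
The abstract does not include an implementation, but the alignment step of the tracking phase (registering the sparse set of triangulated surface points to the virtual model) can be illustrated with a minimal sketch. The snippet below computes the closed-form least-squares rigid transform between two corresponding point sets, i.e. the core of one point-to-point ICP iteration; the function name, and the assumption that correspondences between triangulated points and model points are already known, are ours rather than the paper's.

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) such that dst ≈ R @ src + t.

    src, dst: (N, 3) arrays of corresponding 3D points, e.g. triangulated
    surface points (src) and their nearest points on the virtual model (dst).
    Closed-form Kabsch solution via SVD -- one point-to-point ICP iteration.
    (Illustrative sketch; not taken from the paper.)
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the optimal orthogonal matrix.
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = dst_c - R @ src_c
    return R, t
```

Composing the resulting (R, t) with the current pose prediction would yield the corrected pose for the next frame, mirroring the last step of the tracking loop described above; in a full system this step would presumably be embedded in a robust estimator with outlier rejection, as suggested by the paper's "robustly aligned".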
