Geometric Compensation of Dynamic Video Projections

Projector-camera systems used in Spatial Augmented Reality automatically adapt video projections to scene objects under changing visualization conditions. This paper introduces a novel non-invasive method (requiring no Structured Light) that combines traditional Feature Matching (FM) with computationally cheaper Optical Flow (OF). It needs only one projected and one acquired image at a time, even in the most difficult case where both the projected content and the geometric transformations change every frame, and it detects the scene changes in which OF fails and must therefore be replaced by FM. In the experiments, we show that the method yields more precise and less shaky compensation for different types of projected videos and is up to 2.8 times faster than previous FM-based works.
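The abstract only outlines the hybrid FM/OF scheme. Below is a minimal sketch, assuming an OpenCV pipeline, of how ORB feature matching could re-initialise a projector-to-camera homography whenever pyramidal Lucas-Kanade optical flow loses too many tracked points; the function names (match_features, track_flow, compensate), the MIN_TRACKED threshold, and the homography-chaining step are illustrative assumptions, not the paper's implementation.

import cv2
import numpy as np

MIN_TRACKED = 30  # assumed threshold below which OF is declared to have failed

def match_features(proj_gray, cam_gray):
    # Feature Matching (FM) path: ORB keypoints, brute-force matching, RANSAC homography.
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(proj_gray, None)
    kp2, des2 = orb.detectAndCompute(cam_gray, None)
    if des1 is None or des2 is None:
        return None, None
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    if len(matches) < 4:
        return None, None
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H, dst

def track_flow(prev_cam, cam_gray, prev_pts):
    # Optical Flow (OF) path: pyramidal Lucas-Kanade tracking between consecutive camera frames.
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_cam, cam_gray, prev_pts, None)
    good = status.ravel() == 1
    return prev_pts[good], next_pts[good]

def compensate(proj_gray, cam_gray, state):
    # One hybrid step: update the homography by OF when enough points survive,
    # otherwise fall back to full FM re-initialisation.
    H = None
    if state.get("pts") is not None and state.get("prev_cam") is not None:
        old, new = track_flow(state["prev_cam"], cam_gray, state["pts"])
        if len(new) >= MIN_TRACKED:
            dH, _ = cv2.findHomography(old, new, cv2.RANSAC, 3.0)
            if dH is not None and state.get("H") is not None:
                H = dH @ state["H"]  # chain the incremental camera motion onto the last estimate
                state["pts"] = new
    if H is None:
        # OF failed (scene change or too few tracked points): re-run FM.
        H, state["pts"] = match_features(proj_gray, cam_gray)
    state["prev_cam"], state["H"] = cam_gray, H
    return H  # projector-to-camera homography

In a real projector-camera loop, this homography (or its inverse) would presumably be used to pre-warp the next projected frame before display; the OF branch is what makes the per-frame update cheap, while the FM branch handles the failure cases noted in the abstract.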
