Online Registration of Dynamic Scenes using Video Extrapolation

An online process is proposed for video registration of dynamic scenes, such as scenes with dynamic textures or moving objects. The process has three steps: (i) a few frames are assumed to be already registered; (ii) using the registered frames, the next frame is extrapolated; (iii) the actual new frame is registered to the extrapolated frame. Video extrapolation overcomes the bias introduced by dynamics in the scene, even when the dynamic regions cover almost the entire image, and it compensates not only for motion but also for many fluctuations in intensity. The traditional "brightness constancy" assumption is thus replaced with "dynamics constancy".
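The three-step loop above can be sketched as follows. This is a minimal illustration, not the paper's method: the `extrapolate` function here is a simple linear predictor on 1-D signals (the paper builds a richer "dynamics constancy" prediction from the registered history), and `register` is a brute-force integer-shift search standing in for parametric image alignment.

```python
import numpy as np

def extrapolate(registered):
    """Predict the next frame from already-registered frames.
    Placeholder: linear extrapolation from the last two frames;
    the paper uses a richer dynamics-based predictor."""
    prev, last = registered[-2], registered[-1]
    return 2.0 * last - prev

def register(frame, reference, max_shift=5):
    """Align `frame` to `reference` by brute-force search over
    integer circular shifts, minimizing sum-of-squared differences.
    Stands in for the paper's parametric alignment step."""
    best_shift, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        err = np.sum((np.roll(frame, s) - reference) ** 2)
        if err < best_err:
            best_shift, best_err = s, err
    return np.roll(frame, best_shift), best_shift

def online_register(frames, n_bootstrap=2):
    """Online loop: (i) assume the first frames are registered,
    (ii) extrapolate the next frame from the registered history,
    (iii) register the actual new frame to the prediction."""
    registered = list(frames[:n_bootstrap])
    shifts = []
    for frame in frames[n_bootstrap:]:
        predicted = extrapolate(registered)
        aligned, shift = register(frame, predicted)
        registered.append(aligned)
        shifts.append(shift)
    return registered, shifts
```

Because each new frame is compared to a *prediction* of the scene's appearance rather than to the previous frame, a globally brightening or dynamically textured scene does not bias the recovered camera motion.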
