Motion Parallax without Motion Compensation in 3D Cluttered Scenes

When an observer moves through a rigid 3D scene, points near the observer move with a different image velocity than points far away. The difference between these image velocity vectors defines the direction of motion parallax, which points toward the observer's translation direction. Estimates of the direction of motion parallax are therefore useful for estimating the observer's translation direction. Standard methods for computing the direction of motion parallax either rely on precomputed optical flow or use motion compensation to remove the local image shift caused by observer rotation. Here we present a simple Fourier-based method that estimates the direction of motion parallax directly, requiring neither optical flow nor motion compensation. The method runs in real time and performs accurately in image regions where multiple motions are present.
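The geometric fact underlying the abstract, that the difference of image velocities at nearby depths is independent of observer rotation and points along the line to the translation direction's image (the epipole), follows from the standard instantaneous motion field equations (Longuet-Higgins and Prazdny). The sketch below illustrates this with NumPy; it is not the paper's Fourier-based method, just a numerical check of the parallax property, with arbitrary example values for the camera motion and depths:

```python
import numpy as np

def image_velocity(x, y, Z, T, w, f=1.0):
    """Instantaneous motion field at image point (x, y) for a scene point at
    depth Z, camera translation T = (Tx, Ty, Tz) and rotation w = (wx, wy, wz),
    focal length f (standard Longuet-Higgins & Prazdny equations)."""
    Tx, Ty, Tz = T
    wx, wy, wz = w
    u = (-f*Tx + x*Tz)/Z + (x*y/f)*wx - (f + x*x/f)*wy + y*wz
    v = (-f*Ty + y*Tz)/Z + (f + y*y/f)*wx - (x*y/f)*wy - x*wz
    return np.array([u, v])

# Two scene points projecting to the same pixel at different depths
# (example values; any generic choice works).
T = np.array([0.2, -0.1, 1.0])      # camera translation
w = np.array([0.05, 0.02, -0.01])   # camera rotation (depth-independent field)
x, y = 0.3, 0.4
v_near = image_velocity(x, y, Z=2.0, T=T, w=w)
v_far  = image_velocity(x, y, Z=10.0, T=T, w=w)

# The rotational component cancels in the difference, leaving pure parallax.
parallax = v_near - v_far

# The epipole (focus of expansion) is the image of the translation direction.
epipole = np.array([T[0]/T[2], T[1]/T[2]])   # f = 1
to_pixel = np.array([x, y]) - epipole

# Parallax is parallel to the line joining the pixel and the epipole,
# so the 2D cross product vanishes.
cross = parallax[0]*to_pixel[1] - parallax[1]*to_pixel[0]
```

Because the rotational part of the motion field does not depend on depth, it cancels exactly in the velocity difference, which is why the parallax direction constrains the translation direction without any motion compensation.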
