Can Lucas-Kanade be used to estimate motion parallax in 3D cluttered scenes?

When an observer moves through a static 3D scene, the motion field depends on the depths of the visible objects and on the observer's instantaneous translation and rotation. By computing the difference between nearby motion field vectors, the observer can estimate the direction of local motion parallax and, in turn, the direction of heading. It has recently been argued that in 3D cluttered scenes, such as a forest, computing local image motion with classical optical flow methods is problematic because these methods break down at depth discontinuities, and hence that estimating local motion parallax from optical flow should be problematic as well. In this paper we evaluate this claim. We use the classical Lucas-Kanade method to estimate optical flow and the Rieger-Lawton method to estimate the direction of motion parallax from the estimated flow, and we compare the resulting parallax estimates to those of the frequency-based method of Mann and Langer. We find that if the Lucas-Kanade estimates are sufficiently pruned, using both an eigenvalue condition and a mean absolute error condition, then the Lucas-Kanade/Rieger-Lawton method can perform as well as or better than the frequency-based method.
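For concreteness, the sketch below (our illustration, not the implementation evaluated in the paper) shows the two stages described above: a window-based Lucas-Kanade least-squares flow estimate, pruned by an eigenvalue condition on the structure tensor and by a mean absolute error condition on the fit residual, followed by a Rieger-Lawton-style difference of two nearby flow vectors whose direction approximates the local motion parallax. The function names and the thresholds tau_eig and tau_err are illustrative assumptions, not values from the paper.

```python
import numpy as np

def lucas_kanade_window(Ix, Iy, It, tau_eig=1e-2, tau_err=0.1):
    """Estimate flow (u, v) from spatial gradients Ix, Iy and temporal
    gradient It inside one window; return None if either pruning test fails."""
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)   # N x 2 gradient matrix
    b = It.ravel()                                   # N temporal derivatives

    ATA = A.T @ A                                    # 2 x 2 structure tensor
    eigvals = np.linalg.eigvalsh(ATA)                # ascending order
    if eigvals[0] < tau_eig:                         # eigenvalue condition:
        return None                                  # too little texture / aperture problem

    uv = -np.linalg.solve(ATA, A.T @ b)              # least-squares solution of A [u v]^T = -b
    mae = np.mean(np.abs(A @ uv + b))                # residual of the brightness-constancy fit
    if mae > tau_err:                                # mean absolute error condition:
        return None                                  # single-motion model violated (e.g. depth edge)
    return uv

def parallax_direction(flow_a, flow_b):
    """Rieger-Lawton-style local motion parallax: differencing two nearby flow
    vectors cancels the (locally near-constant) rotational component, so the
    difference points along the translational/parallax direction."""
    d = np.asarray(flow_a, dtype=float) - np.asarray(flow_b, dtype=float)
    norm = np.linalg.norm(d)
    return d / norm if norm > 0 else None
```

In a full pipeline, Ix, Iy, and It would be computed from spatial and temporal derivatives of consecutive frames, the window would be slid over the image, and the resulting parallax directions at many locations would then be combined to estimate the heading direction.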

[1] Michael S. Langer et al., "Estimating camera motion through a 3D cluttered scene," First Canadian Conference on Computer and Robot Vision (CRV), 2004.

[2] J. Weickert et al., "Lucas/Kanade meets Horn/Schunck: combining local and global optic flow methods," International Journal of Computer Vision, 2005.

[3] Berthold K. P. Horn et al., "Determining Optical Flow," Artificial Intelligence, 1981.

[4] Michael J. Black et al., "The Robust Estimation of Multiple Motions: Parametric and Piecewise-Smooth Flow Fields," Computer Vision and Image Understanding, 1996.

[5] Emanuele Trucco et al., "Introductory Techniques for 3-D Computer Vision," Prentice Hall, 1998.

[6] Simon Baker et al., "Lucas-Kanade 20 Years On: A Unifying Framework," International Journal of Computer Vision, 2004.

[7] Shmuel Peleg et al., "A Three-Frame Algorithm for Estimating Two-Component Image Motion," IEEE Transactions on Pattern Analysis and Machine Intelligence, 1992.

[8] David J. Fleet et al., "Performance of Optical Flow Techniques," International Journal of Computer Vision, 1994.

[9] Yair Weiss et al., "Smoothness in layers: Motion segmentation using nonparametric mixture estimation," Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 1997.

[10] Allan D. Jepson et al., "Subspace methods for recovering rigid motion I: Algorithm and implementation," International Journal of Computer Vision, 2004.

[11] William B. Thompson et al., "Analysis of Accretion and Deletion at Boundaries in Dynamic Scenes," IEEE Transactions on Pattern Analysis and Machine Intelligence, 1984.

[12] Takeo Kanade et al., "Optical Navigation by the Method of Differences," IJCAI, 1985.

[13] Richard Mann et al., "Spectrum analysis of motion parallax in a 3D cluttered scene and application to egomotion," Journal of the Optical Society of America A, 2005.

[14] H. C. Longuet-Higgins et al., "The interpretation of a moving retinal image," Proceedings of the Royal Society of London, Series B, 1980.

[15] David J. Heeger et al., "Subspace methods for recovering rigid motion I," 1992.

[16] J. H. Rieger et al., "Processing differential image motion," Journal of the Optical Society of America A, 1985.