Radon-based structure from motion without correspondences

We present a novel approach for estimating 3D motion directly from two images using the Radon transform. We assume a similarity function defined on the cross-product of the two images, which assigns a weight to every pair of features. This similarity function is integrated over all feature pairs that satisfy the epipolar constraint, an integration that is equivalent to filtering the similarity function with a Dirac function embedding the epipolar constraint. The result of this convolution is a function of the five unknown motion parameters, with maxima at the positions of compatible rigid motions. The breakthrough is the realization that the Radon transform is a filtering operator: if images are defined on the sphere and the epipolar constraint is expressed as a group action of two rotations on two spheres, then the Radon transform becomes a convolution/correlation integral. We propose a new algorithm to compute this integral from the spherical harmonics of the similarity and Dirac functions. The resulting resolution in motion space depends on the bandwidth retained from the spherical harmonic transform. The strength of the algorithm lies in avoiding any commitment to correspondences, which makes it robust to erroneous feature detection, outliers, and multiple motions. The algorithm has been tested on sequences of real omnidirectional images, where it outperforms correspondence-based structure from motion.
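
As a sketch only (the symbols $f$, $p$, $q$, $G$, $E$, $E_0$, $R_1$, $R_2$ used here are illustrative and not taken from the paper), one way to make the integral described above concrete is as a Radon-type transform of the similarity function $f(p,q)$ over spherical image points $p, q \in S^2$:

\[
G(E) \;=\; \int_{S^2}\!\int_{S^2} f(p,q)\,\delta\!\left(p^{\top} E\, q\right)\, dq\, dp ,
\]

where $\delta$ is the Dirac kernel embedding the epipolar constraint and $E$ is an essential matrix carrying the five motion parameters (two for the direction of translation, three for the rotation). Writing $E = R_1^{\top} E_0 R_2$ for a fixed canonical $E_0$, in line with the group-action view above, turns $G$ into a correlation over the pair of rotations $(R_1, R_2)$; this is the form that can be evaluated from the spherical harmonics of $f$ and of the Dirac kernel, and maxima of $G$ indicate rigid motions compatible with the image pair.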
