Improvement of feature matching in catadioptric images using gyroscope data

Most vision-based algorithms for motion and localization estimation require matching interest points between a pair of images. Once feature correspondences are established, camera motion or localization can be estimated using epipolar geometry. However, feature matching remains a challenging problem, for instance because of time constraints or image variability. In several robotic applications, the camera rotation may be known thanks to a gyroscope or another orientation sensor. In this paper, we therefore aim to answer the following question: can the rotation provided by a gyroscope be used to improve feature matching? To analyze this new approach to camera and gyroscope data fusion, we proceed in two steps. First, we rotationally align the images using the rotation measured by the gyroscope. Second, we compare the quality of feature matching in the original and in the rotationally aligned images. Experimental results on a real catadioptric sequence show that gyroscope data noticeably increases the number of inliers with respect to the epipolar geometry.
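A minimal sketch of the two-step procedure described above, not the authors' implementation. It assumes a central catadioptric camera described by the unified sphere model; the mirror parameter `xi`, the intrinsic matrix `K`, the image file names and the gyroscope rotation `R_gyro` are illustrative placeholders, not values from the paper. Step 1 rotationally aligns the second image by lifting each pixel onto the unit viewing sphere, rotating the rays with the gyroscope rotation and reprojecting; step 2 matches SIFT features between the first image and the aligned second image, on which the epipolar-inlier count used in the evaluation would then be computed.

```python
# Illustrative sketch only (not the authors' code); assumes a central
# catadioptric camera modelled by the unified sphere model with mirror
# parameter xi, and a gyroscope rotation already expressed in the camera frame.
import cv2
import numpy as np

def lift_to_sphere(u, v, K, xi):
    """Back-project pixel coordinates (u, v) onto the unit viewing sphere."""
    x = (u - K[0, 2]) / K[0, 0]
    y = (v - K[1, 2]) / K[1, 1]
    r2 = x * x + y * y
    eta = (xi + np.sqrt(1.0 + (1.0 - xi * xi) * r2)) / (r2 + 1.0)
    return np.stack([eta * x, eta * y, eta - xi], axis=-1)

def project_from_sphere(S, K, xi):
    """Project unit-sphere points back to pixel coordinates."""
    z = S[..., 2] + xi
    u = K[0, 0] * S[..., 0] / z + K[0, 2]
    v = K[1, 1] * S[..., 1] / z + K[1, 2]
    return u, v

def rotationally_align(img, R, K, xi):
    """Warp a catadioptric image so that the rotation R is compensated."""
    h, w = img.shape[:2]
    u, v = np.meshgrid(np.arange(w, dtype=np.float64),
                       np.arange(h, dtype=np.float64))
    rays = lift_to_sphere(u, v, K, xi)   # rays of the output (aligned) image
    rays_src = rays @ R.T                # rotate every ray by R; use rays @ R
                                         # if the gyroscope convention is inverted
    u_src, v_src = project_from_sphere(rays_src, K, xi)
    return cv2.remap(img, u_src.astype(np.float32), v_src.astype(np.float32),
                     cv2.INTER_LINEAR)

if __name__ == "__main__":
    # Placeholder calibration and data (assumptions, not the paper's values).
    K = np.array([[350.0, 0.0, 320.0],
                  [0.0, 350.0, 240.0],
                  [0.0, 0.0, 1.0]])
    xi = 0.95
    R_gyro = np.eye(3)                   # relative rotation from the gyroscope
    img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("frame2.png", cv2.IMREAD_GRAYSCALE)

    # Step 1: rotational alignment of the second image.
    img2_aligned = rotationally_align(img2, R_gyro, K, xi)

    # Step 2: match SIFT features between the first image and the aligned
    # second image; the same is done with the original pair, and the numbers
    # of matches consistent with the epipolar geometry are compared.
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2_aligned, None)
    matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.8 * n.distance]
    print(f"{len(good)} putative matches after the ratio test")
```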
