Omnidirectional visual odometry for a planetary rover

Position estimation for planetary rovers has typically been limited to odometry based on proprioceptive measurements, such as the integration of distance traveled and measured heading changes. Here we present and compare two methods of online visual odometry suited to planetary rovers. Both methods use omnidirectional imagery to estimate the motion of the rover. The first is based on robust estimation of optical flow and subsequent integration of the flow; the second is a full structure-from-motion solution. To make the comparison meaningful, both methods use the same set of raw corresponding visual features. The dataset is a sequence of 2000 images taken during a field experiment in the Atacama Desert, for which high-resolution GPS ground truth is available.
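The flow-based method described above can be illustrated with a minimal sketch: robustly estimate the dominant image motion from a set of noisy flow vectors per frame pair, then integrate those estimates over the sequence. The function names and the use of a per-axis median as the robust estimator are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def robust_frame_motion(flow_vectors):
    """Estimate the dominant 2-D image motion from noisy flow vectors.

    The per-axis median rejects outlier correspondences (e.g. mismatched
    features); it stands in here for whatever robust estimator the
    flow-based method actually uses.
    """
    return np.median(flow_vectors, axis=0)

def integrate_motion(per_frame_flows, start=(0.0, 0.0)):
    """Integrate per-frame motion estimates into a cumulative path."""
    position = np.asarray(start, dtype=float)
    path = [position.copy()]
    for vectors in per_frame_flows:
        position += robust_frame_motion(vectors)
        path.append(position.copy())
    return np.array(path)

# Synthetic example: two frame pairs of mostly consistent flow,
# the first containing one gross outlier.
flows = [
    np.array([[1.0, 0.0], [1.1, 0.1], [0.9, -0.1], [10.0, 10.0]]),
    np.array([[0.0, 1.0], [0.1, 1.1], [-0.1, 0.9]]),
]
print(integrate_motion(flows)[-1])  # final integrated position
```

The outlier in the first frame has essentially no effect on the estimate, which is the point of using a robust statistic before integration: with simple averaging, a single bad correspondence would bias every subsequent position.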
