Robust stereo ego-motion for long distance navigation

Several methods for computing observer motion from monocular and stereo image sequences have been proposed. However, accurate positioning over long distances requires a higher level of robustness than previously achieved. This paper describes several mechanisms for improving robustness in the context of a maximum-likelihood stereo ego-motion method. We demonstrate that even a robust system accumulates super-linear error in the distance traveled due to increasing orientation errors. However, when an absolute orientation sensor is incorporated, the error growth is reduced to linear in the distance traveled and, in practice, grows much more slowly. Our experiments, including a trial with 210 stereo pairs, indicate that these techniques can achieve errors below 1% of the distance traveled. This method has been implemented to run on-board a prototype Mars rover.
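The abstract's central claim about error growth can be illustrated with a small simulation. The sketch below is not the paper's method; it is a minimal dead-reckoning model under assumed values (1 m steps along a straight path, hypothetical translation and heading noise levels) showing why accumulating heading error produces super-linear position error, while a bounded heading error from an absolute orientation sensor keeps the growth roughly linear.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_steps, step=1.0, trans_sigma=0.01, rot_sigma=0.002,
             absolute_orientation=False):
    """Dead-reckon a straight path with noisy per-step motion estimates.

    Without an absolute orientation sensor, small heading errors accumulate,
    so later translation estimates are applied in increasingly wrong
    directions and position error grows super-linearly. With an absolute
    orientation sensor, heading error stays bounded and position error
    grows roughly linearly with distance traveled.
    (Step length and noise levels here are illustrative assumptions.)
    """
    heading_err = 0.0
    pos_est = np.zeros(2)
    pos_true = np.zeros(2)
    errors = []
    for _ in range(n_steps):
        pos_true += np.array([step, 0.0])              # true motion: straight line
        if absolute_orientation:
            heading_err = rng.normal(0.0, rot_sigma)   # bounded per-step heading error
        else:
            heading_err += rng.normal(0.0, rot_sigma)  # heading error accumulates
        noisy_step = step + rng.normal(0.0, trans_sigma)
        pos_est += noisy_step * np.array([np.cos(heading_err),
                                          np.sin(heading_err)])
        errors.append(np.linalg.norm(pos_est - pos_true))
    return np.array(errors)

drift = np.mean([simulate(1000) for _ in range(50)], axis=0)
abs_ori = np.mean([simulate(1000, absolute_orientation=True) for _ in range(50)],
                  axis=0)
print(f"error at 1000 m: dead reckoning {drift[-1]:.2f} m, "
      f"with absolute orientation {abs_ori[-1]:.2f} m")
```

Averaged over many runs, the dead-reckoning error curve bends upward with distance, while the curve with bounded heading error stays close to a straight line, which is the qualitative behavior the paper reports.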
