Experiments with driving modes for urban robots

In this paper, we describe experiments on semi-autonomous control of a small urban robot. Three driving modes allow semi-autonomous control of the robot, either through video imagery or by using partial maps of the environment. Performance is analyzed in terms of maximum speed, terrain roughness, environmental conditions, and ease of control. We concentrate the discussion on a driving mode based on visual servoing. In this mode, a template designated in an image is tracked as the robot moves toward the destination designated by the operator. Particular attention is given to the robustness of the tracking with respect to template selection, computational resources, occlusions, and rough motion. The discussion of algorithm performance is based on experiments conducted at Fort Sam Houston, TX, on July 5-9, 1999. In addition to the driving modes themselves, the performance and practicality of an omnidirectional imaging sensor are discussed. In particular, we discuss the typical imaging artifacts due to ambient lighting.
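To make the visual-servoing driving mode concrete, the sketch below shows a generic correlation-based template tracker driving a proportional heading command: the operator-designated template is re-located in each new frame, and the robot steers so the template stays centered while it drives toward the designated point. This is an illustrative sketch only, not the tracker used on the robot; the function names, the OpenCV-based matching, the search-window size, and the gain value are assumptions introduced here for clarity.

```python
# Minimal sketch of template tracking for visual servoing (illustration only;
# not the system described in the paper). Assumes grayscale uint8 frames.
import cv2
import numpy as np

def track_template(frame_gray, template, prev_xy, search_radius=32):
    """Re-locate `template` near its previous top-left position using
    normalized cross-correlation; returns the new top-left and match score."""
    th, tw = template.shape
    H, W = frame_gray.shape
    x0, y0 = prev_xy
    # Restrict the search to a window around the previous match, clamped to the image.
    xs, ys = max(0, x0 - search_radius), max(0, y0 - search_radius)
    xe, ye = min(W, x0 + tw + search_radius), min(H, y0 + th + search_radius)
    window = frame_gray[ys:ye, xs:xe]
    result = cv2.matchTemplate(window, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, max_loc = cv2.minMaxLoc(result)
    # Convert the window-relative best match back to full-frame coordinates.
    return (xs + max_loc[0], ys + max_loc[1]), score

def steering_command(template_xy, template_shape, frame_width, gain=0.005):
    """Proportional heading correction: steer so the tracked template
    stays horizontally centered as the robot drives toward the target."""
    th, tw = template_shape
    cx = template_xy[0] + tw / 2.0
    error_px = cx - frame_width / 2.0
    return -gain * error_px  # turn-rate command; sign recenters the template

# Intended use per frame (sketch):
#   prev_xy, score = track_template(frame, template, prev_xy)
#   omega = steering_command(prev_xy, template.shape, frame.shape[1])
```

Searching only a small window around the previous match keeps the per-frame cost bounded, which matters when onboard computational resources are limited, and a low correlation score can be used to flag occlusions or tracking loss of the kind whose effect on robustness the paper examines.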
