Model-directed mobile robot navigation

The authors report on the system and methods used by the UMass Mobile Robot Project. Model-based processing of visual sensory data is the primary mechanism for controlling the movement of an autonomous land vehicle through its environment, measuring progress toward a given goal, and avoiding obstacles. Goal-oriented navigation takes place in a partially modeled, unchanging environment that contains no unmodeled obstacles; this simplified environment provides a foundation for research in more complicated domains. The navigation system integrates perception, planning, and the execution of actions. Of particular importance is that the planning processes are reactive and reason about landmarks that should be perceived at various stages of task execution. Correspondences between image features and expected landmark locations are used at several levels of abstraction to ensure proper plan execution. The system and some experiments that demonstrate the performance of its components are described.
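The abstract describes checking plan execution by comparing image features against the locations where modeled landmarks are expected to appear. A minimal sketch of that idea, assuming a simple greedy nearest-neighbor association in pixel coordinates (the function name, gating threshold, and matching rule are illustrative assumptions, not the authors' implementation):

```python
import math

def match_landmarks(expected, detected, gate=15.0):
    """Greedily associate expected landmark positions with detected
    feature positions (both lists of (x, y) pixel coordinates).
    A detection is accepted only if it lies within `gate` pixels.
    Returns (pairs, mean_residual); mean_residual is None if no
    landmark was matched."""
    pairs = []
    used = set()
    for ex, ey in expected:
        best, best_d = None, gate
        for i, (dx, dy) in enumerate(detected):
            if i in used:
                continue
            d = math.hypot(dx - ex, dy - ey)
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            pairs.append(((ex, ey), detected[best], best_d))
    if not pairs:
        return pairs, None
    mean_residual = sum(d for _, _, d in pairs) / len(pairs)
    return pairs, mean_residual

# Two expected landmarks; the third detection is clutter outside the gate.
expected = [(100.0, 50.0), (200.0, 80.0)]
detected = [(103.0, 52.0), (198.0, 79.0), (400.0, 300.0)]
pairs, residual = match_landmarks(expected, detected)
```

A small mean residual here would indicate the vehicle is roughly where the plan predicts; a large residual or missing matches would signal that execution has drifted from the model.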
