A paradigm for incorporating vision in the robot navigation function

The authors present a paradigm for combining vision with motion planning. Extensive modifications of so-called tactile algorithms turn out to be needed to take full advantage of the additional sensing capability without sacrificing algorithm convergence. Different design principles can be introduced, resulting in algorithm versions that exhibit different styles of behavior and produce different paths, without, in general, any one version being superior to the others.
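The core idea of extending a tactile planner with vision can be sketched as follows. This is a minimal illustrative sketch, not the authors' algorithm: it assumes circular obstacles, a point robot, and a hypothetical `vision_step` rule in which the robot takes a direct step toward the target whenever the sensed corridor (up to a vision radius) is free, and otherwise falls back to a tactile-style sidestep as a stand-in for boundary following.

```python
import math

def segment_hits_circle(p, q, center, radius):
    """True if segment p-q passes within `radius` of `center`."""
    px, py = p
    qx, qy = q
    cx, cy = center
    dx, dy = qx - px, qy - py
    length_sq = dx * dx + dy * dy
    if length_sq == 0:
        t = 0.0
    else:
        # Parameter of the point on the segment closest to the center.
        t = max(0.0, min(1.0, ((cx - px) * dx + (cy - py) * dy) / length_sq))
    nx, ny = px + t * dx, py + t * dy
    return math.hypot(cx - nx, cy - ny) < radius

def vision_step(pos, goal, obstacles, vision_radius, step=0.5):
    """One decision of a hypothetical vision-augmented tactile planner.

    Looks toward the goal up to `vision_radius`; if that sensed corridor
    is obstacle-free, steps straight toward the goal (the vision case).
    Otherwise falls back to a simple perpendicular sidestep, standing in
    for the boundary-following mode of a tactile (Bug-style) algorithm.
    """
    d = math.hypot(goal[0] - pos[0], goal[1] - pos[1])
    if d == 0:
        return pos
    dirx, diry = (goal[0] - pos[0]) / d, (goal[1] - pos[1]) / d
    reach = min(d, vision_radius)
    lookahead = (pos[0] + dirx * reach, pos[1] + diry * reach)
    if not any(segment_hits_circle(pos, lookahead, c, r) for c, r in obstacles):
        # Visible free corridor toward the goal: exploit vision, go direct.
        return (pos[0] + dirx * step, pos[1] + diry * step)
    # Corridor blocked: tactile-style fallback (placeholder sidestep).
    return (pos[0] - diry * step, pos[1] + dirx * step)
```

A tactile-only planner would decide purely from contact; the sketch shows where a vision radius changes the decision rule, which is the kind of modification the abstract argues must be made carefully to preserve convergence.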
