Autonomous navigation requires a trajectory-planning module that uses sensory information to derive safe and efficient motion strategies. This paper describes a basic capability for such a module: using occlusion monitoring and interpretation as a basis for defining exploration trajectories. The first part of the paper presents an occlusion detection method based on binocular vision, which rests on the assumption that occlusions between the left and right views cause reprojection errors when images from one viewpoint are used to predict images from the other. More specifically, occlusions are shown to produce forward or backward jumps in the displacement of reprojected points. In the second part of the paper, occlusions are monitored dynamically for exploration purposes: the exposed and occluded views are identified, and occlusions are reduced by moving the vehicle toward the exposed view. Occlusion width alone suffices to determine the occlusion equations, which can be solved to recover depth information about the scene.
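The detection principle described above — occlusions manifesting as forward or backward jumps in the displacement of reprojected points — can be sketched as a simple scanline test. The function name, jump threshold, and synthetic profile below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def detect_occlusion_jumps(displacement, jump_threshold=2.0):
    """Flag candidate occlusions as abrupt jumps in a 1-D displacement
    (disparity) profile along a scanline.

    Returns a list of (index, direction, magnitude) tuples, where
    direction is 'forward' for a positive jump and 'backward' for a
    negative one. The threshold separating genuine occlusion jumps
    from smooth depth variation is an assumed tuning parameter.
    """
    diffs = np.diff(np.asarray(displacement, dtype=float))
    occlusions = []
    for i, d in enumerate(diffs):
        if abs(d) > jump_threshold:
            occlusions.append((i, "forward" if d > 0 else "backward", abs(d)))
    return occlusions

# Synthetic scanline: constant displacement with one occlusion-induced jump.
profile = np.concatenate([np.full(10, 5.0), np.full(10, 12.0)])
print(detect_occlusion_jumps(profile))  # → [(9, 'forward', 7.0)]
```

In a full binocular pipeline, the displacement profile would come from reprojecting one view into the other; here a synthetic step profile stands in for that input.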