Optical Flow Based Robot Obstacle Avoidance

In this paper we develop an algorithm for visual obstacle avoidance by an autonomous mobile robot. The input to the algorithm is an image sequence grabbed by a camera embedded on a B21r robot in motion. Optical flow is then extracted from the image sequence and used in the navigation algorithm. The optical flow provides important information about the robot's environment, such as the disposition of obstacles, the robot's heading, the time to collision, and depth. The strategy consists in balancing the amounts of left-side and right-side flow to avoid obstacles; this technique allows the robot to navigate without colliding with obstacles. The robustness of the algorithm is demonstrated with several examples.
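The left/right flow-balancing strategy can be sketched as below. This is a minimal illustrative version, not the paper's implementation: the dense flow-field layout, the `gain` parameter, and the sign convention (positive command means turn right, away from the side with larger flow) are all assumptions made for the example. The underlying idea is that, under forward translation, nearer obstacles induce larger flow magnitudes, so the robot turns away from the half of the image with more flow.

```python
import numpy as np

def balance_strategy(flow, gain=1.0):
    """Hedged sketch of optical-flow balancing for obstacle avoidance.

    flow : (H, W, 2) array of per-pixel optical-flow vectors (u, v),
           e.g. as produced by a dense flow estimator.
    gain : illustrative proportional gain (assumption, not from the paper).

    Returns a steering command: positive = turn right (left flow dominates,
    so the nearer obstacle is on the left), negative = turn left.
    """
    # Per-pixel flow magnitude.
    mag = np.linalg.norm(flow, axis=2)
    w = flow.shape[1]
    # Average flow magnitude on each half of the image.
    left = mag[:, : w // 2].mean()
    right = mag[:, w // 2 :].mean()
    # Steer away from the side with the larger flow; the command is
    # zero when the two sides are balanced (clear path ahead).
    return gain * (left - right)
```

For instance, a flow field whose left half has larger vectors yields a positive command (turn right), while a balanced field yields zero and the robot keeps heading straight.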
