Free Space Detection from Catadioptric Omnidirectional Images for Visual Navigation using Optical Flow

Institute of Media and Information Technology, Chiba University, Yayoicho 1-33, Inage-ku, Chiba 263-8522, Japan

Abstract. In this paper, we develop a free space detection algorithm for the visual navigation of an autonomous robot mounting a catadioptric omnidirectional imaging system. The algorithm detects the dominant plane as the free space from a sequence of omnidirectional images captured by a camera mounted on the autonomous robot. The dominant plane, which can be detected from the optical-flow field, is the largest planar area in the image. For the detection of the dominant plane from the optical-flow field, we adopt the motion-separation property, that is, the optical-flow vector is decomposed into infinitesimal rotation, translation, and divergent motions on the images. The algorithm matches the measured translational optical-flow field with a template translational optical-flow field to separate the dominant plane, as the free space for navigation, from the obstacle area. The template optical-flow field is generated from a pre-observed image sequence without any calibration of the internal parameters of either the robot or the camera.
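The core segmentation step described in the abstract can be sketched as follows: pixels whose measured translational flow agrees with the template flow are labeled as the dominant plane (free space), and the rest as obstacle area. This is a minimal illustrative sketch, not the paper's implementation; the flow fields are assumed to be dense (H, W, 2) arrays, and the threshold `eps` is an assumed parameter.

```python
import numpy as np

def dominant_plane_mask(measured_flow, template_flow, eps=0.5):
    """Separate the dominant plane (free space) from obstacles by
    comparing the measured translational optical-flow field with a
    template flow field generated from a pre-observed sequence.

    measured_flow, template_flow : (H, W, 2) arrays of flow vectors.
    eps : threshold on the per-pixel flow difference (assumed value).
    Returns a boolean (H, W) mask: True where the measured flow agrees
    with the template, i.e. the pixel lies on the dominant plane.
    """
    diff = np.linalg.norm(measured_flow - template_flow, axis=2)
    return diff < eps

# Toy example: flow agrees everywhere except a small "obstacle" block.
H, W = 8, 8
template = np.tile(np.array([1.0, 0.0]), (H, W, 1))
measured = template.copy()
measured[2:4, 2:4] += 2.0  # obstacle region moves differently
mask = dominant_plane_mask(measured, template)
# mask is True on the dominant plane, False on the 2x2 obstacle block
```

In practice the measured flow would come from an optical-flow estimator (e.g. a Lucas-Kanade variant) applied to consecutive omnidirectional frames, after the rotational and divergent components have been removed.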
