Corridor Navigation and Obstacle Avoidance using Visual Potential for Mobile Robot

In this paper, we develop an algorithm for corridor navigation and obstacle avoidance by an autonomous mobile robot, based on a visual potential. The robot is equipped with a camera system that dynamically captures images of the environment. The visual potential is computed from the image sequence and from the optical flow estimated between successive frames captured by the camera mounted on the robot. The robot selects a local pathway using the visual potential observed through its vision system. Our algorithm enables mobile robots to avoid obstacles without any prior knowledge of the workspace. We demonstrate experimental results on image sequences observed with a moving camera in both a simulated and a real environment. The algorithm is robust against fluctuations of displacement caused by mechanical error of the mobile robot, and against fluctuations of planar-region detection caused by numerical error in the computation of optical flow.
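The abstract describes the pipeline only at a high level: estimate optical flow between successive frames, detect the planar (free) ground region, build a potential field from the remaining obstacle regions, and steer along the gradient of that field. The sketch below is a minimal Python/OpenCV reconstruction of that idea, not the authors' implementation; the Farneback dense flow, the affine planar-flow fit, the residual threshold, and the distance-transform potential are all assumptions introduced here for illustration.

```python
import cv2
import numpy as np

def dominant_plane_residual(flow):
    """Fit an affine flow model (valid for a planar ground region) to the
    dense flow field and return the per-pixel residual.  Pixels with a
    large residual are treated as obstacle candidates.  This is an
    illustrative stand-in for the paper's planar-region detection step."""
    h, w = flow.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)], axis=1)
    u, v = flow[..., 0].ravel(), flow[..., 1].ravel()
    pu, _, _, _ = np.linalg.lstsq(A, u, rcond=None)
    pv, _, _, _ = np.linalg.lstsq(A, v, rcond=None)
    return np.hypot(A @ pu - u, A @ pv - v).reshape(h, w)

def visual_potential_direction(prev_gray, gray, residual_thresh=1.0):
    """Build a repulsive potential from obstacle pixels and return a
    steering direction in the image plane (descent on the potential)."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    obstacles = dominant_plane_residual(flow) > residual_thresh
    # Free space is the complement of the obstacle mask; the distance to
    # the nearest obstacle pixel gives a potential that is low far from
    # obstacles and high near them.
    free = np.uint8(~obstacles) * 255
    dist = cv2.distanceTransform(free, cv2.DIST_L2, 5)
    potential = -dist
    gy, gx = np.gradient(cv2.GaussianBlur(potential, (0, 0), 5))
    h, w = gray.shape
    # Steer opposite to the potential gradient at the image centre.
    return -gx[h // 2, w // 2], -gy[h // 2, w // 2]
```

A controller would call visual_potential_direction on each new pair of frames and map the returned image-plane direction to a steering command; the residual threshold and smoothing parameters are placeholders that would need tuning for a given camera and environment.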
