Task-oriented navigation algorithms for an outdoor environment with colored borders and obstacles

This paper presents task-oriented navigation algorithms for an outdoor environment. The navigation goals are to recognize the colored border lines on both sides of a path, to avoid obstacles on the path, and to follow the given path. To recognize the colored border lines with a single camera, we apply a support vector data description (SVDD) method that employs six color features extracted from two color models. To avoid collisions with obstacles on the path, we fuse the border-line data measured by the camera with the obstacle data measured by a laser range finder. These algorithms were applied to autonomous navigation of a curved track about 100 m long, and we demonstrate that a four-wheel skid-steering mobile robot successfully completes the mission.
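
As an illustration of the border-line recognition step, the sketch below trains a one-class color boundary on pixels sampled from the colored border and then labels each camera pixel as border or background. The choice of the six features (R, G, B from the RGB model and H, S, V from the HSV model) and the use of scikit-learn's OneClassSVM as a Gaussian-kernel stand-in for SVDD are assumptions for illustration, not the paper's exact implementation.

```python
# Minimal sketch of colored-border-line pixel classification.
# Assumptions (not specified by the abstract): the six color features are
# B, G, R (RGB model) and H, S, V (HSV model), and a one-class SVM with an
# RBF kernel is used as a stand-in for support vector data description.
import numpy as np
import cv2
from sklearn.svm import OneClassSVM


def color_features(bgr_pixels: np.ndarray) -> np.ndarray:
    """Stack six color features (B, G, R, H, S, V) per pixel."""
    bgr = bgr_pixels.reshape(-1, 1, 3).astype(np.uint8)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    return np.hstack([bgr.reshape(-1, 3), hsv.reshape(-1, 3)]).astype(np.float32)


def train_border_model(border_samples: np.ndarray) -> OneClassSVM:
    """Fit a one-class boundary around pixels sampled from the colored border."""
    model = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05)
    model.fit(color_features(border_samples))
    return model


def border_mask(model: OneClassSVM, image_bgr: np.ndarray) -> np.ndarray:
    """Label each pixel of a camera frame as border (True) or background (False)."""
    feats = color_features(image_bgr.reshape(-1, 3))
    pred = model.predict(feats)  # +1 inside the learned description, -1 outside
    return (pred == 1).reshape(image_bgr.shape[:2])
```

In the full system, the detected border pixels would then be projected into the robot frame and fused with the laser-range-finder obstacle measurements; that step depends on the camera-to-LRF calibration and the path controller, and is not sketched here.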
