A hybrid approach for vision-based outdoor robot localization using global and local image features

Vision-based robot localization in outdoor environments is difficult because of changing illumination conditions. A further problem is the rough and cluttered terrain, which makes it hard to use visual features that are not rotation invariant. A popular feature type that is rotation invariant and relatively robust to illumination changes is the Scale Invariant Feature Transform (SIFT). However, because its feature extraction and image matching are computationally intensive, localization using SIFT is slow. Techniques based on global image features, on the other hand, are in general less robust and less accurate than SIFT, but are often much faster due to cheap image matching. In this paper, we present a hybrid localization approach that switches between local and global image features. For most images, the hybrid approach uses fast global features; only in difficult situations, e.g. under strong illumination changes, does it switch to local features. To decide which feature type to use for an image, we analyze the particle cloud of the particle filter that we use for position estimation. Experiments on outdoor images taken under varying illumination conditions show that the position estimates of the hybrid approach are about as accurate as those of SIFT alone, while the average localization time of the hybrid approach is more than 3.5 times shorter than that of SIFT.
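To illustrate the switching idea, below is a minimal Python sketch of how the particle cloud of the position-estimating particle filter might be analyzed to choose between fast global features and local SIFT features. The abstract only states that the particle cloud is analyzed; the dispersion measure (weighted standard deviation of the particle positions), the threshold value, and all function names here are illustrative assumptions, not the paper's actual criterion.

```python
import numpy as np


def particle_spread(positions, weights):
    """Weighted standard deviation of the particle (x, y) positions,
    used here as a simple dispersion measure of the particle cloud."""
    mean = np.average(positions, axis=0, weights=weights)
    var = np.average((positions - mean) ** 2, axis=0, weights=weights)
    return float(np.sqrt(var.sum()))


def choose_feature_type(positions, weights, spread_threshold=2.0):
    """Use fast global features while the filter is well localized and
    fall back to local SIFT features once the cloud spreads out.
    The 2.0 threshold is an illustrative value, not from the paper."""
    if particle_spread(positions, weights) > spread_threshold:
        return "local"   # difficult situation: use robust SIFT matching
    return "global"      # confident estimate: cheap global features suffice


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    uniform_w = np.full(500, 1.0 / 500)
    # Tight cloud: robot is well localized -> global features
    tight = rng.normal(loc=[5.0, 3.0], scale=0.3, size=(500, 2))
    # Dispersed cloud: ambiguous position estimate -> local SIFT features
    dispersed = rng.normal(loc=[5.0, 3.0], scale=4.0, size=(500, 2))
    print(choose_feature_type(tight, uniform_w))      # -> "global"
    print(choose_feature_type(dispersed, uniform_w))  # -> "local"
```

In such a scheme, the expensive SIFT extraction and matching would only be triggered for the minority of images where the particle cloud indicates an uncertain pose, which is consistent with the reported speedup over using SIFT for every image.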
