Vision-Based Fast and Reactive Monte-Carlo Localization

This paper presents a fast approach for vision-based self-localization in RoboCup. The vision system extracts the features required for localization without processing the whole image and is a first step towards independence from lighting conditions. For self-localization, some new ideas are added to the well-known Monte Carlo localization approach that increase both stability and reactivity while keeping the processing time low.
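The Monte Carlo localization approach the paper builds on can be illustrated with a minimal particle-filter sketch. This is a hypothetical illustration, not the paper's implementation: particles are (x, y, theta) pose hypotheses on a 2D field, each step applies noisy odometry, reweights particles by how well the predicted bearing to a known landmark matches an observed bearing, and resamples. The function names, noise levels, and the single-landmark sensor model are all assumptions made for brevity.

```python
import math
import random

def mcl_step(particles, odom, landmark, observed_bearing, sigma=0.3):
    """One Monte Carlo localization step (illustrative sketch).

    particles        -- list of (x, y, theta) pose hypotheses
    odom             -- (dx, dy, dtheta) odometry since the last step
    landmark         -- (x, y) position of a known field landmark
    observed_bearing -- measured bearing to that landmark (robot frame)
    """
    moved, weights = [], []
    for (x, y, th) in particles:
        # Motion update: apply odometry plus Gaussian noise (assumed values).
        nx = x + odom[0] + random.gauss(0, 0.05)
        ny = y + odom[1] + random.gauss(0, 0.05)
        nth = th + odom[2] + random.gauss(0, 0.02)
        moved.append((nx, ny, nth))
        # Sensor update: weight by agreement between the bearing this pose
        # predicts for the landmark and the bearing actually observed.
        predicted = math.atan2(landmark[1] - ny, landmark[0] - nx) - nth
        err = math.atan2(math.sin(predicted - observed_bearing),
                         math.cos(predicted - observed_bearing))
        weights.append(math.exp(-(err ** 2) / (2 * sigma ** 2)))
    # Resampling: draw a new particle set proportionally to the weights.
    total = sum(weights) or 1.0
    return random.choices(moved,
                          weights=[w / total for w in weights],
                          k=len(moved))
```

The paper's contribution concerns how the observation step is fed (image features extracted without scanning the whole frame) and how resampling is tuned for stability and reactivity; the skeleton above only shows the standard predict-weight-resample cycle those ideas plug into.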
