Landmark Detection Based on Sensor Fusion for Mobile Robot Navigation in a Varying Environment

Exploration of an unknown environment is an important task for the new generation of mobile robots, which may navigate by means of several monitoring systems such as a sonar-sensing system or a visual-sensing system. To fully utilize the information from these sensors, we propose a space- and time-based sensor fusion method and, built on it, a robust landmark detection algorithm for mobile robot navigation. In the proposed technique, data sets from previous moments are appropriately transformed and fused into the current data sets to enable accurate measurement. The proposed STSF (Space and Time Sensor Fusion) scheme is applied to landmark recognition for mobile robot navigation in both structured and unstructured environments, and the experimental results demonstrate its landmark recognition performance.
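The abstract does not spell out how previous data sets are "transformed and fused" into the current ones. A minimal sketch of the core idea, assuming 2D point measurements (e.g. sonar returns) and an odometry estimate `(dx, dy, dtheta)` of the robot's motion since the previous scan: past points are re-expressed in the current robot frame and then pooled with the current scan. The function names and the naive union-style fusion are illustrative assumptions, not the paper's actual algorithm.

```python
import math

def transform_to_current_frame(points, dx, dy, dtheta):
    """Re-express points observed in a previous robot frame in the
    current frame, given the robot motion (dx, dy, dtheta) since then.

    A point p seen in the old frame maps to R(-dtheta) @ (p - t) in the
    new frame, where t = (dx, dy) is the translation of the robot.
    """
    cos_t, sin_t = math.cos(dtheta), math.sin(dtheta)
    transformed = []
    for x, y in points:
        xs, ys = x - dx, y - dy              # undo the translation
        transformed.append((cos_t * xs + sin_t * ys,   # rotate by -dtheta
                            -sin_t * xs + cos_t * ys))
    return transformed

def fuse_scans(current, previous, dx, dy, dtheta):
    """Naive space-time fusion: pool the current scan with the
    motion-compensated previous scan (a stand-in for the paper's
    actual fusion rule)."""
    return current + transform_to_current_frame(previous, dx, dy, dtheta)
```

For example, a point seen 1 m ahead in the previous scan, after the robot drives 1 m forward, transforms to the robot's current origin; a denser fused point set like this is what makes the subsequent landmark detection more reliable than using a single scan.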
