Appearance and map-based global localization using laser reflectivity

Global localization is a fundamental ability for a mobile robot: recognizing its accurate global position in a revisited environment. Map-based global localization yields a precise position by computing an accurate transformation, but comparison against large 3D datasets is time-consuming. Appearance-based global localization, which determines the global position through image retrieval of scenes with similar structure, runs in real time; however, it depends on external illumination and fails completely in the dark. This paper proposes a combination of map-based and appearance-based global localization. Instead of the camera images normally used for appearance-based global localization, we use reflectance images obtained as a byproduct of range sensing with a laser range finder. The proposed method not only detects previously visited scenes but also estimates relative poses precisely. The effectiveness of the proposed technique is demonstrated through experiments in real environments.
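To make the two-stage idea concrete, the following is a minimal, hypothetical sketch of the pipeline the abstract describes: an appearance-based retrieval step that finds the most similar previously visited scene from a database of reflectance images, after which a map-based step (e.g. ICP on the corresponding range data) would refine the relative pose. The histogram descriptor, function names, and toy data are illustrative assumptions, not the paper's actual method.

```python
# Hedged sketch: stage 1 (appearance-based retrieval) of the combined
# localization pipeline. A real system would use a richer descriptor
# (e.g. SIFT/SURF bag-of-words) and follow retrieval with ICP-based
# pose refinement on the matched scan; both are omitted here.
import math

def histogram_descriptor(image, bins=8):
    """Global intensity histogram of a reflectance image (values in [0, 1))."""
    hist = [0] * bins
    count = 0
    for row in image:
        for v in row:
            hist[min(int(v * bins), bins - 1)] += 1
            count += 1
    # Normalize so images of different sizes are comparable.
    return [h / count for h in hist]

def retrieve(query, database, bins=8):
    """Return the index of the database image with the closest descriptor (L2)."""
    q = histogram_descriptor(query, bins)
    def dist(entry):
        d = histogram_descriptor(entry, bins)
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(q, d)))
    return min(range(len(database)), key=lambda i: dist(database[i]))

# Toy "reflectance image" database: a dark scene, a bright scene, a mixed scene.
db = [
    [[0.1, 0.2], [0.1, 0.2]],
    [[0.9, 0.8], [0.9, 0.8]],
    [[0.1, 0.9], [0.2, 0.8]],
]
query = [[0.85, 0.9], [0.8, 0.95]]
print(retrieve(query, db))  # 1: the bright scene is the nearest match
```

Because reflectance images come from the laser itself, this retrieval step works regardless of ambient lighting, which is the key advantage over camera-based appearance methods noted above.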
