Improving Robot Self-localization Using Landmarks' Poses Tracking and Odometry Error Estimation

In this article the classical self-localization approach is improved by estimating the robot's odometric error and the landmarks' poses independently of the robot's pose. This makes it possible to use, in addition to fixed landmarks, dynamic landmarks such as temporally local objects (mobile objects) and spatially local objects (view-dependent objects or textures) for estimating the odometric error, and therefore for improving the robot's localization. Moreover, tracking the fixed landmarks' poses allows the robot to accomplish certain tasks successfully even under high uncertainty in its localization estimate (e.g. determining the goal position in a soccer environment without directly seeing the goal). Furthermore, estimating the fixed landmarks' poses provides a global measure of the robot's localization accuracy, obtained by comparing the real map, given by the a priori known positions of the fixed landmarks, with the estimated map, given by the estimated positions of these landmarks. Based on this approach we propose an improved self-localization system for AIBO robots playing in a RoboCup soccer environment, in which the odometric error is estimated using particle filters, and the robot's and landmarks' poses are estimated using extended Kalman filters. Preliminary results of the system's operation are presented.
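The filtering architecture described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the 2-D error state, the Gaussian noise parameters, and the identity observation model for the EKF are all simplifying assumptions made here for clarity.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Particle filter over a 2-D odometric error (dx, dy) ---
# Assumed random-walk motion model and Gaussian measurement likelihood
# (hypothetical parameters, not taken from the paper).
def pf_step(particles, observed_error, motion_std=0.05, meas_std=0.2):
    # Predict: diffuse the error hypotheses.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: weight each hypothesis by the likelihood of the
    # landmark-derived error observation.
    d = np.linalg.norm(particles - observed_error, axis=1)
    w = np.exp(-0.5 * (d / meas_std) ** 2) + 1e-300
    w /= w.sum()
    # Resample proportionally to the weights.
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# --- EKF update for a landmark pose, assuming the landmark position is
# observed directly (linear observation, H = I) ---
def ekf_update(x, P, z, R):
    S = P + R                      # innovation covariance
    K = P @ np.linalg.inv(S)       # Kalman gain
    x_new = x + K @ (z - x)        # corrected state
    P_new = (np.eye(len(x)) - K) @ P
    return x_new, P_new

# Usage: track a constant odometric error of (0.5, -0.3).
true_err = np.array([0.5, -0.3])
particles = rng.uniform(-1.0, 1.0, size=(500, 2))
for _ in range(30):
    z = true_err + rng.normal(0.0, 0.2, 2)  # noisy landmark-based measurement
    particles = pf_step(particles, z)
est = particles.mean(axis=0)  # point estimate of the odometric error

# Usage: refine a landmark pose estimate with one EKF correction.
lm_pose, lm_cov = ekf_update(np.zeros(2), np.eye(2),
                             np.array([2.0, 1.0]), 0.1 * np.eye(2))
```

In the paper's system the particle filter and the EKFs run side by side: the landmark observations feed the odometric-error particles, while the per-landmark EKFs keep pose estimates usable even when the robot's own localization is uncertain.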
