A Portable Ground-Truth System Based on a Laser Sensor

State estimation is of crucial importance in mobile robotics, since it largely determines a robot's ability to model the world from noisy observations. To evaluate state-estimation methods quantitatively, ground-truth data are essential, as they provide the target that the output of a state-estimation method should approximate. Most reported ground-truth systems require a complex assembly, which limits their applicability and makes their set-up long and complicated. Furthermore, they often require a lengthy calibration procedure, and they rarely report measures of their own accuracy. This paper proposes a portable laser-based ground-truth system. The proposed system can be easily moved from one environment to another and requires almost no calibration. Quantitative results are presented with the aim of encouraging future comparisons among different ground-truth systems. The presented method has been shown to be accurate enough to evaluate state-estimation methods and runs in real time.
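To illustrate how ground-truth poses serve as a target for evaluating a state estimator, the following minimal sketch compares an estimated 2-D trajectory against a time-stamped ground-truth trajectory by nearest-timestamp association and a translational RMSE. This is not the paper's evaluation procedure; the function names, tolerances, and synthetic data are hypothetical and only stand in for a laser-based tracker's output and an estimator's output.

```python
# Illustrative sketch (not from the paper): evaluate an estimated trajectory
# against time-stamped ground-truth poses. All names and values are assumed.
import numpy as np

def associate(gt_times, est_times, max_dt=0.02):
    """Pair each estimated timestamp with the nearest ground-truth timestamp
    if they differ by less than max_dt seconds."""
    pairs = []
    for j, t in enumerate(est_times):
        i = int(np.argmin(np.abs(gt_times - t)))
        if abs(gt_times[i] - t) <= max_dt:
            pairs.append((i, j))
    return pairs

def translational_rmse(gt_xy, est_xy, pairs):
    """Root-mean-square Euclidean position error over associated pose pairs."""
    errs = [np.linalg.norm(gt_xy[i] - est_xy[j]) for i, j in pairs]
    return float(np.sqrt(np.mean(np.square(errs))))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-ins: 100 Hz ground truth, 20 Hz noisy estimator output.
    gt_times = np.arange(0.0, 10.0, 0.01)
    gt_xy = np.column_stack([gt_times, np.sin(gt_times)])
    est_times = np.arange(0.0, 10.0, 0.05)
    est_xy = (np.column_stack([est_times, np.sin(est_times)])
              + rng.normal(0.0, 0.03, (len(est_times), 2)))

    pairs = associate(gt_times, est_times)
    print(f"Translational RMSE: {translational_rmse(gt_xy, est_xy, pairs):.3f} m")
```

In practice, the same comparison could be run against any ground-truth source; the point is only that a time-aligned reference trajectory makes the estimator's error directly measurable.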
