Bayesian Sensor Fusion for Cooperative Object Localization and World Modeling

This paper introduces a method for representing, communicating, and fusing distributed, noisy, and partial observations of an object made by multiple robots. The technique describes how to model sensors and the information they acquire. Each sensor is treated as a team member that makes decisions locally to produce a local estimate. The local estimates of each robot are then fused with those of the other robots to obtain a global estimate of the objects surrounding the team, yielding a considerably more reliable and accurate world model. The method was implemented and tested on RoboCup Middle Size League robots.
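To illustrate the kind of fusion step described above, the following is a minimal sketch of combining several robots' local Gaussian estimates of an object's position into one global estimate, using the information (inverse-covariance) form. The function name, the assumption of independent Gaussian estimates, and the example numbers are illustrative, not taken from the paper's actual formulation.

```python
import numpy as np

def fuse_gaussian_estimates(means, covs):
    """Fuse independent Gaussian estimates of the same quantity.

    Uses the information form: the fused information matrix is the sum
    of the individual inverse covariances, and the fused mean weights
    each local mean by its information.
    """
    # Combined information matrix (sum of inverse covariances).
    info = sum(np.linalg.inv(C) for C in covs)
    fused_cov = np.linalg.inv(info)
    # Information-weighted combination of the local means.
    fused_mean = fused_cov @ sum(np.linalg.inv(C) @ m
                                 for m, C in zip(means, covs))
    return fused_mean, fused_cov

# Two robots observe the same object position (x, y); robot 2 is
# closer and therefore has a tighter (lower-variance) estimate.
m1, C1 = np.array([2.0, 1.0]), np.diag([0.5, 0.5])
m2, C2 = np.array([2.4, 1.2]), np.diag([0.1, 0.1])
mean, cov = fuse_gaussian_estimates([m1, m2], [C1, C2])
# The fused mean lies between the two estimates, pulled toward the
# more confident one, and the fused covariance is smaller than either.
```

In practice the paper's robots would exchange such local estimates over the team network and run the fusion on each robot; the sketch only shows the combination rule itself, which assumes the local estimates are independent.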