The ability of a robotic system to cope automatically with uncertainties, biases, and errors is crucial for tasks defined in unstructured environments. In this paper, we present a method, based on geometric data fusion, for automatically reducing the uncertainties and calibrating the possible biases involved in sensed data and extracted features. The perception net, a structural representation of a system's sensing capabilities, connects features at various levels of abstraction, referred to here as logical sensors, through their functional relationships: feature transformations, data fusions, and constraints to be satisfied. The net maintains the consistency of the logical sensors through the forward propagation of uncertainties as well as the backward propagation of constraint errors. A novel geometric data fusion algorithm is presented as a unified framework for computing both propagations, through which the net achieves self-reduction of uncertainties and self-identification of biases. The effectiveness of the proposed method is validated in simulation by applying it to a mobile robot self-localization problem.
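To make the two propagation steps concrete, the sketch below illustrates, under first-order Gaussian assumptions, the standard building blocks the abstract refers to: forward propagation of uncertainty through a feature transformation via its Jacobian, covariance-weighted (geometric) fusion of redundant estimates, and a Kalman-style backward correction that distributes a constraint error over an estimate. This is a minimal illustration, not the paper's actual algorithm; all function names and the numerical example are assumptions chosen for clarity.

```python
# Minimal sketch (not the authors' implementation) of the propagation and
# fusion primitives described in the abstract, for Gaussian feature estimates.
import numpy as np

def propagate_forward(x, P, f, F):
    """Forward propagation: push an uncertain feature (mean x, covariance P)
    through a feature transformation f, using its Jacobian F (first order)."""
    return f(x), F @ P @ F.T

def fuse(x1, P1, x2, P2):
    """Geometric (covariance-weighted) fusion of two estimates of the same
    feature, written in information (inverse-covariance) form."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(I1 + I2)
    x = P @ (I1 @ x1 + I2 @ x2)
    return x, P

def propagate_backward(x, P, h, H):
    """Backward propagation: correct (x, P) so that the constraint h(x) = 0
    is better satisfied; a Kalman-style update with constraint Jacobian H."""
    S = H @ P @ H.T                      # covariance of the constraint error
    K = P @ H.T @ np.linalg.inv(S)      # gain weighting the correction
    return x - K @ h(x), (np.eye(len(x)) - K @ H) @ P

# Hypothetical example: two noisy 2-D position estimates of a mobile robot.
x1, P1 = np.array([1.0, 2.0]), np.diag([0.30, 0.05])   # e.g., odometry
x2, P2 = np.array([1.2, 1.9]), np.diag([0.05, 0.40])   # e.g., landmark fix
x, P = fuse(x1, P1, x2, P2)
print(x)           # fused mean lies between the inputs, weighted by certainty
print(np.diag(P))  # fused variances are smaller than either input's
```

The information-form fusion makes the "self-reduction of uncertainties" visible: each fused variance is strictly smaller than both input variances, and a persistent residual after backward propagation is what would flag a bias to be identified.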