Given n noisy observations $g_i$ of the same quantity $f$, it is common practice to estimate $f$ by minimizing the function $\sum_{i=1}^{n} (g_i - f)^2$. From a statistical point of view this corresponds to computing the Maximum Likelihood estimate under the assumption of Gaussian noise. However, it is well known that this choice leads to results that are very sensitive to the presence of outliers in the data. For this reason it has been proposed to minimize functions of the form $\sum_{i=1}^{n} V(g_i - f)$, where $V$ is a function that increases less rapidly than the square. Several choices for $V$ have been proposed and successfully used to obtain "robust" estimates. In this paper we show that, for a class of functions $V$, using these robust estimators corresponds to assuming that the data are corrupted by Gaussian noise whose variance fluctuates according to some given probability distribution, which uniquely determines the shape of $V$.
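The contrast between the two estimators can be illustrated with a minimal numerical sketch. The code below (not from the paper) compares the least-squares estimate of a constant $f$ with a robust estimate obtained by minimizing $\sum_{i=1}^{n} V(g_i - f)$ for one particular choice of $V$, the absolute value; this choice, the function names, and the simulated data are illustrative assumptions, not the paper's construction.

```python
# Illustrative sketch (assumptions: V = |.|, synthetic data, helper names are ours).
# Least squares is pulled toward outliers; a loss V that grows more slowly than
# the square is far less sensitive to them.
import numpy as np
from scipy.optimize import minimize_scalar

def least_squares_estimate(g):
    """Minimizer of sum_i (g_i - f)^2, i.e. the sample mean."""
    return np.mean(g)

def robust_estimate(g, V=np.abs):
    """Numerically minimize sum_i V(g_i - f) over the scalar f."""
    result = minimize_scalar(lambda f: np.sum(V(g - f)))
    return result.x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f_true = 2.0
    g = f_true + rng.normal(scale=0.1, size=50)  # n noisy observations of f
    g[:5] += 10.0                                # a few gross outliers

    print("least squares estimate:", least_squares_estimate(g))  # biased by outliers
    print("robust estimate (V=|.|):", robust_estimate(g))        # close to f_true
```

With $V$ equal to the absolute value the minimizer is the sample median, which in the paper's framework corresponds to Gaussian noise whose variance fluctuates according to an exponential distribution (Laplacian noise as a Gaussian scale mixture).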