Iteratively Reweighted Least Squares: Algorithms, Convergence Analysis, and Numerical Comparisons
In solving robust linear regression problems, the parameter vector x and an additional parameter s that scales the residuals must be estimated simultaneously. A widely used method for doing so consists of first improving the scale parameter s for fixed x, and then improving x for fixed s by using a quadratic approximation to the objective function g. Since improving x is the expensive part of such algorithms, it makes sense to define the new scale s as a minimizer of g for fixed x. A strong global convergence analysis of this conceptual algorithm is given for a class of convex criterion functions and the so-called H- or W-approximations to g. Moreover, some appropriate finite and iterative subalgorithms for minimizing g with respect to s are discussed. Furthermore, the possibility of transforming the robust regression problem into a nonlinear least-squares problem is discussed. All algorithms described here were tested on a set of test problems, and their computational efficiency was compared with ...
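To illustrate the alternating scheme sketched in the abstract (improve the scale s for fixed x, then improve x for fixed s via a reweighted least-squares step), here is a minimal Python sketch. It is not the paper's algorithm: the choice of the Huber criterion, the MAD-based scale update, and the names irls_huber, c, max_iter, and tol are illustrative assumptions.

```python
# Minimal sketch of iteratively reweighted least squares (IRLS) for robust
# linear regression with a jointly updated scale s. The Huber weights and the
# MAD-based scale update are illustrative choices, not the paper's method.
import numpy as np

def irls_huber(A, b, c=1.345, max_iter=50, tol=1e-8):
    """Estimate x roughly minimizing sum(rho(r_i / s)) with r = b - A @ x."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]        # ordinary least-squares start
    s = 1.0
    for _ in range(max_iter):
        r = b - A @ x
        # Improve the scale s for fixed x (normalized median absolute deviation).
        s = max(np.median(np.abs(r - np.median(r))) / 0.6745, 1e-12)
        # Huber weights: quadratic near zero, linear in the tails.
        u = r / s
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))
        # Improve x for fixed s by solving a weighted least-squares problem.
        sw = np.sqrt(w)
        x_new = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)[0]
        if np.linalg.norm(x_new - x) <= tol * (1.0 + np.linalg.norm(x)):
            return x_new, s
        x = x_new
    return x, s
```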