On a class of parameter estimators in linear models dominating the least-squares one

The estimation of parameters in a linear model is considered under the hypothesis that the noise, with finite second-order statistics, can be represented by random coefficients in a given deterministic basis. An extended underdetermined design matrix is then formed, and the estimator of the extended parameters with minimum ℓ1 norm is computed. It is proved that, if the noise variance is larger than a threshold, which depends on the unknown parameters and on the extended design matrix, then the proposed estimator of the original parameters dominates the least-squares estimator in the sense of the mean square error. A small simulation illustrates its behavior. Moreover, it is shown experimentally that the estimator can remain advantageous even when the design matrix is not known exactly and only an estimate of it is available. Furthermore, the noise basis can possibly be used to introduce prior information into the estimation process. These points are illustrated in a simulation by using the proposed estimator to solve a difficult ill-posed inverse problem, related to the complex moments of an atomic complex measure.
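The construction described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: the design matrix `X`, the noise basis `B`, the dimensions, and the noise level are all hypothetical choices. It forms the extended matrix [X | B], computes the minimum-ℓ1-norm solution of the resulting underdetermined system via the standard linear-programming reformulation, and extracts the first components as the estimate of the original parameters, alongside the least-squares benchmark.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p, m = 20, 3, 20                      # samples, parameters, noise-basis size (hypothetical)
X = rng.standard_normal((n, p))          # hypothetical design matrix
B = np.linalg.qr(rng.standard_normal((n, m)))[0]  # hypothetical deterministic noise basis
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# Extended underdetermined system A t = y, with A = [X | B]
A = np.hstack([X, B])
k = A.shape[1]

# Minimum-l1-norm solution via LP: minimize sum(u) subject to
# -u <= t <= u and A t = y, over the stacked variable (t, u).
c = np.concatenate([np.zeros(k), np.ones(k)])
I = np.eye(k)
A_ub = np.block([[I, -I], [-I, -I]])     # encodes  t - u <= 0  and  -t - u <= 0
b_ub = np.zeros(2 * k)
A_eq = np.hstack([A, np.zeros((n, k))])  # equality constraint acts on t only
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * k + [(0, None)] * k)

beta_hat = res.x[:p]                     # proposed estimate of the original parameters
beta_ls = np.linalg.lstsq(X, y, rcond=None)[0]  # least-squares benchmark
```

Under the dominance result stated above, when the noise variance exceeds the relevant threshold, `beta_hat` is expected to have smaller mean square error than `beta_ls` on average over noise realizations; a single run such as this one only illustrates the computation, not the dominance.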