Perturbation theory for orthogonal projection methods with applications to least squares and total least squares

The stabilized versions of the least squares (LS) and total least squares (TLS) methods are two examples of orthogonal projection methods commonly used to “solve” the overdetermined system of linear equations AX ≈ B when A is nearly rank-deficient. In practice, when this system represents the noisy version of an exact rank-deficient, zero-residual problem, TLS usually yields a more accurate estimate of the exact solution. However, current perturbation theory does not justify the superiority of TLS over LS. In this paper we establish a model for orthogonal projection methods by reformulating the parameter estimation problem as an equivalent problem of nullspace determination. When the method is based on the singular value decomposition of the matrix [A B], the model specializes to the well-known TLS method. We derive new lower and upper perturbation bounds for orthogonal projection methods in terms of the subspace angle, which show how the perturbation of the approximate nullspace affects the accuracy of the solution. In situations where TLS is typically used, such as in signal processing where the noise-free compatible problem is exactly rank-deficient, our upper bounds suggest that the TLS perturbation bound is usually smaller than the one for LS, which means that TLS is usually more robust than LS under perturbations of all the data. Moreover, the bounds permit a comparison between the LS and TLS solutions, as well as between any two competing orthogonal projection methods. We include numerical simulations to illustrate our conclusions.
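The setting described above — a noisy version of an exactly compatible, rank-deficient problem, with LS and TLS estimates compared against the exact solution — can be illustrated with a small numerical sketch. This is not the paper's experiment; it is a minimal NumPy example (with assumed dimensions and noise level) showing the standard TLS construction from the right singular vector of [A b] associated with the smallest singular value:

```python
import numpy as np

rng = np.random.default_rng(0)

# Exactly compatible system A0 x0 = b0, then perturb all the data
# (both A and b), as in the TLS error model.
m, n = 20, 3
A0 = rng.standard_normal((m, n))
x0 = np.array([1.0, -2.0, 0.5])
b0 = A0 @ x0
noise = 1e-3
A = A0 + noise * rng.standard_normal((m, n))
b = b0 + noise * rng.standard_normal(m)

# LS estimate: minimize ||A x - b||_2 (errors attributed to b only).
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)

# TLS estimate: the right singular vector of [A b] for the smallest
# singular value spans the approximate nullspace of [A b]; scale it
# so that its last component becomes -1.
C = np.column_stack([A, b])
_, _, Vt = np.linalg.svd(C)
v = Vt[-1]                 # singular vector for the smallest singular value
x_tls = -v[:n] / v[n]

err_ls = np.linalg.norm(x_ls - x0)
err_tls = np.linalg.norm(x_tls - x0)
```

Both estimates recover x0 to roughly the noise level here; the paper's perturbation bounds quantify when the TLS error can be expected to be the smaller of the two.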
