Regularized fast recursive least squares algorithms for finite memory filtering

Novel fast recursive least squares algorithms are developed for finite-memory filtering using a sliding data window. These algorithms allow the use of statistical priors about the solution and maintain a balance between a priori and data information. They are well suited to computing a regularized solution, which has better numerical stability properties than the conventional least squares solution. The algorithms have a general matrix formulation, so the same equations apply to the prewindowed as well as the covariance case, regardless of the a priori information used. Only the initialization step and the numerical complexity change, through the dimensions of the matrix variables involved. The lower bound of O(16m) is achieved in the prewindowed case when the estimated coefficients are assumed to be uncorrelated, m being the order of the estimated model. It is shown that a saving of 2m multiplications per recursion can always be obtained; the lower bound of the resulting numerical complexity then becomes O(14m), but the general matrix formulation is lost.
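To make the underlying estimation problem concrete, the following is a minimal sketch of regularized sliding-window least squares: the coefficient estimate at each step minimizes the windowed squared error plus a quadratic penalty toward a prior mean (here an uncorrelated, equal-variance prior controlled by a single parameter delta). It updates the normal equations by adding the newest regressor and removing the oldest one, then solves directly, so it costs O(m^2)-O(m^3) per recursion rather than the O(m) of the fast transversal algorithms the paper develops. The function name, the parameter delta, and the simple prior are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def sliding_window_regularized_ls(x, d, m, N, delta=1.0, w0=None):
    """Finite-memory (sliding-window) least squares with a quadratic prior.

    At time n the estimate minimizes
        sum_{k=n-N+1}^{n} (d[k] - u_k^T w)^2 + delta * ||w - w0||^2,
    where u_k = [x[k], x[k-1], ..., x[k-m+1]] is the regressor vector.
    This is a direct, non-fast implementation for illustration only.
    """
    if w0 is None:
        w0 = np.zeros(m)
    R = delta * np.eye(m)      # prior precision: uncorrelated coefficients
    p = delta * w0             # prior information vector
    window = []                # (regressor, desired sample) pairs in the window
    estimates = []
    for n in range(len(d)):
        u = np.array([x[n - i] if n - i >= 0 else 0.0 for i in range(m)])
        R += np.outer(u, u)    # updating: newest sample enters the window
        p += u * d[n]
        window.append((u, d[n]))
        if len(window) > N:    # downdating: oldest sample leaves the window
            u_old, d_old = window.pop(0)
            R -= np.outer(u_old, u_old)
            p -= u_old * d_old
        estimates.append(np.linalg.solve(R, p))  # regularized LS solution
    return np.array(estimates)

# Hypothetical usage: identify a short FIR model from noisy data.
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
true_w = np.array([0.8, -0.3, 0.1])
d = np.convolve(x, true_w)[:500] + 0.01 * rng.standard_normal(500)
W = sliding_window_regularized_ls(x, d, m=3, N=64, delta=0.1)
print(W[-1])  # expected to be close to true_w
```

Because the prior term delta * I is never windowed out, the estimate falls back toward w0 whenever the windowed data matrix is poorly conditioned, which is the balance between a priori and data information described in the abstract.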
