AR order selection in the case when the model parameters are estimated by forgetting factor least-squares algorithms

In recent decades, the use of information theoretic criteria (ITC) for selecting the order of autoregressive (AR) models has grown steadily. Because ITC are derived under the strong assumption that the measured signals are stationary, it is not straightforward to employ them in combination with forgetting factor least-squares algorithms. Previous attempts to address this problem focused on the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and predictive least squares (PLS). In connection with PLS, an ad hoc criterion called SRM was also introduced. In this paper, we modify the predictive densities criterion (PDC) and the sequentially normalized maximum likelihood (SNML) criterion so that they are compatible with forgetting factor least-squares algorithms. Additionally, we provide rigorous proofs for the asymptotic approximations of the four modified ITC, namely PLS, SRM, PDC and SNML. The four criteria are then compared by simulation with the modified variants of BIC and AIC.
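As a concrete illustration of the setting only, and not of the modified criteria proposed in the paper, the sketch below combines a forgetting-factor recursive least-squares (RLS) estimator of AR coefficients with the classical PLS idea of accumulating one-step prediction errors. All function names, the forgetting factor value and the initialization constant are assumptions made for illustration.

```python
import numpy as np

def ff_rls_pls(y, p, lam=0.98, delta=1e3):
    """Forgetting-factor RLS fit of an AR(p) model; returns the accumulated
    squared one-step prediction errors (a PLS-style score).  Illustrative
    sketch only; the paper's modified PLS/SRM/PDC/SNML criteria are not
    reproduced here."""
    y = np.asarray(y, dtype=float)
    theta = np.zeros(p)            # current AR coefficient estimates
    P = delta * np.eye(p)          # inverse (exponentially weighted) correlation matrix
    pls = 0.0
    for t in range(p, len(y)):
        phi = y[t - p:t][::-1]     # regressor: the p most recent samples
        e = y[t] - phi @ theta     # a priori (one-step) prediction error
        pls += e**2                # accumulate squared prediction errors
        k = P @ phi / (lam + phi @ P @ phi)    # RLS gain vector
        theta = theta + k * e                  # coefficient update
        P = (P - np.outer(k, phi @ P)) / lam   # inverse correlation update
    return pls

def select_ar_order(y, max_order=10, lam=0.98):
    """Choose the AR order that minimizes the accumulated prediction errors."""
    scores = {p: ff_rls_pls(y, p, lam) for p in range(1, max_order + 1)}
    return min(scores, key=scores.get), scores

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic AR(2) data: y[t] = 1.5*y[t-1] - 0.7*y[t-2] + noise
    n, a = 2000, np.array([1.5, -0.7])
    y = np.zeros(n)
    for t in range(2, n):
        y[t] = a @ y[t - 2:t][::-1] + rng.standard_normal()
    print(select_ar_order(y, max_order=6)[0])
```

This plain score simply sums the prediction errors and does not account for the exponential weighting introduced by the forgetting factor; making the order-selection criteria consistent with that weighting is the problem the modified criteria in the paper are designed to address.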
