The subject of this paper is autoregressive (AR) modeling of a stationary Gaussian discrete-time process, based on a finite sequence of observations. The process is assumed to admit an AR(∞) representation with exponentially decaying coefficients. We adopt the nonparametric minimax framework and study how well the process can be approximated by a finite-order AR model. A lower bound on the accuracy of AR approximations is derived, and a nonasymptotic upper bound on the accuracy of the regularized least squares estimator is established. It is shown that with a "proper" choice of the model order, this estimator is minimax optimal in order. These considerations also lead to a nonasymptotic upper bound on the mean squared error of the associated one-step predictor. A numerical study compares common model selection procedures with the minimax optimal order choice.

1. Introduction. The standard methods for estimating parameters of time series are based on the assumption that the observations come from an autoregressive (AR), moving average (MA), or mixed (ARMA) model of known order. This assumption can rarely be justified in practice, and a less stringent assumption is that the time series data are observations from a linear stationary process. A common approach to modeling linear stationary processes is based on an AR approximation: a finite-order AR model is fitted to the observations. The order of the AR model should provide an "optimal" finite AR approximation to the process, and it is usually chosen by data-driven selection procedures. This nonparametric AR approach to modeling linear stationary processes has been investigated by Shibata (1980), Bhansali (1981, 1986), An, Chen and Hannan (1982) and Hannan and Kavalieris (1986). Shibata (1980) considered the problem of predicting a Gaussian infinite-order AR process by fitting a finite AR model.
The notion of optimality for the model selection procedure proposed by Shibata (1980) is based on an asymptotic lower bound on the mean squared prediction error: a procedure is asymptotically efficient if it attains this lower bound asymptotically. Shibata (1980) also established that the final prediction error (FPE) [Akaike (1970)] and AIC [Akaike (1974)] criteria are asymptotically efficient in this sense, provided that the linear process does not degenerate to a finite-order autoregression. A similar result has been obtained
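To make the procedure described above concrete, the following is a minimal sketch of fitting finite-order AR models by ordinary least squares and choosing the order with the AIC criterion of Akaike (1974). This is an illustration only, not the paper's regularized estimator or its minimax order choice; the simulation setup, the exponential decay rate, and all names (`fit_ar_ls`, `k_hat`, etc.) are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a proxy for an AR(infinity) process: a high-order AR truncation
# with exponentially decaying coefficients (decay rate 0.4 is an assumption;
# the coefficients sum to less than 1, so the process is stationary).
true_coefs = 0.4 ** np.arange(1, 16)
x = np.zeros(600)
L = len(true_coefs)
for t in range(L, len(x)):
    # x[t-L:t][::-1] lists x[t-1], x[t-2], ..., x[t-L]
    x[t] = true_coefs @ x[t - L:t][::-1] + rng.standard_normal()
x = x[100:]  # drop burn-in

def fit_ar_ls(x, k):
    """Ordinary least squares fit of an AR(k) model.

    Returns the coefficient vector (lag 1 first) and the residual variance.
    """
    y = x[k:]
    # Column j holds the series lagged by j+1 steps.
    X = np.column_stack([x[k - j:len(x) - j] for j in range(1, k + 1)])
    coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coefs
    return coefs, resid @ resid / len(y)

# AIC(k) = n * log(sigma_hat^2) + 2k; choose the minimizing order.
orders = range(1, 21)
aic = [len(x) * np.log(fit_ar_ls(x, k)[1]) + 2 * k for k in orders]
k_hat = orders[int(np.argmin(aic))]

# One-step predictor from the fitted AR(k_hat) model:
# x[:-k_hat-1:-1] lists the last k_hat observations, most recent first.
coefs, _ = fit_ar_ls(x, k_hat)
one_step = coefs @ x[:-k_hat - 1:-1]
print("selected order:", k_hat)
```

In practice the candidate order range and the criterion (AIC, FPE, BIC) are design choices; the paper's point is precisely that such data-driven choices should be compared against the minimax optimal order.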
References

[1] D. Politis et al. Statistical Estimation, 2022.
[2] H. Akaike. Statistical predictor identification, 1970.
[3] K. Berk. Consistent Autoregressive Spectral Estimates, 1974.
[4] H. Akaike. A new look at the statistical model identification, 1974.
[5] E. Parzen. Some recent advances in time series modeling, 1974.
[6] G. Schwarz. Estimating the Dimension of a Model, 1978.
[7] S. Kay et al. Gaussian Random Processes, 1978.
[8] R. Shibata. Asymptotically Efficient Selection of the Order of the Model for Estimating Parameters of a Linear Process, 1980.
[9] P. Hall et al. Martingale Limit Theory and Its Application, 1980.
[10] R. Shibata. An Optimal Autoregressive Spectral Estimate, 1981.
[11] R. Bhansali. Effects of Not Knowing the Order of an Autoregressive Process on the Mean Squared Error of Prediction—I, 1981.
[12] E. Hannan et al. Autocorrelation, Autoregression and Autoregressive Approximation, 1982.
[13] E. Parzen et al. Autoregressive Spectral Estimation, 1983.
[14] J. Rissanen et al. Universal coding, information, prediction, and estimation, IEEE Trans. Inf. Theory, 1984.
[15] R. C. Bradley. Basic Properties of Strong Mixing Conditions, 1985.
[16] E. J. Hannan et al. Regression, Autoregression Models, 1986.
[17] R. J. Bhansali et al. Asymptotically Efficient Selection of the Order by the Criterion Autoregressive Transfer Function, 1986.
[18] L. Saulis et al. Limit theorems for large deviations, 1991.
[19] A. Tsybakov et al. Minimax theory of image reconstruction, 1993.
[20] P. Doukhan. Mixing: Properties and Examples, 1994.
[21] B. Levit et al. Asymptotically efficient estimation of analytic functions in Gaussian noise, 1996.
[22] D. Bosq et al. Nonparametric statistics for stochastic processes, 1996.
[23] S. Efromovich. Data-Driven Efficient Estimation of the Spectral Density, 1998.
[24] A. Goldenshluger et al. Nonparametric Estimation of Transfer Functions: Rates of Convergence and Adaptation, IEEE Trans. Inf. Theory, 1998.