Summary
For a general class of scalar stationary processes, essentially those for which the best linear predictor is the best predictor (in the mean square sense), it is shown that, under fairly minor additional conditions, the sample autocorrelations converge to the true values almost surely and uniformly in the lag, t, at the rate (T^{-1} log T)^{1/2}, where T is the sample size. For ARMA processes, if |t| ≤ (log T)^a, a < ∞, the rate is the best possible, namely (T^{-1} log log T)^{1/2}. In particular, the somewhat implausible condition on the innovations, that E{e(t)^2 | F_{t-1}} is constant, is avoided in these results. The theorems are used to discuss autoregressive approximation. When the stationary process is a vector process, the condition on the innovation sequence, e(t), that E{e(t)e(t)′ | F_{t-1}} be constant cannot be entirely avoided in relation to autoregressive approximation. This is also discussed.
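As a rough numerical illustration (not taken from the paper), the sketch below simulates a stationary AR(1) process, computes its sample autocorrelations over a range of lags, and compares the maximum estimation error with the rate (T^{-1} log T)^{1/2} discussed above. The parameter choices (phi, T, max_lag) and the AR(1) model itself are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: sample autocorrelations of an AR(1) process
# x(t) = phi*x(t-1) + e(t), whose true autocorrelations are rho(t) = phi^|t|.
# phi, T and max_lag are arbitrary illustrative choices.
rng = np.random.default_rng(0)
phi, T, max_lag = 0.5, 20000, 10

# Simulate the AR(1) with a burn-in period so the series is near-stationary.
burn = 100
e = rng.standard_normal(T + burn)
x = np.empty(T + burn)
x[0] = e[0]
for t in range(1, T + burn):
    x[t] = phi * x[t - 1] + e[t]
x = x[burn:]

# Sample autocorrelations rho_hat(t), t = 0, ..., max_lag.
xc = x - x.mean()
denom = np.dot(xc, xc)
rho_hat = np.array([np.dot(xc[:T - t], xc[t:]) / denom
                    for t in range(max_lag + 1)])
rho_true = phi ** np.arange(max_lag + 1)

# Maximum error over the lags, against the (T^{-1} log T)^{1/2} rate.
max_err = np.abs(rho_hat - rho_true).max()
rate = np.sqrt(np.log(T) / T)
print(max_err, rate)
```

In repeated runs, max_err stays within a small constant multiple of the rate, which is what the almost-sure uniform convergence result leads one to expect for this example.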