Spectral expressions of information measures of Gaussian time series and their relation to AIC and CAT

Interrelations among the spectral expressions of the information measures of Kullback and Leibler (1959) and Rényi (1961) for discriminating between two stationary Gaussian time series are discussed. The spectral expression of Fisher's information rate matrix is also treated, together with two intuitively acceptable discrimination functions. It is shown that all of these measures are equivalent up to scalar multiplication and that, in their second-order Taylor series approximations, they are expressed by Fisher's information rate matrix. Finally, a relation between two criteria for determining the order of time series models, namely Akaike's (1974) information criterion (AIC) and Parzen's criterion of autoregressive transfer functions (CAT), is discussed in connection with these spectral expressions.
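As background for the abstract above (these formulas are not reproduced in the abstract itself, so the exact normalization used by the paper is an assumption here), the spectral expression of the Kullback-Leibler information rate between two zero-mean stationary Gaussian processes is commonly written in the following Whittle/Itakura-Saito form, and its second-order expansion yields Fisher's information rate matrix:

```latex
% Kullback-Leibler information rate between two zero-mean stationary
% Gaussian processes with spectral densities f and g:
\[
  I(f;g) \;=\; \frac{1}{4\pi}\int_{-\pi}^{\pi}
  \left[\frac{f(\lambda)}{g(\lambda)}
        - \log\frac{f(\lambda)}{g(\lambda)} - 1\right] d\lambda .
\]
% For a parametric family f_\theta, set g = f_{\theta_0} and
% f = f_{\theta_0 + \Delta}. A second-order Taylor expansion in \Delta gives
%   I \approx \tfrac{1}{2}\, \Delta^{\top} J(\theta_0)\, \Delta ,
% where J is Fisher's information rate matrix (Whittle's formula):
\[
  J_{ij}(\theta) \;=\; \frac{1}{4\pi}\int_{-\pi}^{\pi}
  \frac{\partial \log f_\theta(\lambda)}{\partial \theta_i}\,
  \frac{\partial \log f_\theta(\lambda)}{\partial \theta_j}\; d\lambda .
\]
```

The equivalence claimed in the abstract is of this kind: each discrimination measure, expanded to second order around equal spectra, reduces to a quadratic form in this same matrix, differing only by a scalar factor.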

[1] H. Tong, "A note on a local equivalence of two recent approaches to autoregressive order determination," 1979.

[2] A. N. Kolmogorov et al., "Interpolation and extrapolation of stationary random sequences," 1962.

[3] J. Shore, "Minimum cross-entropy spectral analysis," 1981.

[4] S. Kullback et al., "Information Theory and Statistics," 1959.

[5] J. Burg et al., "Multisignal minimum-cross-entropy spectrum analysis with weighted initial estimates," 1984.

[6] R. Shumway et al., "Linear Discriminant Functions for Stationary Time Series," 1974.

[7] P. Whittle, "The Analysis of Multiple Stationary Time Series," 1953.

[8] H. Akaike, "A new look at the statistical model identification," 1974.

[9] F. Itakura et al., "A statistical method for estimation of speech spectral density and formant frequencies," 1970.

[10] H. T. Davis et al., "Estimation of the Innovation Variance of a Stationary Time Series," 1968.

[11] T. Wada et al., "Minimum Cross Entropy and Informational Approaches for Spectrum Estimation," 1985.

[12] H. Akaike et al., "Information Theory and an Extension of the Maximum Likelihood Principle," 1973.

[13] T. Kailath, "The Divergence and Bhattacharyya Distance Measures in Signal Selection," 1967.

[14] E. Parzen, "Some recent advances in time series modeling," 1974.

[15] R. Shibata, "Asymptotically Efficient Selection of the Order of the Model for Estimating Parameters of a Linear Process," 1980.

[16] P. Papantoni-Kazakos et al., "Spectral distance measures between Gaussian processes," ICASSP, 1980.

[17] D. Kazakos, "Spectral distance measures between continuous-time vector Gaussian processes," IEEE Trans. Inf. Theory, 1982.

[18] R. Bhansali, "The Criterion Autoregressive Transfer Function of Parzen," 1986.

[19] R. M. Gray, "On the asymptotic eigenvalue distribution of Toeplitz matrices," IEEE Trans. Inf. Theory, 1972.

[20] A. Rényi, "On Measures of Entropy and Information," 1961.

[21] R. Shibata, "An Optimal Autoregressive Spectral Estimate," 1981.