Kalman-filtering methods for computing information matrices for time-invariant, periodic, and generally time-varying VARMA models and samples

Abstract: Under general conditions, the inverse sample information matrix provides a Cramér-Rao lower bound on the covariance matrix of a model's parameter estimates, and the inverse asymptotic information matrix is the asymptotic covariance matrix of the parameter estimates. The paper does two things. First, it derives a recursive Kalman-filtering method for computing exact sample and asymptotic information matrices for time-invariant, periodic, or generally time-varying Gaussian vector autoregressive moving-average (VARMA) models and samples. Second, it specializes the recursive method to a nonrecursive method for computing exact asymptotic information matrices for time-invariant or periodic VARMA models and samples.
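For orientation, the quantities the abstract refers to are commonly written as follows; the expressions below are the standard textbook forms, stated in generic notation (innovations e_t, innovation covariances F_t, spectral density f(ω; θ)) rather than the paper's own, and the paper's exact recursions need not take precisely this shape. The sample information matrix of a Gaussian model is usually expressed through the prediction-error (innovations) decomposition of the likelihood, which a Kalman filter delivers term by term, and the asymptotic information matrix of a stationary Gaussian VARMA process has Whittle's frequency-domain form:

% Prediction-error-decomposition form of the sample information matrix of a
% Gaussian model with innovations e_t(\theta) and innovation covariances F_t(\theta):
\[
  \mathcal{I}_T(\theta)_{ij}
  = \sum_{t=1}^{T} \mathbb{E}\!\left[
      \frac{\partial e_t'}{\partial \theta_i}\, F_t^{-1}\, \frac{\partial e_t}{\partial \theta_j}
    \right]
  + \frac{1}{2} \sum_{t=1}^{T}
      \operatorname{tr}\!\left(
        F_t^{-1} \frac{\partial F_t}{\partial \theta_i}\,
        F_t^{-1} \frac{\partial F_t}{\partial \theta_j}
      \right).
\]
% Whittle's frequency-domain form of the per-observation asymptotic information
% matrix of a stationary Gaussian VARMA process with spectral density f(\omega;\theta):
\[
  \mathcal{I}_\infty(\theta)_{ij}
  = \frac{1}{4\pi} \int_{-\pi}^{\pi}
      \operatorname{tr}\!\left(
        f^{-1}(\omega)\, \frac{\partial f(\omega)}{\partial \theta_i}\,
        f^{-1}(\omega)\, \frac{\partial f(\omega)}{\partial \theta_j}
      \right) d\omega .
\]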
