Constructing First Order Stationary Autoregressive Models via Latent Processes

First-order stationary autoregressive (AR(1)) models are introduced for which there is a linear relationship in expectation between successive observations, and for which the marginal distribution can readily be arranged to be other than normal.
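As an illustration of the kind of construction the abstract describes, the sketch below simulates a stationary AR(1) series with a gamma marginal by introducing a conjugate Poisson latent variable at each step. This is a minimal sketch of one well-known latent-process scheme consistent with the abstract, not necessarily the authors' exact formulation; the parameter names `alpha`, `beta`, `c` and the function `simulate_gamma_ar1` are illustrative assumptions. In this scheme the marginal stays Gamma(alpha, beta) while E[X_t | X_{t-1}] = (alpha + c X_{t-1}) / (beta + c), i.e. linear in the previous observation.

```python
# Minimal sketch (not the authors' code) of a latent-process AR(1) with a gamma
# marginal, assuming the conjugate gamma-Poisson scheme:
#   Z_t | X_{t-1} ~ Poisson(c * X_{t-1})
#   X_t | Z_t     ~ Gamma(alpha + Z_t, rate = beta + c)
# Parameter names and the function name are illustrative assumptions.
import numpy as np


def simulate_gamma_ar1(n, alpha=2.0, beta=1.0, c=3.0, seed=0):
    """Simulate n observations of a stationary gamma AR(1) via a latent Poisson
    variable.  The stationary marginal is Gamma(alpha, rate=beta), and the
    conditional mean E[X_t | X_{t-1}] = (alpha + c*X_{t-1}) / (beta + c) is
    linear in X_{t-1}, with lag-1 autocorrelation c / (beta + c)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = rng.gamma(alpha, 1.0 / beta)           # start in the stationary law
    for t in range(1, n):
        z = rng.poisson(c * x[t - 1])             # latent count driving the transition
        x[t] = rng.gamma(alpha + z, 1.0 / (beta + c))
    return x


if __name__ == "__main__":
    xs = simulate_gamma_ar1(50_000)
    # With the defaults, sample mean and variance should be near alpha/beta = 2
    # and alpha/beta**2 = 2, and lag-1 autocorrelation near c/(beta + c) = 0.75.
    print(xs.mean(), xs.var(), np.corrcoef(xs[:-1], xs[1:])[0, 1])
```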
