This paper considers the maximum likelihood estimator of the first order moving average process when the true value of the coefficient is one. The results are also extended to regression analysis. It is shown that there is a local maximum of the likelihood function within an interval of O(T^{-1}) of the true value, and also that the probability that the maximum occurs exactly at the true value can be calculated in finite samples.

IN THIS PAPER, we consider the maximum likelihood estimator (MLE) of the coefficient (p) of a first order moving average (MA(1)) process when the true value of the coefficient is plus or minus one. The results are also extended to regression equations with errors generated by the MA(1) process. Our study has been motivated chiefly by two factors. First, it has sometimes been suggested that the occurrence of a maximum near or at p = 1 could be interpreted as evidence of "over-differencing" [4, 13]. While this does not present any difficulty for exact maximum likelihood estimation procedures [12], the properties of the maximum likelihood estimates of p when the true p = 1 must be known in order to apply conventional test procedures (e.g. the likelihood ratio test). Second, Kang [9] has shown that the likelihood function is stationary at p = 1, and simulation studies of the behavior of the MLE of p reveal that the global maximum can occur at p = 1 even when the true p lies within the invertibility region [9, 5].

There are two main results in this paper. First, we show that there is a local maximum p̂ of the likelihood function such that (p̂ - 1) is of O(T^{-1}), but that p̂ is not asymptotically normally distributed. Second, we show that the exact probability that p = 1 is a local maximum of the likelihood function can be calculated in finite samples.
Although we focus only on local maxima near or at p = 1, arguments similar to those of Jennrich [8] or Sargan [14] show that the global maximum has probability limit one in large samples, so the properties of these local maxima would seem to be of primary interest asymptotically in considering the global MLE.
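The pile-up of the estimate exactly at p = 1 described above can be illustrated with a small Monte Carlo sketch. This is not the paper's exact maximum likelihood procedure: it minimizes the conditional sum of squares (with the presample error set to zero) over a grid on [0, 1], and the sample size, grid, and replication count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def css(p, y):
    # Conditional sum of squares for the MA(1) model y_t = e_t + p*e_{t-1},
    # with the presample error e_0 set to zero.
    e = 0.0
    s = 0.0
    for yt in y:
        e = yt - p * e
        s += e * e
    return s

def fit_ma1(y, grid):
    # Grid-search minimizer of the conditional sum of squares.
    vals = [css(p, y) for p in grid]
    return grid[int(np.argmin(vals))]

T, reps = 50, 200
grid = np.linspace(0.0, 1.0, 201)
pile_up = 0
for _ in range(reps):
    e = rng.standard_normal(T + 1)
    y = e[1:] + e[:-1]          # true p = 1: the non-invertible boundary
    if fit_ma1(y, grid) == 1.0:
        pile_up += 1

print(f"fraction of estimates exactly at p = 1: {pile_up / reps:.2f}")
```

The estimate landing exactly on the boundary p = 1 in a nonnegligible fraction of replications is the finite-sample phenomenon whose probability the paper computes exactly.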
[1] E. T. Whittaker et al., A Course of Modern Analysis, 2021.
[2] J. Durbin et al., "Testing for Serial Correlation in Least Squares Regression. II.," Biometrika, 1950.
[3] T. W. Anderson et al., "Asymptotic Theory of Certain 'Goodness of Fit' Criteria Based on Stochastic Processes," 1952.
[4] U. Grenander et al., Statistical Analysis of Stationary Time Series, 1957.
[5] R. Jennrich, "Asymptotic Properties of Non-Linear Least Squares Estimators," 1969.
[6] G. M. Jenkins et al., Time Series Analysis, Forecasting and Control, 1972.
[7] J. Sargan et al., "On the Theory and Application of the General Linear Model," 1970.
[8] M. Pesaran, "Exact Maximum Likelihood Estimation of a Regression Equation with a First-Order Moving-Average Error," 1973.
[9] I. I. Berenblut et al., "A New Test for Autocorrelated Errors in the Linear Regression Model," 1973.
[10] G. W. Schwert et al., "Estimation of a Non-Invertible Moving Average Process: The Case of Overdifferencing," 1977.
[11] A. Bhargava et al., "Testing Residuals from Least Squares Regression for Being Generated by the Gaussian Random Walk," 1983.