SUMMARY. The present paper considers a family of Stein-rule estimators for estimating the coefficient vector of a linear regression model whose disturbance covariance matrix depends on a few unknown parameters. An Edgeworth-type asymptotic expansion for the distribution of the Stein-rule estimator is derived, and its performance is compared with that of the FGLS estimator using the criterion of risk under quadratic loss and the criterion of concentration of the distribution around the true value.

When the covariance matrix of the disturbances in a linear regression model is nonscalar and depends upon a few unknown parameters, the regression coefficients can be estimated by a two-step procedure: first obtain consistent estimates of the parameters involved in the disturbance covariance matrix, and then use these estimates to obtain a feasible generalised least squares (FGLS) estimator. Rothenberg (1984) considered the distribution of the FGLS estimator and derived an asymptotic expansion for it. When the disturbances are homoscedastic, several biased estimators obtained by shrinking the least squares estimator towards the null vector have been developed (see Judge and Bock, 1978, and Vinod and Ullah, 1981). These estimators are better than the conventional unbiased least squares estimator in the sense of having lower risk under quadratic loss, and are popularly known as Stein-rule estimators. For the comparison of two estimators, apart from the usual mean squared error (MSE) criterion, an alternative criterion based on the probabilities of closeness of the competing estimators to the unknown parameter, termed Pitman Nearness (PN), has also drawn much attention in the recent past. For a discussion of PN and of the discrepancies between minimum MSE and PN, one may refer to Rao (1981) and Rao et al. (1986). Peddada (1985) established an equivalence between PN, minimum MSE and minimum mean absolute error under certain conditions. Peddada and Khattree (1986) and Khattree and Peddada (1987) provided relationships between PN and the variances of the estimators for the univariate and multivariate cases.
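As a point of reference, the standard textbook forms of the estimators and the PN criterion discussed above are sketched below in LaTeX notation. The notation (y, X, \beta, \Omega(\theta), b, T, p, k, L) is assumed here and need not match the paper's; in particular, the shrinkage scalar and the specific member of the Stein-rule family studied in the paper may differ, so these expressions are illustrative rather than the paper's own definitions.

% Two-step FGLS for y = X\beta + u with E(uu') = \Omega(\theta), \theta unknown:
% Step 1: estimate \theta consistently by \hat{\theta};
% Step 2: plug \hat{\Omega} = \Omega(\hat{\theta}) into the GLS formula.
\[
  \hat{\beta}_{\mathrm{FGLS}}
  = \bigl(X'\hat{\Omega}^{-1}X\bigr)^{-1} X'\hat{\Omega}^{-1} y,
  \qquad \hat{\Omega} = \Omega(\hat{\theta}).
\]

% A representative Stein-rule estimator (Judge-Bock type), shrinking the least
% squares estimator b = (X'X)^{-1}X'y towards the null vector; T is the sample
% size, p the number of regressors, and k > 0 a nonstochastic scalar. In the
% standard homoscedastic normal case, 0 < k < 2(p-2) with p >= 3 gives lower
% risk than b under the weighted quadratic loss (\hat{\beta}-\beta)'X'X(\hat{\beta}-\beta).
\[
  \hat{\beta}_{\mathrm{SR}}
  = \left[\, 1 - \frac{k}{T-p+2}\,
      \frac{(y-Xb)'(y-Xb)}{b'X'Xb} \right] b .
\]

% Pitman Nearness of \hat{\beta}_1 relative to \hat{\beta}_2 under a loss L:
% \hat{\beta}_1 is PN-preferred to \hat{\beta}_2 if this probability exceeds 1/2.
\[
  \mathrm{PN}(\hat{\beta}_1,\hat{\beta}_2)
  = P\bigl\{ L(\hat{\beta}_1,\beta) < L(\hat{\beta}_2,\beta) \bigr\} .
\]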
[1] S. Peddada. A short note on Pitman's measure of nearness, 1985.
[2] T. J. Rothenberg et al. Approximate normality of generalized least squares estimates, 1984.
[3] A. Ullah et al. The sampling distribution of shrinkage estimators and their F-ratios in the regression model, 1984.
[4] A. Ullah et al. Recent Advances in Regression Methods, 1983.
[5] C. R. Rao. Some comments on the minimum mean square error as a criterion of estimation, 1980.
[6] R. Khattree et al. A short note on Pitman nearness for elliptically symmetric estimators, 1987.
[7] C. R. Rao et al. The Pitman nearness criterion and its determination, 1986.
[8] R. Khattree et al. On Pitman nearness and variance of estimators, 1986.
[9] G. G. Judge and M. E. Bock. The statistical implications of pre-test and Stein-rule estimators in econometrics, 1978.