Unified theory of linear estimation

We consider a general Gauss-Markoff model (Y, Xβ, σ²V), where E(Y) = Xβ and D(Y) = σ²V. There may be a deficiency in R(X), the rank of X, and V may be singular. Two unified approaches to the problem of finding BLUE's (minimum variance linear unbiased estimators) have been suggested. One is a direct approach in which the problem of inference on the unknown β is reduced to the numerical evaluation of the inverse of a partitioned matrix. The second is an analogue of least squares, in which the matrix used to define the quadratic form in (Y - Xβ) to be minimized is a g-inverse of (V + XUX') in all situations, whether V is nonsingular or not; here U is arbitrary, subject to a condition. Complete robustness of BLUE's under different alternatives for V has been examined. A study of BLE's (minimum mean square estimators), without demanding unbiasedness, is initiated, and a case is made for their further examination. The unified approach is made possible through recent advances in the calculus of generalized inverses of matrices (see the book by Rao and Mitra, 1971a).
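As an illustrative numerical sketch of the second approach (not taken from the paper), the Python fragment below minimizes the quadratic form (Y - Xβ)'T⁻(Y - Xβ) with T = V + XUX', taking the common choice U = I and using the Moore-Penrose inverse as the particular g-inverse; one solution of the resulting normal equations is β̂ = (X'T⁻X)⁻X'T⁻Y. The design matrix, the dispersion matrix, and the choice U = I are assumptions made for illustration only; since X is rank deficient, β̂ itself is not unique, and only estimable functions such as the fitted values Xβ̂ are uniquely determined.

```python
import numpy as np

rng = np.random.default_rng(0)

# Rank-deficient design matrix X (third column = first + second).
X = np.array([[1.0, 0.0, 1.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]])

# Singular nonnegative definite dispersion matrix V = A A' (rank 2, order 4).
A = rng.standard_normal((4, 2))
V = A @ A.T

# Simulated observation consistent with the model: the error has dispersion V.
beta_true = np.array([1.0, 2.0, 0.5])
Y = X @ beta_true + A @ rng.standard_normal(2)

# Unified least squares: T = V + X U X' with U = I; pinv supplies one g-inverse.
T = V + X @ X.T
T_g = np.linalg.pinv(T)
beta_hat = np.linalg.pinv(X.T @ T_g @ X) @ (X.T @ T_g @ Y)

# beta_hat is one solution of the normal equations; estimable functions of it
# (e.g. the fitted values X beta_hat) do not depend on the g-inverses chosen.
print("fitted values X beta_hat:", X @ beta_hat)
print("true mean     X beta    :", X @ beta_true)
```

The same computation applies unchanged when V is nonsingular, in which case it reproduces ordinary generalized least squares; this is the sense in which the treatment is unified.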