Unified theory of least squares

Let (Y, Xβ, σ²G), where E(Y) = Xβ and D(Y) = σ²G, be the Gauss-Markoff model, where A' denotes the transpose of the matrix A. Further let β* be a stationary point (supposed to exist for all Y) of (Y − Xβ)'M(Y − Xβ); i.e., a value of β at which its derivative with respect to β is the zero vector. It is shown that if p'β* is the BLUE of p'β for every p ∈ 𝒞(X'), the linear space generated by the columns of X', and an unbiased estimator of σ² is (Y − Xβ*)'M(Y − Xβ*)/f with f = R(G:X) − R(X), where R(V) denotes the rank of V, then it is necessary and sufficient that M is a symmetric g-inverse of (G + XUX'), where U is any symmetric matrix such that 𝒞(X) ⊂ 𝒞(G + XUX'). The method is valid whether G is singular or not and whether R(X) is full or not. A simple choice of U is always U = k²I, k ≠ 0.
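As a concrete illustration, the following is a minimal numerical sketch of the result, assuming the simple choice U = k²I: M is taken as the Moore-Penrose inverse of G + k²XX' (one particular symmetric g-inverse), β* is obtained from the stationary equation X'MXβ = X'MY, and σ² is estimated with the divisor f = R(G:X) − R(X). The function name unified_least_squares and all variable names are illustrative, not from the paper.

```python
import numpy as np

def unified_least_squares(Y, X, G, k=1.0):
    """Sketch of unified least squares with the choice U = k^2 I.

    Y : (n,) response vector, X : (n, m) design matrix,
    G : (n, n) dispersion matrix (may be singular), k != 0.
    """
    # T = G + X U X' with U = k^2 I
    T = G + (k ** 2) * X @ X.T
    # Moore-Penrose inverse of T is one symmetric g-inverse of T
    M = np.linalg.pinv(T)
    # Stationary point of (Y - Xb)'M(Y - Xb): solve X'M X b = X'M Y
    XtM = X.T @ M
    beta = np.linalg.pinv(XtM @ X) @ (XtM @ Y)
    # Degrees of freedom f = R(G:X) - R(X)
    f = np.linalg.matrix_rank(np.hstack([G, X])) - np.linalg.matrix_rank(X)
    resid = Y - X @ beta
    sigma2_hat = (resid @ M @ resid) / f if f > 0 else np.nan
    return beta, sigma2_hat, f

# Example with a singular dispersion matrix G
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(6), np.arange(6.0)])
G = np.diag([1.0, 1.0, 1.0, 1.0, 1.0, 0.0])   # singular, yet no G inverse is needed
Y = X @ np.array([2.0, 0.5]) + rng.multivariate_normal(np.zeros(6), G)
print(unified_least_squares(Y, X, G))
```

The example deliberately uses a singular G: because the method works through a g-inverse of G + XUX' rather than G⁻¹, the same computation applies whether G is singular or not, as stated in the abstract.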