In fitting partially linear statistical models by least squares, several authors have demonstrated that, for fixed values of the nonlinear parameters, optimal values of the linear parameters can be determined analytically. The linear parameters can therefore be eliminated by substitution. This reduction in the problem's dimension appears to greatly facilitate its solution by nonlinear least squares algorithms. In many applications, the partially linear model includes a strictly linear portion. In the present article, it is shown that for such models a considerable further reduction in the necessary computations can be obtained. Simultaneously, the earlier results are extended to a much broader class of models and to weighted least squares, and full-rank assumptions are removed. Under certain conditions that are generally satisfied in practice, the reduced model and sum of squares, considered as functions of the nonlinear parameters, have partial derivatives; analytical expressions for these partial derivatives are obtained.