Elimination of Linear Parameters in Nonlinear Regression
Here $\theta_1, \theta_2, \ldots, \theta_p$ represent the $p$ parameters of the model. The advent of the high-speed digital computer has made the use of any one of a number of iterative algorithms practical. The Gauss-Newton method, the modified Gauss-Newton method due to Hartley, steepest descent, and the Marquardt compromise are perhaps the best known; these methods are discussed in detail in references [1], [2], [3] and [4]. In each of these iterative procedures, one must first supply a starting guess for the entire parameter vector $(\theta_1, \theta_2, \ldots, \theta_p)$. A correction vector is then derived and applied to this initial guess to produce an "improved" estimate of the parameter vector, and this process is continued until the correction vector becomes sufficiently small. Under a suitable set of conditions, one can show that this sequence of corrected estimates converges to the least squares estimates of the $p$ parameters. That is, the parameter vector converges to $(\hat\theta_1, \hat\theta_2, \ldots, \hat\theta_p)$, the vector of values which minimizes the residual sum of squares

$$S(\theta_1, \ldots, \theta_p) = \sum_{i=1}^{n} \left[ y_i - f(x_i; \theta_1, \ldots, \theta_p) \right]^2.$$
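As a concrete illustration of the iterative scheme just described, the sketch below runs a plain Gauss-Newton loop in Python. The two-parameter exponential model $f(x; \theta_1, \theta_2) = \theta_1 e^{\theta_2 x}$, the simulated data, and the starting guess are illustrative assumptions, not taken from the paper. At each step the correction vector $\delta$ solves the linearized least squares problem $J\delta \approx r$, and iteration stops when $\delta$ is sufficiently small.

```python
import numpy as np

# Illustrative model (an assumption, not from the paper):
#   f(x; theta) = theta_1 * exp(theta_2 * x)
def f(x, theta):
    return theta[0] * np.exp(theta[1] * x)

def jacobian(x, theta):
    # Columns are the partial derivatives of f with respect to theta_1 and theta_2.
    return np.column_stack([np.exp(theta[1] * x),
                            theta[0] * x * np.exp(theta[1] * x)])

# Simulated data with true parameters (2.0, 1.5) plus a little noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 25)
y = 2.0 * np.exp(1.5 * x) + 0.05 * rng.standard_normal(x.size)

theta = np.array([1.0, 1.0])              # starting guess for (theta_1, theta_2)
for _ in range(50):
    r = y - f(x, theta)                   # residual vector at the current estimate
    J = jacobian(x, theta)
    # Correction vector: least squares solution of the linearized system J @ delta ~ r.
    delta, *_ = np.linalg.lstsq(J, r, rcond=None)
    theta = theta + delta                 # "improved" estimate
    if np.linalg.norm(delta) < 1e-10:     # stop when the correction is sufficiently small
        break

print(theta)                              # should be close to (2.0, 1.5)
```

Note that $\theta_1$ enters this model linearly: for any fixed $\theta_2$, its least squares value has the closed form $\theta_1 = (\phi^T y)/(\phi^T \phi)$ with $\phi = e^{\theta_2 x}$, so it can be eliminated before iterating and the search carried out over $\theta_2$ alone. This is the reduction the paper's title refers to.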
[1] D. M. Ellis et al., Applied Regression Analysis, 1968.
[2] D. W. Marquardt, "An Algorithm for Least-Squares Estimation of Nonlinear Parameters," Journal of the Society for Industrial and Applied Mathematics, vol. 11, no. 2, pp. 431-441, 1963.