The Gradient Projection Method Along Geodesics
The method of steepest descent for solving unconstrained minimization problems is well understood. It is known, for instance, that when applied to a smooth objective function f, and converging to a solution point x where the corresponding Hessian matrix F(x) is positive definite, the asymptotic rate of convergence is given by the Kantorovich ratio (β − α)²/(β + α)², where α and β are respectively the smallest and largest eigenvalues of the Hessian matrix F(x). This result is one of the major sharp results on the convergence of minimization algorithms. In this paper a corresponding result is given for the gradient projection method for solving constrained minimization problems. It is shown that the asymptotic rate of convergence of gradient projection methods is also given by a Kantorovich ratio, but with α and β determined by the Lagrangian associated with the problem. Specifically, if L is the Hessian of the Lagrangian evaluated at the solution, α and β are the smallest and largest eigenvalues of L restricted to the subspace tangent to the constraint surface.
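The quantitative claim is easy to check numerically. Below is a minimal sketch (in Python with NumPy, which is of course not part of the paper) comparing the observed per-step reduction in f against the predicted Kantorovich ratio, first for unconstrained steepest descent with exact line search, then for gradient projection under a single linear equality constraint aᵀx = 0. The dimension, the eigenvalues 1 through 10, and the constraint itself are illustrative choices; with a linear constraint the geodesics are straight lines and the Lagrangian Hessian coincides with F, so the predicted α and β are the extreme eigenvalues of F restricted to the tangent subspace, matching the paper's statement.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Illustrative SPD "Hessian" F with known extreme eigenvalues
# alpha = 1 and beta = 10 (made-up values for the demo).
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigs = np.array([1.0, 2.0, 4.0, 7.0, 10.0])
F = Q @ np.diag(eigs) @ Q.T

def f(x):
    # Quadratic objective, minimized at x = 0.
    return 0.5 * x @ F @ x

def kantorovich(alpha, beta):
    # The ratio (beta - alpha)^2 / (beta + alpha)^2 from the abstract.
    return ((beta - alpha) / (beta + alpha)) ** 2

def observed_rate(x0, proj, iters=60):
    # Per-step reduction f(x_{k+1}) / f(x_k) for steepest descent with
    # exact line search, with search directions projected by `proj`.
    x, prev, ratio = x0, f(x0), None
    for _ in range(iters):
        g = proj(F @ x)                # (projected) gradient
        t = (g @ g) / (g @ F @ g)      # exact minimizing step along -g
        x = x - t * g
        cur = f(x)
        ratio, prev = cur / prev, cur
    return ratio

# Unconstrained case: rate set by the eigenvalues of F itself.
print("unconstrained predicted:", kantorovich(1.0, 10.0))
print("unconstrained observed: ", observed_rate(rng.standard_normal(n), lambda g: g))

# Gradient projection for one linear constraint a^T x = 0.  Here the
# Lagrangian Hessian equals F, so alpha and beta are its extreme
# eigenvalues restricted to the tangent subspace null(a^T).
a = rng.standard_normal(n)
P = np.eye(n) - np.outer(a, a) / (a @ a)       # projector onto null(a^T)
Z = np.linalg.svd(a.reshape(1, -1))[2][1:].T   # orthonormal tangent basis
w = np.linalg.eigvalsh(Z.T @ F @ Z)            # restricted eigenvalues
print("constrained predicted:  ", kantorovich(w.min(), w.max()))
print("constrained observed:   ", observed_rate(P @ rng.standard_normal(n), lambda g: P @ g))
```

For generic starting points the observed ratios settle at the predicted values (roughly 0.669 for the unconstrained run above), since the iterates asymptotically zigzag in the plane spanned by the extreme eigenvectors; the constrained run is equivalent to steepest descent on the reduced quadratic with Hessian ZᵀFZ, which is exactly the restriction the theorem refers to.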
[1] J. B. Rosen. The Gradient Projection Method for Nonlinear Programming, Part I: Linear Constraints, 1960.
[2] L. Kantorovich et al. Functional Analysis and Applied Mathematics, 1963.
[3] J. Daniel. The Conjugate Gradient Method for Linear and Nonlinear Operator Equations, 1967.
[4] D. Luenberger. Optimization by Vector Space Methods, 1968.