The Gradient Projection Method Along Geodesics

The method of steepest descent for solving unconstrained minimization problems is well understood. It is known, for instance, that when the method is applied to a smooth objective function f and converges to a solution point x at which the Hessian matrix F(x) is positive definite, its asymptotic rate of convergence is given by the Kantorovich ratio (β − α)²/(β + α)², where α and β are respectively the smallest and largest eigenvalues of F(x). This is one of the major sharp results on the convergence of minimization algorithms. In this paper a corresponding result is given for the gradient projection method for solving constrained minimization problems. It is shown that the asymptotic rate of convergence of the gradient projection method is also given by a Kantorovich ratio, but with α and β determined by the Lagrangian associated with the problem. Specifically, if L is the Hessian of the Lagrangian evaluated at the solution, then α and β are the smallest and largest eigenvalues of L restricted to the subspace tangent to the constraint surface.
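
To make the unconstrained rate concrete, the following is a minimal numerical sketch (not from the paper) of steepest descent with exact line search on a quadratic f(x) = ½ xᵀQx. The names Q, alpha, and beta and the choice of starting point are illustrative assumptions; from the classical worst-case start (1/α, 1/β), every iteration reduces the function value by exactly the Kantorovich ratio.

```python
import numpy as np

# Steepest descent with exact line search on f(x) = 0.5 * x^T Q x,
# with Q symmetric positive definite. For a quadratic, the exact
# step length along the negative gradient g is t = (g.g) / (g.Qg).
alpha, beta = 1.0, 10.0            # illustrative extreme eigenvalues of Q
Q = np.diag([alpha, beta])

# Worst-case starting point (1/alpha, 1/beta): from here the iterates
# zigzag and attain the Kantorovich bound exactly at every step.
x = np.array([1.0 / alpha, 1.0 / beta])

f = lambda v: 0.5 * v @ Q @ v
kantorovich = ((beta - alpha) / (beta + alpha)) ** 2
print(f"Kantorovich ratio: {kantorovich:.6f}")

for k in range(5):
    g = Q @ x                      # gradient of the quadratic
    t = (g @ g) / (g @ Q @ g)      # exact line-search step length
    x_new = x - t * g
    print(f"step {k}: f reduction factor = {f(x_new) / f(x):.6f}")
    x = x_new
```

The paper's constrained analogue has the same form: F(x) is replaced by the Hessian of the Lagrangian, with its eigenvalues taken on the subspace tangent to the constraint surface rather than on the whole space.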