Dual variable metric algorithms for constrained optimization

We present a class of algorithms for solving constrained optimization problems. At each iteration, a non-negatively constrained quadratic programming subproblem is solved to obtain an estimate of the Lagrange multipliers, and this estimate is used to generate a sequence of points converging to the solution. To achieve a superlinear rate of convergence, the matrix appearing in the subproblem is required to approximate the inverse of the Hessian of the Lagrangian or of a penalty Lagrangian. Well-known variable metric updates, such as the BFGS update, are employed to generate this matrix, and the resulting algorithm converges locally at a superlinear rate.
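To make the scheme concrete, the following is a minimal numerical sketch in Python. It assumes the standard form min f(x) subject to g(x) >= 0 with Lagrangian L(x, u) = f(x) - u^T g(x), and takes the dual subproblem to be min over u >= 0 of (1/2)(grad f - J^T u)^T H (grad f - J^T u) + g(x)^T u, one common instantiation of a dual variable metric method; the test problem, starting point, and scaling of the initial matrix H are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical test problem:  min (x0-1)^2 + (x1-2.5)^2  s.t.  x0^2 + x1^2 <= 1,
# written as g(x) = 1 - x0^2 - x1^2 >= 0.  Solution: projection of (1, 2.5)
# onto the unit disk.
def f(x):      return (x[0] - 1.0)**2 + (x[1] - 2.5)**2
def grad_f(x): return np.array([2.0*(x[0] - 1.0), 2.0*(x[1] - 2.5)])
def g(x):      return np.array([1.0 - x[0]**2 - x[1]**2])   # g(x) >= 0
def jac_g(x):  return np.array([[-2.0*x[0], -2.0*x[1]]])    # rows: grad g_i

def grad_lagrangian(x, u):
    # L(x, u) = f(x) - u^T g(x), u >= 0
    return grad_f(x) - jac_g(x).T @ u

def solve_dual_qp(x, H):
    # Non-negatively constrained QP in the multipliers u (assumed form):
    #   min_{u >= 0}  0.5 (grad_f - J^T u)^T H (grad_f - J^T u) + g(x)^T u
    # solved here with a generic bound-constrained solver.
    gf, J, gv = grad_f(x), jac_g(x), g(x)
    def q(u):
        r = gf - J.T @ u
        return 0.5 * r @ H @ r + gv @ u
    def dq(u):
        r = gf - J.T @ u
        return -J @ H @ r + gv
    m = gv.size
    res = minimize(q, np.zeros(m), jac=dq, method="L-BFGS-B",
                   bounds=[(0.0, None)] * m)
    return res.x

def bfgs_inverse_update(H, s, y):
    # Standard inverse-form BFGS update of the approximate inverse Hessian;
    # skipped when the curvature condition s^T y > 0 fails.
    sy = s @ y
    if sy <= 1e-12:
        return H
    rho = 1.0 / sy
    V = np.eye(s.size) - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# The method is only locally convergent, so start near the solution with a
# reasonably scaled initial matrix (the true inverse Hessian here is ~0.19 I).
x, H = np.array([0.37, 0.93]), 0.2 * np.eye(2)
for k in range(30):
    u = solve_dual_qp(x, H)                # multiplier estimate from the dual QP
    step = -H @ grad_lagrangian(x, u)      # Newton-like step with that estimate
    y = grad_lagrangian(x + step, u) - grad_lagrangian(x, u)
    x, H = x + step, bfgs_inverse_update(H, step, y)
    if np.linalg.norm(step) < 1e-12:
        break
print("x* ~", x, " multipliers ~", u)
```

Started near the solution with a sensibly scaled H, the iterates and multiplier estimate should settle quickly onto the boundary point, consistent with the local superlinear behaviour claimed above; a globally convergent implementation would add a line search or other safeguard, which this sketch omits.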