Generalizations of the limited-memory BFGS method based on the quasi-product form of update

Two families of limited-memory variable metric (quasi-Newton) methods for unconstrained minimization, based on the quasi-product form of the update, are derived. For the first family, four variants of how to utilize the Strang recurrences for the Broyden class of variable metric updates are investigated; three of them use the same number of stored vectors as the limited-memory BFGS method. Moreover, one of the variants does not require any additional matrix-vector multiplication. The second family uses vectors from the preceding iteration to construct a new class of variable metric updates. The resulting methods again require neither additional matrix-vector multiplications nor additional stored vectors. Global convergence of four of the presented methods is established for convex, sufficiently smooth functions. Numerical results indicate that two of the new methods can save computational time substantially for certain problems.
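
For context, the sketch below outlines the standard Strang two-loop recurrence that underlies the limited-memory BFGS method, i.e. the baseline from which the variants above depart; it is not the paper's new update family. The function name lbfgs_two_loop, the use of NumPy, and the scaled-identity initial matrix gamma * I are illustrative assumptions.

```python
import numpy as np

def lbfgs_two_loop(grad, s_list, y_list, gamma):
    """Compute H_k @ grad via the Strang two-loop recurrence, where H_k is the
    L-BFGS inverse Hessian approximation built from the stored pairs (s_i, y_i)
    and the initial matrix gamma * I (a common, but not the only, choice)."""
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    q = grad.copy()
    alphas = []
    # First loop: run from the newest stored pair to the oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Apply the initial inverse Hessian approximation (scaled identity).
    r = gamma * q
    # Second loop: run from the oldest stored pair to the newest.
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return r
```

Each iteration of this recurrence costs O(mn) operations for m stored pairs in dimension n and involves no explicit matrix-vector multiplication with a dense n-by-n matrix; the variants discussed in the abstract aim to retain this storage and work profile while using the quasi-product form of the update.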