Nonmonotone conjugate gradient methods for optimization

In this paper, conjugate gradient methods with a nonmonotone line search technique are introduced. This new line search technique is based on a relaxation of the strong Wolfe conditions, and it allows larger steps to be accepted. The proposed conjugate gradient methods remain globally convergent and, at the same time, should not suffer from the propensity of some classical conjugate gradient methods to take short steps. Hence, these new methods should be able to tackle efficiently large-scale, highly nonlinear (and possibly ill-conditioned) problems.
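To make the idea concrete, the sketch below illustrates one common way such a relaxation can be realized: the sufficient-decrease part of the strong Wolfe conditions is tested against the maximum of the last M function values rather than the current one, in the style of Grippo, Lampariello, and Lucidi. The abstract does not specify the paper's exact relaxed conditions, so this is an illustrative assumption; likewise, the Fletcher-Reeves update, the bisection/expansion step search, and all parameter values (c1, c2, M) are hypothetical choices, not the authors' settings.

```python
# Minimal sketch: conjugate gradient with a nonmonotone line search.
# The acceptance test replaces f(x_k) in the sufficient-decrease
# condition with max of the last M function values (an assumed,
# GLL-style relaxation; the paper's conditions may differ).
import numpy as np

def nonmonotone_wolfe_search(f, grad, x, d, f_hist, c1=1e-4, c2=0.1,
                             alpha=1.0, max_iter=50):
    """Find a step satisfying relaxed (nonmonotone) strong Wolfe conditions."""
    f_ref = max(f_hist)          # reference value over recent iterates
    g0 = grad(x) @ d             # directional derivative at x (g0 < 0)
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        x_new = x + alpha * d
        f_new = f(x_new)
        g_new = grad(x_new) @ d
        if f_new > f_ref + c1 * alpha * g0:
            hi = alpha           # relaxed sufficient decrease fails: shorten
        elif abs(g_new) <= c2 * abs(g0):
            return alpha, f_new  # both relaxed strong Wolfe conditions hold
        elif g_new > 0:
            hi = alpha           # overshot a minimizer along d
        else:
            lo = alpha           # slope still steeply negative: go further
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * alpha
    return alpha, f(x + alpha * d)

def nonmonotone_cg(f, grad, x0, M=10, tol=1e-6, max_iter=1000):
    """Fletcher-Reeves CG driven by the nonmonotone line search above."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    f_hist = [f(x)]              # sliding window of recent function values
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, f_new = nonmonotone_wolfe_search(f, grad, x, d, f_hist)
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves update
        d = -g_new + beta * d
        if g_new @ d >= 0:                 # safeguard: restart on non-descent
            d = -g_new
        g = g_new
        f_hist.append(f_new)
        if len(f_hist) > M:
            f_hist.pop(0)                  # keep only the last M values
    return x

if __name__ == "__main__":
    # Usage example on the Rosenbrock function, a standard
    # highly nonlinear, ill-conditioned test problem.
    rosen = lambda x: 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2
    rosen_grad = lambda x: np.array([
        -400 * x[0] * (x[1] - x[0]**2) - 2 * (1 - x[0]),
        200 * (x[1] - x[0]**2)])
    print(nonmonotone_cg(rosen, rosen_grad, np.array([-1.2, 1.0])))
```

Because the reference value max(f_hist) is never smaller than the current function value, every step accepted by the classical strong Wolfe conditions is also accepted here, while longer trial steps that temporarily increase f may pass as well, which is the intended remedy for the short-step behavior described above.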