On the convergence of sequential minimization algorithms

This note discusses conditions for the convergence of algorithms that find the minimum of a function of several variables by solving a sequence of one-variable minimization problems. Theorems are given that contrast the weakest conditions for convergence of gradient-related algorithms with those for more general algorithms, including algorithms that minimize in turn along a sequence of uniformly linearly independent search directions.
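As a concrete illustration of the class of methods discussed, the sketch below performs sequential one-variable minimizations along the coordinate axes (the simplest choice of uniformly linearly independent search directions). This is not taken from the note itself: the function and parameter names are illustrative, and the inner line search is a plain golden-section search assumed to be applied to a unimodal restriction of the objective.

```python
def line_min(phi, lo=-10.0, hi=10.0, tol=1e-8):
    # Golden-section search for a minimizer of phi, assumed unimodal on [lo, hi].
    invphi = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while b - a > tol:
        if phi(c) < phi(d):
            # minimizer lies in [a, d]; shrink the bracket from the right
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            # minimizer lies in [c, b]; shrink the bracket from the left
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

def cyclic_coordinate_descent(f, x0, sweeps=50):
    # Minimize f by repeated exact one-variable minimizations,
    # cycling through the coordinate directions e_1, ..., e_n.
    x = list(x0)
    for _ in range(sweeps):
        for i in range(len(x)):
            def phi(t, i=i):
                y = x[:]
                y[i] = t
                return f(y)
            x[i] = line_min(phi)
    return x

# Example: a strictly convex quadratic with a cross term, for which
# cyclic minimization converges (analytic minimizer at (0.4, 0.4)).
f = lambda v: v[0]**2 + v[1]**2 + 0.5*v[0]*v[1] - v[0] - v[1]
print(cyclic_coordinate_descent(f, [5.0, -3.0]))
```

For smooth strictly convex functions this scheme converges, but, as the note's contrast between gradient-related and more general algorithms suggests, exact one-variable minimizations alone do not guarantee convergence to a minimizer for arbitrary objectives.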