On the accelerating property of an algorithm for function minimization without calculating derivatives

This paper studies the speed of convergence of a general algorithm for function minimization without calculating derivatives. The algorithm includes Powell's 1964 algorithm, as well as Zangwill's second modification of that procedure, as special cases. The main results are Theorems 3.1 and 4.1, which show that, if the algorithm behaves well, it asymptotically builds directions that are almost conjugate; consequently, its rate of convergence is superlinear at every iteration. The analysis builds on ideas of McCormick and Ritter, and of Powell.
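To fix ideas, the following Python fragment is a minimal sketch of a Powell-style direction-set iteration: each sweep performs a derivative-free line minimization along every current direction and then replaces one direction with the overall displacement of the sweep, which is how approximately conjugate directions are accumulated. This is only an illustration of the basic mechanism, not the algorithm analyzed in the paper; Powell's 1964 method and Zangwill's modification add further tests (omitted here) that guard against the direction set becoming linearly dependent. The function `powell_like` and its parameters are hypothetical names chosen for this sketch.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def powell_like(f, x0, n_sweeps=50, tol=1e-10):
    """Simplified direction-set iteration in the spirit of Powell (1964).

    Not the paper's algorithm: the acceptance tests of Powell and
    Zangwill that prevent linear dependence of the directions are omitted.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    dirs = np.eye(n)                      # start from the coordinate directions
    f_x = f(x)
    for _ in range(n_sweeps):
        x_start, f_start = x.copy(), f_x
        for d in dirs:
            # derivative-free line minimization along direction d
            res = minimize_scalar(lambda a: f(x + a * d))
            x, f_x = x + res.x * d, res.fun
        # replace the first direction with the normalized sweep displacement,
        # the step through which (near-)conjugate directions are built up
        new_dir = x - x_start
        norm = np.linalg.norm(new_dir)
        if norm > 0:
            dirs = np.vstack([dirs[1:], new_dir / norm])
        if f_start - f_x < tol:           # negligible progress over a full sweep
            break
    return x, f_x

# Example usage on the Rosenbrock function
rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
x_min, f_min = powell_like(rosen, np.array([-1.2, 1.0]))
```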