Rate of convergence of several conjugate gradient algorithms.

Conjugate gradient algorithms are used to minimize nonlinear, nonquadratic, real-valued functions on $\mathbb{R}^n$. Rates of convergence are established for several of these algorithms in which the conjugate direction is reinitialized every $r$ steps, where $r \ge n$. It is shown that, in a neighborhood of the minimum, the error measured from a point of reinitialization decreases with order $2$ after $n$ steps: if $x_k$ is a point of reinitialization and $x^*$ the minimizer, then $\|x_{k+n} - x^*\| \le C\,\|x_k - x^*\|^2$ for some constant $C$; that is, the methods are $n$-step quadratically convergent.
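To make the restart mechanism analyzed here concrete, the following is a minimal Python sketch of a nonlinear conjugate gradient iteration that reinitializes the conjugate direction every $r$ steps. The Fletcher–Reeves coefficient, the Armijo backtracking line search, and the names `restarted_cg` and `_backtracking` are illustrative assumptions: the abstract covers several CG variants, and the convergence analysis presumes an exact line search rather than backtracking.

```python
import numpy as np

def restarted_cg(f, grad, x0, r, tol=1e-8, max_iter=500):
    """Nonlinear CG with the conjugate direction reinitialized every r steps.

    Sketch only: uses the Fletcher-Reeves coefficient, one of several CG
    variants; the analysis assumes an exact line search, approximated here
    by Armijo backtracking.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # start (and restart) from the steepest-descent direction
    for k in range(1, max_iter + 1):
        if np.linalg.norm(g) < tol:
            break
        alpha = _backtracking(f, x, d, f(x), g @ d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        if k % r == 0:
            d = -g_new  # reinitialization: discard conjugacy information
        else:
            beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves formula
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x

def _backtracking(f, x, d, fx, gd, alpha=1.0, rho=0.5, c=1e-4):
    # Armijo backtracking: shrink alpha until sufficient decrease holds.
    while f(x + alpha * d) > fx + c * alpha * gd and alpha > 1e-12:
        alpha *= rho
    return alpha
```

Calling, say, `restarted_cg(f, grad, x0, r=len(x0))` restarts every $n$ steps; the iterates $x_0, x_r, x_{2r}, \dots$ at these restarts are the "points of reinitialization" between which the order-2 error decrease is measured.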