Alternative proofs of the convergence properties of the conjugate-gradient method

For the problem of minimizing an unconstrained function, the conjugate-gradient method is shown to be convergent. If the function is uniformly strictly convex, the ultimate rate of convergence is shown to be n-step superlinear. If the Hessian matrix is Lipschitz continuous, the rate of convergence is shown to be n-step quadratic. All results are obtained for the reset version of the method and with a relaxed requirement on the solution of the stepsize problem. In addition to obtaining sharper results, the paper differs from previously published ones in its mode of proof, which yields as a corollary the finiteness of the conjugate-gradient method applied to a quadratic problem, rather than assuming that result.
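
To make the setting concrete, here is a minimal sketch of a reset conjugate-gradient iteration in Python. The abstract does not fix a particular update formula or stepsize rule, so the Fletcher-Reeves choice of beta, the exact line search via scipy's minimize_scalar, and the helper name reset_cg are assumptions for illustration only; the paper itself requires only an approximate solution of the stepsize problem.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def reset_cg(f, grad, x, n, tol=1e-10, max_iter=1000):
    """Conjugate-gradient method with a reset to the steepest-descent
    direction every n iterations (illustrative sketch, not the paper's
    exact algorithm)."""
    g = grad(x)
    d = -g
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Stepsize: minimize f along d. The paper only requires an
        # approximate solution of this one-dimensional problem.
        alpha = minimize_scalar(lambda a: f(x + a * d)).x
        x = x + alpha * d
        g_new = grad(x)
        if (k + 1) % n == 0:
            d = -g_new                            # reset direction
        else:
            beta = (g_new @ g_new) / (g @ g)      # Fletcher-Reeves choice
            d = -g_new + beta * d
        g = g_new
    return x

# On a strictly convex quadratic in R^2 the iterates reach the
# minimizer in at most n = 2 steps (up to rounding error), which is
# the finiteness result the paper recovers as a corollary.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
print(reset_cg(f, grad, np.zeros(2), n=2))   # ~ np.linalg.solve(A, b)
```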