Simple examples for the failure of Newton’s method with line search for strictly convex minimization

In this paper two simple examples of a twice continuously differentiable strictly convex function $f$ are presented for which Newton's method with line search converges to a point where the gradient of $f$ is not zero. The first example uses a line search based on the Wolfe conditions. For the second example, a strictly convex function $f$ is defined, together with a sequence of descent directions for which exact line searches do not converge to the minimizer of $f$. Then $f$ is perturbed so that these search directions coincide with the Newton directions for the perturbed function, while the exact line searches remain unchanged.
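For context, the standard scheme whose failure is analyzed here is Newton's method combined with a line search. The counterexamples in the paper are intricate; the sketch below is only a generic illustration of the method on a well-behaved strictly convex quadratic, where it does converge (a backtracking Armijo line search is used in place of a full Wolfe line search, and the quadratic test problem is an assumption for illustration):

```python
import numpy as np

def newton_with_line_search(f, grad, hess, x0, tol=1e-10, max_iter=50):
    """Newton's method with a backtracking (Armijo) line search.

    A generic sketch, not the paper's counterexample: on a smooth,
    strongly convex problem such as the quadratic below it converges,
    whereas the paper constructs special functions f for which the
    iterates converge to a non-stationary point.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)  # Newton direction
        t, c, rho = 1.0, 1e-4, 0.5        # Armijo parameters (assumed values)
        while f(x + t * d) > f(x) + c * t * g.dot(d):
            t *= rho                       # backtrack until sufficient decrease
        x = x + t * d
    return x

# Strictly convex quadratic f(x) = 0.5 x^T A x - b^T x with A positive definite
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
hess = lambda x: A

x_star = newton_with_line_search(f, grad, hess, np.array([5.0, 5.0]))
```

On a quadratic, the full Newton step $t = 1$ already satisfies the Armijo condition, so the iteration reaches the unique minimizer $A^{-1}b$ in a single step; the paper's examples show that no such guarantee extends to all strictly convex $C^2$ functions.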