Computing Forward-Difference Intervals for Numerical Optimization

When minimizing a smooth nonlinear function whose derivatives are not available, a popular approach is to use a gradient method with a finite-difference approximation substituted for the exact gradient. In order for such a method to be effective, it must be possible to compute “good” derivative approximations without requiring a large number of function evaluations. Certain “standard” choices for the finite-difference interval may lead to poor derivative approximations for badly scaled problems. We present an algorithm for computing a set of intervals to be used in a forward-difference approximation of the gradient.
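To make the setting concrete, here is a minimal sketch of a forward-difference gradient approximation with per-component intervals. The default interval rule shown, h_i = sqrt(eps) * (1 + |x_i|), is one of the "standard" heuristics alluded to above, not the interval-selection algorithm presented in this paper; the function name and the badly scaled example are illustrative assumptions.

```python
import math
import sys

def forward_difference_gradient(f, x, h=None):
    """One-sided (forward) finite-difference approximation to the gradient:
    g_i ~= (f(x + h_i * e_i) - f(x)) / h_i, using n + 1 function evaluations."""
    n = len(x)
    if h is None:
        # Illustrative default only -- a common "standard" heuristic, NOT the
        # interval-selection algorithm of this paper.  It roughly balances
        # truncation and rounding error for well-scaled functions, but can
        # give poor approximations on badly scaled problems.
        sqrt_eps = math.sqrt(sys.float_info.epsilon)
        h = [sqrt_eps * (1.0 + abs(xi)) for xi in x]
    fx = f(x)
    g = []
    for i in range(n):
        xp = list(x)
        xp[i] += h[i]            # perturb only the i-th coordinate
        g.append((f(xp) - fx) / h[i])
    return g

# A badly scaled quadratic: f(x) = x0^2 + 1e6 * x1^2,
# whose exact gradient at (1, 1) is (2, 2e6).
f = lambda x: x[0] ** 2 + 1e6 * x[1] ** 2
grad = forward_difference_gradient(f, [1.0, 1.0])
```

Each gradient estimate costs n + 1 evaluations of f; the quality of the estimate depends entirely on how the intervals h_i are chosen, which is the problem the paper's algorithm addresses.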