For minimizing a scalar-valued function, we develop and investigate a family of gradient transformation differential equation algorithms. This family includes, as special cases, steepest descent, Newton's method, the Levenberg-Marquardt method, and a gradient-enhanced Newton algorithm that we develop. Using Rosenbrock's "banana" function, we study the stiffness of the gradient transformation family in terms of Lyapunov exponent time histories. For the example function, Newton's method and the Levenberg-Marquardt modification do not yield global asymptotic stability, whereas steepest descent does. However, Newton's method (started from an initial point where it does work) is not stiff and is approximately 100 times as fast as steepest descent. In contrast, the gradient-enhanced Newton method is globally convergent, is not stiff, and is approximately 25 times faster than Newton's method and approximately 2500 times faster than steepest descent.
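The abstract does not spell out the family's explicit form; a common trajectory-following formulation, assumed here, writes each member as the ODE dx/dt = -T(x) ∇f(x), where T = I gives the steepest descent flow, T = H(x)⁻¹ the Newton flow, and T = (H(x) + μI)⁻¹ the Levenberg-Marquardt flow. The sketch below is not the paper's code; the function names and the value μ = 1 are illustrative. It integrates these three members of the family on Rosenbrock's function from the standard starting point (-1.2, 1):

```python
# A minimal sketch (assumed form, not the paper's code) of the gradient
# transformation ODE family dx/dt = -T(x) * grad f(x) applied to
# Rosenbrock's f(x) = 100*(x2 - x1**2)**2 + (1 - x1)**2, minimum at (1, 1).
import numpy as np
from scipy.integrate import solve_ivp

def grad(x):
    x1, x2 = x
    return np.array([-400.0 * x1 * (x2 - x1**2) - 2.0 * (1.0 - x1),
                     200.0 * (x2 - x1**2)])

def hess(x):
    x1, x2 = x
    return np.array([[1200.0 * x1**2 - 400.0 * x2 + 2.0, -400.0 * x1],
                     [-400.0 * x1, 200.0]])

MU = 1.0  # illustrative Levenberg-Marquardt parameter (not from the paper)

# Each member of the family is defined by its transformation matrix T(x).
transforms = {
    "steepest descent":    lambda x: np.eye(2),
    "Newton":              lambda x: np.linalg.inv(hess(x)),
    "Levenberg-Marquardt": lambda x: np.linalg.inv(hess(x) + MU * np.eye(2)),
}

x0 = np.array([-1.2, 1.0])  # standard Rosenbrock starting point
for name, T in transforms.items():
    # LSODA switches automatically between stiff and non-stiff modes.
    sol = solve_ivp(lambda t, x: -T(x) @ grad(x), (0.0, 50.0), x0,
                    method="LSODA", rtol=1e-8, atol=1e-10)
    print(f"{name:>20s}: x(50) = {sol.y[:, -1]}, "
          f"|grad| = {np.linalg.norm(grad(sol.y[:, -1])):.2e}")
```

A stiff-capable integrator (LSODA) is used because the abstract reports that some members of the family, notably steepest descent, are stiff; per the abstract, convergence of the Newton and Levenberg-Marquardt flows also depends on the starting point, so other initial conditions may fail.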