All unconstrained and many constrained optimization problems involve line searches, i.e., minimizing the value of a certain function along a properly chosen direction. There are several methods for performing such one-dimensional optimization, but all of them require that the function be unimodal along the search interval. That may force small step sizes and, in any case, convergence to the closest local optimum. For multimodal functions, a line search along any direction is likely to encounter multiple valleys. We propose a Genetic Line Search with scalar-coded individuals, convex linear combination crossover, and niche formation. Computational experience shows that this approach is more robust with respect to the starting point and that fewer line searches are usually required.
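As a rough illustration, here is a minimal Python sketch of a genetic line search of this kind: scalar-coded individuals (candidate step lengths), convex linear combination crossover, and fitness sharing as a simple form of niche formation. The selection scheme, sharing function, mutation, and all parameter names and defaults are assumptions for illustration, not the authors' implementation.

```python
import random

def genetic_line_search(f, x, d, alpha_max=10.0, pop_size=30, generations=40,
                        sigma_share=0.5, mutation_rate=0.1, seed=None):
    """Approximately minimize phi(alpha) = f(x + alpha * d) for alpha in [0, alpha_max]."""
    rng = random.Random(seed)
    phi = lambda a: f([xi + a * di for xi, di in zip(x, d)])

    # Scalar-coded individuals: each individual is just a step length alpha.
    pop = [rng.uniform(0.0, alpha_max) for _ in range(pop_size)]

    for _ in range(generations):
        raw = [phi(a) for a in pop]
        worst = max(raw)
        fit = [worst - r + 1e-12 for r in raw]  # convert cost to a fitness to maximize

        # Niche formation via fitness sharing: individuals in crowded regions are penalized.
        shared = []
        for i, ai in enumerate(pop):
            niche = sum(max(0.0, 1.0 - abs(ai - aj) / sigma_share) for aj in pop)
            shared.append(fit[i] / niche)

        # Roulette-wheel selection on shared fitness (an assumed selection scheme).
        total = sum(shared)
        def select():
            r, acc = rng.uniform(0.0, total), 0.0
            for a, s in zip(pop, shared):
                acc += s
                if acc >= r:
                    return a
            return pop[-1]

        # Convex linear combination crossover plus a small Gaussian mutation.
        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            lam = rng.random()
            child = lam * p1 + (1.0 - lam) * p2
            if rng.random() < mutation_rate:
                child += rng.gauss(0.0, 0.05 * alpha_max)
            children.append(min(max(child, 0.0), alpha_max))
        pop = children

    return min(pop, key=phi)
```

Because the population spreads over the whole search interval and sharing keeps several valleys populated, such a search does not require unimodality of the function along the direction, which is the property the abstract contrasts with classical one-dimensional methods.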