Constrained genetic algorithms and their applications in nonlinear constrained optimization

The paper presents a problem-independent framework that unifies various mechanisms for solving discrete constrained nonlinear programming (NLP) problems whose functions are not necessarily differentiable or continuous. The framework is based on the first-order necessary and sufficient conditions in the theory of discrete constrained optimization using Lagrange multipliers. It implements the search for discrete-neighborhood saddle points (SP_dn) by performing descents in the original-variable subspace and ascents in the Lagrange-multiplier subspace. Our study of the various mechanisms shows that CSAGA, a combined constrained simulated annealing and genetic algorithm, performs well. Finally, we apply iterative deepening to determine the optimal number of generations in CSAGA.
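As a rough illustration of the saddle-point search described above, the following is a minimal Python sketch of a discrete-neighborhood Lagrangian search that alternates descents in the original-variable subspace with ascents in the Lagrange-multiplier subspace. The function names, update rules, and stopping test are illustrative assumptions, not the paper's exact CSAGA procedure.

```python
import random

def discrete_saddle_point_search(f, g, neighbors, x0, max_iters=10000, step=1.0):
    """Sketch of a discrete-neighborhood saddle-point (SP_dn) search.

    f(x)      -- objective to minimize (no differentiability assumed)
    g(x)      -- list of constraint values, with g_i(x) <= 0 meaning satisfied
    neighbors -- function returning the discrete neighborhood of a point x
    All names and update rules here are illustrative assumptions.
    """
    x = x0
    lam = [0.0] * len(g(x0))

    def lagrangian(point, multipliers):
        # Discrete Lagrangian: objective plus multiplier-weighted violations.
        return f(point) + sum(
            l * max(gi, 0.0) for l, gi in zip(multipliers, g(point))
        )

    for _ in range(max_iters):
        # Descent in the original-variable subspace: accept a neighbor
        # that does not increase the Lagrangian.
        candidate = random.choice(neighbors(x))
        if lagrangian(candidate, lam) <= lagrangian(x, lam):
            x = candidate

        # Ascent in the Lagrange-multiplier subspace: raise multipliers
        # on violated constraints.
        lam = [l + step * max(gi, 0.0) for l, gi in zip(lam, g(x))]

        if all(gi <= 0.0 for gi in g(x)):
            # Feasible point reached; a full implementation would also test
            # the local-minimum condition in x before stopping.
            break

    return x, lam
```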
