Optimal Anytime Constrained Simulated Annealing for Constrained Global Optimization

In this paper we propose an optimal anytime version of constrained simulated annealing (CSA) for solving constrained nonlinear programming problems (NLPs). The first goal of the algorithm is to generate feasible solutions of a prescribed quality in an average time of the same order of magnitude as that spent by the original CSA, run with an optimal cooling schedule, in generating a solution of similar quality. Here, an optimal cooling schedule is one that leads to the shortest average total number of probes when the original CSA under that schedule is run repeatedly until it finds a solution. Our second goal is to design an anytime version of CSA that generates gradually improving feasible solutions as more time is spent, eventually finding a constrained global minimum (CGM). In our study, we have observed a monotonically non-decreasing relation between the success probability of obtaining a solution and the average completion time of CSA, and an exponential relation between the objective target that CSA seeks and the average completion time. Based on these observations, we have designed CSAAT-ID, an anytime CSA with iterative deepening that schedules multiple runs of CSA using a set of increasing cooling schedules and a set of improving objective targets. We then prove the optimality of our schedules and demonstrate the results experimentally on four continuous constrained NLPs. CSAAT-ID can be generalized to discrete, continuous, and mixed-integer NLPs, since CSA is applicable to problems in all three classes. Our approach can also be generalized to other stochastic search algorithms, such as genetic algorithms, and be used to determine the optimal time for each run of such algorithms.
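The iterative-deepening scheduling idea described above can be illustrated with a minimal sketch. This is not the paper's implementation: the toy NLP (minimize f(x) = x^2 subject to 1 - x <= 0), the quadratic penalty in place of the paper's Lagrangian formulation, the geometric cooling rate, the doubling budget, and the names `csa_run` and `csa_at_id` are all illustrative assumptions.

```python
import math
import random

def csa_run(n_probes, target, rng):
    """One annealing run on a toy constrained NLP (a sketch, not the paper's CSA):
    minimize f(x) = x^2  subject to  g(x) = 1 - x <= 0  (CGM at x = 1, f = 1)."""
    def penalized(x):
        # Quadratic penalty stands in for the Lagrangian used in CSA (assumption).
        return x * x + 100.0 * max(0.0, 1.0 - x) ** 2

    x = rng.uniform(-5.0, 5.0)
    best, best_f = None, float("inf")
    for k in range(n_probes):
        T = max(1e-4, 0.99 ** k)                    # geometric cooling (illustrative)
        cand = x + rng.gauss(0.0, math.sqrt(T))     # temperature-scaled move
        delta = penalized(cand) - penalized(x)
        if delta <= 0 or rng.random() < math.exp(-delta / T):
            x = cand
        if 1.0 - x <= 1e-3 and x * x < best_f:      # feasible (within tolerance) and improved
            best, best_f = x, x * x
        if best_f <= target:                        # prescribed quality reached: stop early
            break
    return best, best_f

def csa_at_id(total_budget=200000, seed=0):
    """Anytime iterative deepening: double the probe budget each round,
    tighten the objective target, and keep the best feasible solution so far."""
    rng = random.Random(seed)
    n, spent = 1000, 0
    incumbent, incumbent_f = None, float("inf")
    while spent + n <= total_budget:
        # Improving target: demand a 1% better objective than the incumbent
        # (assumes a positive objective; purely illustrative).
        target = incumbent_f * 0.99 if incumbent is not None else -math.inf
        sol, f = csa_run(n, target, rng)
        spent += n
        if sol is not None and f < incumbent_f:
            incumbent, incumbent_f = sol, f
        n *= 2                                      # iterative deepening of the schedule
    return incumbent, incumbent_f
```

Doubling the budget keeps the total work within a constant factor of the final (longest) run, which is the mechanism behind the average-time optimality claim; the incumbent makes the procedure anytime, since interrupting it at any point yields the best feasible solution found so far.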
