A hybrid of simplex method and simulated annealing

Abstract One of the basic concepts of the well-known simplex optimization method is that a new point, the reflection, is constructed from the current simplex set of points (solutions). The reflection point is then used to conditionally update the simplex set. This simple and efficient idea is applied in simulated annealing to suggest a new version of this stochastic optimization method. A forerunner of the presented simulated annealing is the controlled random search invented by Price in the mid-seventies. He proposed the important idea of keeping a population of points from which the simplex set is randomly selected. Reflection points update the population by conditionally replacing the points with the highest values of the objective function. The simplex simulated annealing further strengthens the stochastic and evolutionary character of this method: the construction of reflection points is randomized, and their return to the population is decided by the Metropolis criterion. A parallel version of simplex simulated annealing decomposes the whole population into disjoint subpopulations on which independent simulated annealings are performed. The subpopulations randomly interact so that the best points of two subpopulations are exchanged and the worst ones are eliminated.
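
To make the population-based scheme concrete, the following is a minimal Python sketch of one plausible reading of the algorithm described above: a simplex is drawn at random from the population, a randomized reflection of its worst vertex through the centroid of the remaining vertices is built, and the reflection conditionally replaces the worst population member under the Metropolis criterion. All names and parameter values (simplex_simulated_annealing, pop_size, alpha, noise, cooling, etc.) are illustrative assumptions, not the authors' implementation.

```python
import math
import random


def simplex_simulated_annealing(f, dim, pop_size=30, simplex_size=None,
                                iters=5000, t0=1.0, cooling=0.999,
                                lo=-5.0, hi=5.0, alpha=1.0, noise=0.3):
    """Sketch of a population-based simplex simulated annealing (assumed form)."""
    simplex_size = simplex_size or dim + 1
    # Initial population of random points in the box [lo, hi]^dim.
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    temp = t0

    for _ in range(iters):
        # Randomly select a simplex (subset of points) from the population.
        simplex = random.sample(pop, simplex_size)
        simplex.sort(key=f)
        worst = simplex[-1]

        # Centroid of all simplex vertices except the worst one.
        centroid = [sum(p[i] for p in simplex[:-1]) / (simplex_size - 1)
                    for i in range(dim)]

        # Randomized reflection of the worst vertex through the centroid.
        reflection = [c + alpha * (c - w) + random.gauss(0.0, noise)
                      for c, w in zip(centroid, worst)]

        # Conditionally replace the worst population member (Metropolis criterion).
        pop.sort(key=f)
        delta = f(reflection) - f(pop[-1])
        if delta < 0 or random.random() < math.exp(-delta / temp):
            pop[-1] = reflection

        temp *= cooling  # geometric cooling schedule

    return min(pop, key=f)


if __name__ == "__main__":
    # Usage example: minimize a simple sphere function in 3 dimensions.
    sphere = lambda x: sum(v * v for v in x)
    best = simplex_simulated_annealing(sphere, dim=3)
    print(best, sphere(best))
```

The parallel variant described in the abstract would run several independent copies of this loop on disjoint subpopulations and occasionally exchange their best points while eliminating the worst; that migration step is not shown in the sketch.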
