The Reactive Affine Shaker: a Building Block for Minimizing Functions of Continuous Variables

A novel adaptive random search algorithm for optimizing functions of continuous variables is presented. The scheme makes no assumptions about the function to be optimized beyond the availability of evaluations f(x) at selected test points. Assuming that the dominant computational cost lies in the function evaluations, the main design criterion of the Reactive Affine Shaker (RASH) scheme is the adaptation of a search region through an affine transformation that incorporates the local knowledge derived from trial points generated with uniform probability. The aim is to scout for local minima in the attraction basin where the initial point falls, adapting the step size and direction so as to heuristically maintain the largest possible movement per function evaluation. The design is complemented by an analysis of some strategic choices (such as the double-shot strategy and the initialization) and by experimental results showing that, despite its simplicity, RASH is a promising building block for the development of more complex optimization algorithms. The accompanying software is built to facilitate scientific experimentation and the integration of RASH as a component in more complex schemes.
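The mechanism described above can be illustrated with a simplified sketch. This is not the authors' exact RASH implementation: the search region here is an axis-aligned box rescaled per coordinate rather than a fully affine-transformed one, and the expansion/shrink factors are illustrative assumptions. It does show the two ingredients the abstract names: uniform sampling of trial points in an adaptive region, and the double-shot strategy of retrying the mirrored displacement before counting a failure.

```python
import random

def shaker_minimize(f, x0, n_iter=10000, expand=1.2, shrink=0.8, seed=0):
    """Simplified affine-shaker-style random search (a sketch, not the
    published RASH algorithm): the region adapts per coordinate only."""
    rng = random.Random(seed)
    x = list(x0)
    fx = f(x)
    step = [1.0] * len(x)  # per-coordinate half-widths of the search box
    for _ in range(n_iter):
        # Draw a displacement uniformly from the current box.
        d = [rng.uniform(-s, s) for s in step]
        cand = [xi + di for xi, di in zip(x, d)]
        fc = f(cand)
        if fc >= fx:
            # Double-shot strategy: try the mirrored displacement
            # before declaring the trial a failure.
            cand = [xi - di for xi, di in zip(x, d)]
            fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
            step = [s * expand for s in step]  # success: enlarge the region
        else:
            step = [s * shrink for s in step]  # failure: shrink it
    return x, fx

# Usage: minimize a quadratic with minimum at (1, -2).
xmin, fmin = shaker_minimize(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2,
                             [5.0, 5.0])
```

The adaptation rule embodies the stated design goal: enlarging the region after a success and shrinking it after a failed double shot keeps the expected movement per function evaluation as large as the local landscape allows.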
