Local Search

At an abstract level, memetic algorithms can be seen as a broad class of population-based stochastic local search (SLS) methods whose central theme is that of "exploiting all available knowledge about a problem"; see also Moscato and Cotta [35], page 105. Probably the most widespread implementation of this theme is to improve some or all individuals in the population by means of a local search method. This combination of population-based global search and single-solution local search is very appealing: the evolutionary part of a memetic algorithm takes care of exploration, trying to identify the most promising regions of the search space, while the local search part scrutinizes the surroundings of a given initial solution, thereby exploiting that region.

This idea is not only appealing, it is also very successful in practice. For the vast majority of combinatorial optimization problems and, as recent research makes increasingly clear, for many continuous optimization problems as well, this combination yields some of the best-performing heuristic optimization algorithms. The role of the local search is fundamental: the choice of its search rule and its harmonization with the global search scheme largely determine the overall success of a memetic framework. The local search can be integrated within the evolutionary cycle in two main ways. The first is the so-called "life-time learning".
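
To make this division of labor concrete, the sketch below pairs a plain evolutionary loop (exploration) with a coordinate-wise stochastic hill climber applied to every offspring (exploitation). It is a minimal illustration only, not an implementation from the works cited here: the objective function, the operators, and all parameter values (population size, step size, and so on) are placeholder choices.

```python
import random

# Minimal memetic-algorithm sketch: an evolutionary loop (exploration)
# in which every offspring is refined by a stochastic hill climber
# (exploitation) before survivor selection. All names and parameter
# values here are illustrative placeholders.

def sphere(x):
    """Stand-in objective to minimize; any black-box fitness would do."""
    return sum(xi * xi for xi in x)

def hill_climb(x, f, steps=50, step_size=0.1):
    """Local search: perturb one coordinate at a time, keep improvements."""
    best, best_f = list(x), f(x)
    for _ in range(steps):
        cand = list(best)
        i = random.randrange(len(cand))
        cand[i] += random.uniform(-step_size, step_size)
        cand_f = f(cand)
        if cand_f < best_f:
            best, best_f = cand, cand_f
    return best

def memetic_algorithm(f, dim=5, pop_size=20, generations=50):
    # Global search part: random initialization, crossover, mutation.
    pop = [[random.uniform(-5.0, 5.0) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        offspring = []
        for _ in range(pop_size):
            p1, p2 = random.sample(pop, 2)
            child = [random.choice(pair) for pair in zip(p1, p2)]  # uniform crossover
            child = [g + random.gauss(0.0, 0.05) for g in child]   # Gaussian mutation
            # Lamarckian life-time learning: the locally refined genotype
            # is written back into the offspring that enters selection.
            offspring.append(hill_climb(child, f))
        # (mu + lambda) survivor selection on the combined pool.
        pop = sorted(pop + offspring, key=f)[:pop_size]
    return min(pop, key=f)

if __name__ == "__main__":
    best = memetic_algorithm(sphere)
    print("best value found:", sphere(best))
```

Writing the refined genotype back into the population, as done above, is the Lamarckian form of life-time learning; in a Baldwinian variant the refined solution would only contribute its improved fitness value, while the genotype itself stays unchanged.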

[1] Pablo Moscato, et al. A Gentle Introduction to Memetic Algorithms, 2003, Handbook of Metaheuristics.

[2] Ferrante Neri, et al. An Adaptive Multimeme Algorithm for Designing HIV Multidrug Therapies, 2007, IEEE/ACM Transactions on Computational Biology and Bioinformatics.

[3] John A. Nelder, et al. A Simplex Method for Function Minimization, 1965, Comput. J.

[4] Hector J. Levesque, et al. A New Method for Solving Hard Satisfiability Problems, 1992, AAAI.

[5] Francisco Herrera, et al. Study of the Influence of the Local Search Method in Memetic Algorithms for Large Scale Continuous Optimization Problems, 2009, LION.

[6] G. W. Stewart, et al. A Modification of Davidon's Minimization Method to Accept Difference Approximations of Derivatives, 1967, JACM.

[7] James C. Spall, et al. Introduction to Stochastic Search and Optimization: Estimation, Simulation, and Control, 2003, Wiley-Interscience Series in Discrete Mathematics and Optimization.

[8] Carlos García-Martínez, et al. Memetic Algorithms for Continuous Optimisation Based on Local Search Chains, 2010, Evolutionary Computation.

[9] Francisco J. Solis, et al. Minimization by Random Search Techniques, 1981, Mathematics of Operations Research.

[10] Jean-Michel Renders, et al. Hybridizing Genetic Algorithms with Hill-Climbing Methods for Global Optimization: Two Possible Ways, 1994, Proceedings of the First IEEE Conference on Evolutionary Computation, IEEE World Congress on Computational Intelligence.

[11] William H. Press, et al. Numerical Recipes in C, 2002.

[12] Nicholas J. Radcliffe, et al. The Algebra of Genetic Algorithms, 1994, Annals of Mathematics and Artificial Intelligence.

[13] Günther R. Raidl, et al. Finding Consensus Trees by Evolutionary, Variable Neighborhood Search, and Hybrid Algorithms, 2008, GECCO '08.

[14] Andy J. Keane, et al. Meta-Lamarckian Learning in Memetic Algorithms, 2004, IEEE Transactions on Evolutionary Computation.

[15] J. Basterrechea, et al. Comparison of Different Heuristic Optimization Methods for Near-Field Antenna Measurements, 2007, IEEE Transactions on Antennas and Propagation.

[16] Yew-Soon Ong, et al. Hybrid Evolutionary Algorithm with Hermite Radial Basis Function Interpolants for Computationally Expensive Adjoint Solvers, 2008, Comput. Optim. Appl.

[17] Ya-Xiang Yuan, et al. On the Truncated Conjugate Gradient Method, 2000, Math. Program.

[18] Thomas Stützle, et al. Incremental Particle Swarm-Guided Local Search for Continuous Optimization, 2008, Hybrid Metaheuristics.

[19] M. Powell. The NEWUOA Software for Unconstrained Optimization Without Derivatives, 2006.

[20] M. J. D. Powell, et al. An Efficient Method for Finding the Minimum of a Function of Several Variables Without Calculating Derivatives, 1964, Comput. J.

[21] Kevin Kok Wai Wong, et al. Classification of Adaptive Memetic Algorithms: A Comparative Study, 2006, IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics).

[22] H. Szu. Fast Simulated Annealing, 1987.

[23] Yoel Tenne, et al. A Memetic Algorithm Using a Trust-Region Derivative-Free Optimization with Quadratic Modelling for Optimization of Expensive and Noisy Black-box Functions, 2007, Evolutionary Computation in Dynamic and Uncertain Environments.

[24] G. Reinelt. The Traveling Salesman: Computational Solutions for TSP Applications, 1994.

[25] Christian L. Müller, et al. Particle Swarm CMA Evolution Strategy for the Optimization of Multi-Funnel Landscapes, 2009, 2009 IEEE Congress on Evolutionary Computation.

[26] Kok Wai Wong. Surrogate-Assisted Evolutionary Optimization Frameworks for High-Fidelity Engineering Design Problems, 2005.

[27] Hugues Bersini, et al. A New GA-Local Search Hybrid for Continuous Optimization Based on Multi-Level Single Linkage Clustering, 2000, GECCO.

[28] Hans-Paul Schwefel. Evolution and Optimum Seeking, 1995, Sixth-Generation Computer Technology Series.

[29] Michael W. Trosset. I Know It When I See It: Toward a Definition of Direct Search Methods, 1996.

[30] J. Ford. Hybrid Estimation of Distribution Algorithm for Global Optimization, 2004.