Approaches to selection and their effect on fitness modelling in an Estimation of Distribution Algorithm

Selection is one of the defining characteristics of an evolutionary algorithm, yet inherent in the selection process is the loss of information from the population. Poor solutions may still provide information about how to bias the search toward good solutions. Many estimation of distribution algorithms (EDAs) use truncation selection, which discards all solutions below a certain fitness and thus loses this information. Our previous work on distribution estimation using Markov networks (DEUM) described an EDA which constructs a model of the fitness function; a unique feature of this approach is that, because selective pressure is built into the model itself, selection becomes optional. This paper outlines a series of experiments which make use of this property to examine the effects of selection on the population. We look at the impact of selecting only highly fit solutions, only poor solutions, a mixture of highly fit and poor solutions, and abandoning selection altogether. We show that in some circumstances, particularly where some information about the problem is already known, selecting only the fittest solutions is suboptimal.
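
To make the contrast concrete, the sketch below illustrates the kind of model a DEUM-style EDA builds: a univariate least-squares fit of the fitness function over bitstrings, which can be estimated either from the whole population (no selection) or from a truncated subset. This is a minimal, assumption-laden illustration only: the OneMax stand-in fitness, the sigmoid sampling rule, and the temperature parameter are illustrative choices and are not taken from the paper.

```python
import numpy as np

def onemax(x):
    # Stand-in fitness function (assumption): count of ones in the bitstring.
    return x.sum()

def fit_fitness_model(pop, fitness):
    # Least-squares fit of a univariate model f(x) ~ a0 + sum_i a_i * s_i,
    # with s_i in {-1, +1}; this mirrors the DEUM idea of modelling the
    # fitness function itself rather than the distribution of selected points.
    spins = 2 * pop - 1                              # map {0,1} -> {-1,+1}
    design = np.hstack([np.ones((len(pop), 1)), spins])
    coeffs, *_ = np.linalg.lstsq(design, fitness, rcond=None)
    return coeffs[1:]                                # per-bit weights a_i

def sample_population(weights, pop_size, temperature=1.0, rng=None):
    # Sample each bit independently; a larger weight raises P(bit = 1).
    # The sigmoid/temperature rule is an illustrative sampler, not the paper's.
    rng = rng or np.random.default_rng()
    p_one = 1.0 / (1.0 + np.exp(-2.0 * weights / temperature))
    return (rng.random((pop_size, len(weights))) < p_one).astype(int)

def deum_univariate(n_bits=30, pop_size=100, generations=20,
                    truncation=None, seed=0):
    # truncation=None fits the model on the whole population (no selection);
    # truncation=0.5 keeps the top half before fitting, and so on.
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, size=(pop_size, n_bits))
    best = -np.inf
    for _ in range(generations):
        fit = np.array([onemax(x) for x in pop], dtype=float)
        best = max(best, fit.max())
        if truncation is not None:
            keep = np.argsort(fit)[-int(truncation * pop_size):]
            model_pop, model_fit = pop[keep], fit[keep]
        else:
            model_pop, model_fit = pop, fit
        weights = fit_fitness_model(model_pop, model_fit)
        pop = sample_population(weights, pop_size, rng=rng)
    return best

# Compare model fitting with and without truncation selection.
print(deum_univariate(truncation=None), deum_univariate(truncation=0.5))
```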
