Selection Methods for Evolutionary Algorithms

3.1 Fitness Proportionate Selection (FPS)
3.2 Windowing
3.3 Sigma Scaling
3.4 Linear Scaling
3.5 Sampling Algorithms
3.6 Ranking
3.7 Linear Ranking
3.8 Exponential Ranking
3.9 Tournament Selection
3.10 Genitor or Steady State Models
3.11 Evolution Strategy and Evolutionary Programming Methods
3.12 Evolution Strategy Approaches
3.13 Top-n Selection
3.14 Evolutionary Programming Methods
3.15 The Effects of Noise
Conclusions
References

Abstract

Selection pressure can have a decisive effect on the outcome of an evolutionary search. Try too hard, and you will end up converging prematurely, perhaps on a local maximum, perhaps not even that. Conversely, too little selection pressure, apart from wasting time, may allow the effects of genetic drift to dominate, again leading to a suboptimal result. In nature, there are two aspects to breeding success: surviving long enough to reach reproductive maturity, and then persuading a mate to be your partner. In simulations, such subtleties are mostly the province of artificial life experiments where, for example, an animal that fails to find enough food may die. In such systems it is possible for the whole population to die out, which may be realistic but does rather terminate the search. In most Evolutionary Algorithms (EA), therefore, a more interventionist approach is taken, with reproductive opportunities being allocated on the basis of relative fitness. There are a variety of selection strategies in common use, not all of which use the fitness values directly. Some order the population and allocate trials by rank; others conduct tournaments, giving something of the flavour of the natural competition for mates. Each of the schools of EA has its own methods of selection, though GA practitioners in particular have experimented with several.
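The distinction the abstract draws between allocating trials from raw fitness, from rank, or by tournament can be made concrete with a small sketch. The code below is an illustration only, not taken from the chapter: the population representation as (individual, fitness) pairs, the function names, and the parameter values (s = 1.5, k = 2) are assumptions made for the example, and the linear ranking weights follow one common parameterisation rather than any particular scheme discussed in the text.

import random

# Toy population of (individual, fitness) pairs. The representation,
# names and fitness values here are assumptions made purely for illustration.
population = [("a", 1.0), ("b", 4.0), ("c", 2.5), ("d", 0.5)]

def fitness_proportionate(pop):
    # Roulette-wheel selection: chance of selection is proportional to
    # raw fitness (assumes non-negative fitness values).
    total = sum(f for _, f in pop)
    r = random.uniform(0, total)
    running = 0.0
    for ind, f in pop:
        running += f
        if running >= r:
            return ind
    return pop[-1][0]  # guard against floating-point round-off

def linear_ranking(pop, s=1.5):
    # Rank-based selection: sort by fitness and weight by rank alone,
    # so the absolute scale of the fitness values no longer matters.
    # s in (1, 2] sets the selection pressure (one common parameterisation).
    ranked = sorted(pop, key=lambda item: item[1])  # worst ... best
    n = len(ranked)
    weights = [(2 - s) / n + 2 * i * (s - 1) / (n * (n - 1)) for i in range(n)]
    return random.choices(ranked, weights=weights, k=1)[0][0]

def tournament(pop, k=2):
    # k-way tournament: draw k individuals at random and keep the fittest;
    # larger k means stronger selection pressure.
    contestants = random.sample(pop, k)
    return max(contestants, key=lambda item: item[1])[0]

# Draw a mating pool of ten parents with each scheme.
print([fitness_proportionate(population) for _ in range(10)])
print([linear_ranking(population) for _ in range(10)])
print([tournament(population) for _ in range(10)])

Note the design difference: in fitness proportionate selection the pressure depends on the spread of the raw fitness values, which is why scaling and windowing arise as topics, whereas ranking and tournaments depend only on the ordering of the population, so pressure is set directly by the parameter s or the tournament size k.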
