A Study of Canonical GAs for NSOPs

Solving a Non-Stationary Optimization Problem (NSOP) requires algorithms with properties that allow the search to adapt dynamically to a changing fitness landscape. Our aim in this work is to deepen our understanding of canonical algorithms (steady-state, generational, and structured, i.e., cellular, genetic algorithms) in such a scenario. We study the behavior of these algorithms on a basic Dynamic Knapsack Problem and analyze the results with quantitative metrics. We also examine the role of the mutation operator in the three algorithms and the impact of the frequency of dynamic changes on the resulting difficulty of the problem. Our conclusions indicate that the steady-state GA is the fastest at adapting its search to a new problem definition, while the cellular GA is the best at preserving diversity and ultimately reaching accurate solutions. The generational GA is a tradeoff algorithm, with performance in between the other two.
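The paper provides no source code; the sketch below is only an illustration of the kind of setup the abstract describes: a knapsack whose capacity is redefined every fixed number of generations, optimized by a steady-state loop with bit-flip mutation. All names, parameter values, and the penalty scheme are assumptions for illustration, not the authors' implementation.

```python
import random

# Illustrative sketch (not the paper's code): a Dynamic Knapsack fitness whose
# capacity is redefined every CHANGE_PERIOD generations, standing in for the
# "frequency of dynamic changes" studied in the paper. Values are hypothetical.
N_ITEMS = 30
CHANGE_PERIOD = 50                       # generations between landscape changes
weights = [random.randint(1, 20) for _ in range(N_ITEMS)]
values = [random.randint(1, 30) for _ in range(N_ITEMS)]
capacities = [150, 100, 200]             # the capacity cycles through these definitions

def fitness(bits, generation):
    """Score a bit-string against the capacity active at this generation."""
    capacity = capacities[(generation // CHANGE_PERIOD) % len(capacities)]
    w = sum(wi for wi, b in zip(weights, bits) if b)
    v = sum(vi for vi, b in zip(values, bits) if b)
    return v if w <= capacity else 0     # simple death penalty for infeasible solutions

def mutate(bits, p_m):
    """Bit-flip mutation; p_m controls how much diversity is injected after a change."""
    return [b ^ (random.random() < p_m) for b in bits]

# Steady-state style loop: one offspring per step replaces the worst individual,
# which is what lets this scheme react quickly after the fitness is redefined.
pop = [[random.randint(0, 1) for _ in range(N_ITEMS)] for _ in range(50)]
for gen in range(300):
    a, b = random.sample(pop, 2)                      # two parents chosen at random
    cut = random.randrange(1, N_ITEMS)                # one-point crossover
    child = mutate(a[:cut] + b[cut:], p_m=1.0 / N_ITEMS)
    worst = min(range(len(pop)), key=lambda i: fitness(pop[i], gen))
    if fitness(child, gen) >= fitness(pop[worst], gen):
        pop[worst] = child

print(max(fitness(ind, 299) for ind in pop))
```

A generational or cellular variant would differ only in the replacement step: the former rebuilds the whole population each generation, while the latter restricts mating to small neighborhoods on a grid, which is what slows diversity loss.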
