Derivative Evaluation and Conditional Random Selection for Accelerating Genetic Algorithms

This paper proposes a new method for accelerating the search speed of genetic algorithms by incorporating derivative evaluation and conditional random selection into their evolution process. Derivative evaluation makes genetic algorithms focus on the individuals whose fitness increases most rapidly. This accelerates the search by enhancing exploitation, much as steepest-descent methods do, but it also increases the possibility of premature convergence, in which most individuals approach local optima within a few generations. Conditional random selection, on the other hand, helps genetic algorithms escape such local optima by enhancing exploration: if the GAs fall into premature convergence, random selection is applied to help them escape the local optima, although its effect by itself is not large. We tested our method on one combinatorial problem and five complex function optimization problems. Experimental results showed that our method was superior to the simple genetic algorithm, especially when the search space is large.
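
The abstract only outlines the two mechanisms at a high level. The sketch below is a minimal illustration, not the authors' implementation: it shows one plausible way a GA loop could bias selection toward individuals whose fitness has recently improved (derivative evaluation) and switch to uniform random selection only when the population appears to have prematurely converged (conditional random selection). All names, the convergence test, and the weighting scheme are assumptions made for illustration.

```python
import random

# Illustrative sketch under assumed details (not the paper's exact algorithm):
# - "derivative evaluation": selection weight combines an individual's fitness
#   with its positive change since the previous generation, favoring rapidly
#   improving individuals (measured position-wise as a rough proxy).
# - "conditional random selection": when the fitness spread falls below a
#   threshold (taken here as a sign of premature convergence), parents are
#   chosen uniformly at random to restore exploration.

POP_SIZE, GENOME_LEN, GENERATIONS = 30, 20, 100
MUTATION_RATE, CONVERGENCE_EPS = 0.02, 1e-3

def fitness(ind):                      # toy objective: maximize the number of 1s
    return sum(ind)

def crossover(a, b):                   # single-point crossover
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

def mutate(ind):                       # bit-flip mutation
    return [1 - g if random.random() < MUTATION_RATE else g for g in ind]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
prev_fit = [fitness(ind) for ind in population]

for gen in range(GENERATIONS):
    fit = [fitness(ind) for ind in population]
    # Derivative evaluation: weight = fitness + positive change since last generation.
    weights = [f + max(0, f - pf) for f, pf in zip(fit, prev_fit)]

    # Premature-convergence test (assumed): very small spread in fitness values.
    converged = (max(fit) - min(fit)) < CONVERGENCE_EPS

    def select():
        if converged:                  # conditional random selection
            return random.choice(population)
        return random.choices(population, weights=weights, k=1)[0]

    population = [mutate(crossover(select(), select())) for _ in range(POP_SIZE)]
    prev_fit = fit
```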
