A new dynamical evolutionary algorithm based on statistical mechanics

In this paper, a new dynamical evolutionary algorithm (DEA) based on the theory of statistical mechanics is presented. The novelty of this algorithm is that all individuals in the population (called particles in the dynamical system) keep moving and searching, with the evolution of the population driven by a new selection mechanism. This mechanism simulates the principle of molecular dynamics and is easy to design and implement. A basic theoretical analysis of the dynamical evolutionary algorithm is given, and as a consequence two stopping criteria are derived from the principle of energy minimization and the law of increasing entropy. To verify the effectiveness of the scheme, DEA is applied to several typical numerical function minimization problems that are poorly solved by traditional evolutionary algorithms. The experimental results show that DEA is fast and reliable.
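The abstract does not reproduce the algorithm itself, so the following is only a minimal sketch of the general idea it describes, not the authors' method: a population of particles whose evolution is driven by a selection rule that repeatedly perturbs the highest-energy (worst-fitness) particle, with termination based on energy minimization and a simple entropy-like spread measure. The test function, move rule, population size, and all thresholds here are assumptions chosen for illustration.

```python
import random

def sphere(x):
    # Illustrative test function: global minimum 0 at the origin.
    return sum(xi * xi for xi in x)

def dea_sketch(f, dim=5, n_particles=20, max_steps=20000, seed=0):
    rng = random.Random(seed)
    # Particles: candidate solutions scattered over the search box.
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    for _ in range(max_steps):
        energies = [f(p) for p in pop]
        # Selection rule (assumed): drive the highest-energy particle,
        # loosely echoing a molecular-dynamics picture in which the most
        # energetic particle is the one that moves.
        worst = max(range(n_particles), key=lambda i: energies[i])
        best = min(range(n_particles), key=lambda i: energies[i])
        # Propose a move of the worst particle toward the best one,
        # plus a small random perturbation; accept only if it lowers
        # the energy (greedy acceptance, an assumption of this sketch).
        trial = [pop[worst][j]
                 + rng.gauss(0.0, 0.5) * (pop[best][j] - pop[worst][j])
                 + rng.gauss(0.0, 0.01)
                 for j in range(dim)]
        if f(trial) < energies[worst]:
            pop[worst] = trial
        # Stopping criteria in the spirit of the two principles named in
        # the abstract: stop when the best energy is effectively minimal,
        # or when the population has collapsed (near-zero energy spread,
        # a crude stand-in for an entropy-based criterion).
        if energies[best] < 1e-8 or max(energies) - min(energies) < 1e-10:
            break
    return min(pop, key=f)

best = dea_sketch(sphere)
print(sphere(best))
```

Even this crude variant drives the population toward the minimizer of the sphere function; the actual DEA's selection mechanism and entropy-based stopping rule are developed in the body of the paper.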
