Including Phenotype Information in Mutation to Evolve Artificial Neural Networks

This paper addresses artificial neural network (ANN) evolution by presenting a mutation approach based on a novel self-adaptation strategy. The strategy incorporates phenotype information into a single value, called the network weight (NW), which depends on the total number of hidden layers and the average number of neurons per hidden layer. Including this phenotype information determines the increment of the mutation step size and raises the average percentage of successful mutations, because the mutation adapts to the characteristics and complexity of the ANN architecture. The NW operator is combined with genotype information, captured in a dynamic component represented by the fitness of the particular chromosome. Together, these two components drive the evolution of chromosomes according to both the "internal" architecture of the ANN and the fitness of each chromosome.
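The abstract does not give the exact formulas, so the following is only a minimal sketch of the idea: a phenotype value derived from layer count and average layer width, combined with a fitness-driven dynamic component, scales a Gaussian mutation step. The functions `network_weight`, `mutation_step`, and `mutate`, the logarithmic scaling, and the `1/(1 + fitness)` term are all illustrative assumptions, not the paper's published definitions.

```python
import math
import random

def network_weight(hidden_layer_sizes):
    # Hypothetical phenotype value NW: a function of the number of hidden
    # layers and the average number of neurons per hidden layer (assumed form).
    num_layers = len(hidden_layer_sizes)
    avg_neurons = sum(hidden_layer_sizes) / num_layers
    return num_layers * avg_neurons

def mutation_step(base_sigma, hidden_layer_sizes, fitness):
    # Combine the phenotype component (NW) with a genotype-driven dynamic
    # component derived from the chromosome's fitness. Both the log scaling
    # and 1/(1 + fitness) are assumptions for illustration only.
    nw = network_weight(hidden_layer_sizes)
    dynamic = 1.0 / (1.0 + fitness)  # assumed: fitter chromosomes mutate less
    return base_sigma * math.log(1.0 + nw) * dynamic

def mutate(weights, base_sigma, hidden_layer_sizes, fitness):
    # Gaussian perturbation of connection weights with the adaptive step size.
    sigma = mutation_step(base_sigma, hidden_layer_sizes, fitness)
    return [w + random.gauss(0.0, sigma) for w in weights]
```

Under these assumptions, a wider or deeper network yields a larger NW and hence a larger step, while a fitter (lower-error) chromosome is perturbed more gently.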
