Mutation Operators for Structure Evolution of Neural Networks

The architecture of a neural network can be optimized by structure evolution. Structure evolution is based on a two-stage evolution strategy (a multipopulation strategy): on the population level the structure is optimized, while on the individual level the parameters are adapted. Varying the (discrete) architecture requires a mutation operator. For the optimization to succeed, this operator must satisfy two main conditions in the space of structures: first, it must obey the principle of strong causality (smoothness of the fitness landscape over the space of structures), and second, it must guarantee a transition path between any two structures. In this paper, different heuristic mutation operators are defined and examined with respect to strong causality and to the neighborhood relation in the space of structures.
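The two conditions above can be made concrete with a minimal sketch, not taken from the paper: a structural mutation that changes an architecture (encoded here simply as a list of hidden-layer sizes, an illustrative assumption) by exactly one neuron per step. Because every step is a smallest-possible structural change, neighboring structures remain similar (strong causality), and because neurons and layers can be both added and removed, any architecture can be reached from any other (transition path).

```python
import random

def mutate_architecture(hidden_sizes, rng=random):
    """Illustrative small-step structural mutation (not the paper's
    operators): change the network by exactly one neuron per call.
    Either grow a layer by one unit, shrink a layer by one unit
    (deleting it if it becomes empty), or insert a new one-unit layer.
    """
    sizes = list(hidden_sizes)
    choice = rng.random()
    if sizes and choice < 0.4:
        i = rng.randrange(len(sizes))
        sizes[i] += 1                      # grow one layer by one unit
    elif sizes and choice < 0.8:
        i = rng.randrange(len(sizes))
        sizes[i] -= 1                      # shrink one layer by one unit
        if sizes[i] == 0:
            del sizes[i]                   # an empty layer disappears
    else:
        # insert a new single-neuron layer at a random position
        sizes.insert(rng.randrange(len(sizes) + 1), 1)
    return sizes
```

Since each mutation changes the total neuron count by exactly one, repeated application can transform any architecture into any other, while the fitness of parent and offspring structures stays comparable at every step.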
