Enhancing ES-HyperNEAT to evolve more complex regular neural networks

The recently introduced evolvable-substrate HyperNEAT algorithm (ES-HyperNEAT) demonstrated that the placement and density of hidden nodes in an artificial neural network can be determined from implicit information in an infinite-resolution pattern of weights, thereby avoiding the need to evolve explicit placement. However, ES-HyperNEAT is computationally expensive because it must search the entire hypercube, and it was shown only to match the performance of the original HyperNEAT on a simple benchmark problem. Iterated ES-HyperNEAT, introduced in this paper, reduces these computational costs by focusing the search on a sequence of two-dimensional cross-sections of the hypercube, thereby making it possible to search the hypercube at a finer resolution. A series of experiments and an analysis of the evolved networks show for the first time that iterated ES-HyperNEAT not only matches but outperforms the original HyperNEAT in more complex domains, because ES-HyperNEAT can evolve networks with limited connectivity, elaborate on existing network structure, and compensate for movement of information within the hypercube.
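To make the cross-section idea concrete, the Python sketch below illustrates the kind of quadtree-based division-and-initialization pass that underlies ES-HyperNEAT's node placement: the CPPN is queried across a two-dimensional cross-section of the hypercube (the source node held fixed), regions are subdivided only where the weight pattern still varies, and candidate hidden nodes are expressed in the high-variance regions. The cppn callable, the QuadPoint helper class, and both thresholds are illustrative assumptions for this sketch, not the paper's implementation.

    import statistics
    from collections import deque

    class QuadPoint:
        """A square region of the 2-D cross-section, centred at (x, y).
        (Illustrative helper; not part of the original algorithm's code.)"""
        def __init__(self, x, y, width, depth):
            self.x, self.y = x, y
            self.width = width            # half-width of the square
            self.depth = depth
            self.weight = 0.0             # CPPN output at the centre
            self.children = []

    def divide_and_init(cppn, source, max_depth=4, variance_threshold=0.03):
        """Division phase: subdivide the cross-section outgoing from the
        fixed source node, recursing only where the four child weights
        still vary, i.e. where the weight pattern carries information."""
        root = QuadPoint(0.0, 0.0, 1.0, 0)
        queue = deque([root])
        while queue:
            node = queue.popleft()
            half = node.width / 2.0
            for dx in (-half, half):
                for dy in (-half, half):
                    child = QuadPoint(node.x + dx, node.y + dy, half,
                                      node.depth + 1)
                    # One CPPN query per candidate: the weight of a
                    # connection from the source node to this position.
                    child.weight = cppn(source[0], source[1],
                                        child.x, child.y)
                    node.children.append(child)
            if node.depth + 1 < max_depth and \
                    statistics.pvariance(
                        c.weight for c in node.children) > variance_threshold:
                queue.extend(node.children)
        return root

    def collect_candidates(node, variance_threshold=0.03):
        """Initialization phase: express hidden nodes at quadtree leaves
        whose parent region still showed high weight variance."""
        points = []
        if node.children:
            var = statistics.pvariance(c.weight for c in node.children)
            for child in node.children:
                if child.children:
                    points.extend(
                        collect_candidates(child, variance_threshold))
                elif var > variance_threshold:
                    points.append((child.x, child.y))
        return points

    # Usage with any CPPN-like callable, here a toy weight pattern:
    tree = divide_and_init(lambda x1, y1, x2, y2: x2 * y2,
                           source=(0.0, -1.0))
    hidden = collect_candidates(tree)

Because the source node is held fixed, each pass touches only a two-dimensional slice of the four-dimensional hypercube; iterating such passes over the nodes discovered so far is what lets iterated ES-HyperNEAT afford a finer effective resolution than exhaustively sampling the full hypercube.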
