Evolving Developmental Programs That Build Neural Networks for Solving Multiple Problems

A developmental model of an artificial neuron is presented. In this model, a pair of neural developmental programs develops an entire artificial neural network of arbitrary size. The pair of neural chromosomes is evolved using Cartesian Genetic Programming. During development, neurons and their connections can move, change, die, or be created. We show that this two-chromosome genotype can be evolved to develop into a single neural network from which multiple conventional artificial neural networks (ANNs) can be extracted, with the extracted ANNs sharing some neurons across tasks. We evaluated the performance of this method on three standard classification problems: the cancer, diabetes, and glass datasets. The evolved pair of neuron programs can generate artificial neural networks that perform reasonably well on all three benchmark problems simultaneously. This appears to be the first attempt to solve multiple standard classification problems using a developmental approach.
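The developmental process described above can be pictured as two evolved programs that repeatedly rewrite the state of every neuron (soma) and every connection (dendrite); after development, one conventional ANN per task is read out of the shared developed network. The sketch below is a minimal Python illustration of that idea, not the paper's implementation: the state variables (position, health, weight, bias), the pruning rule, and the replication and death thresholds are all illustrative assumptions, and the actual CGP-encoded soma and dendrite programs are abstracted as plain callables.

```python
# Minimal sketch of a two-program developmental loop (illustrative assumptions only).
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class Dendrite:
    weight: float
    health: float
    position: float

@dataclass
class Neuron:
    position: float
    health: float
    bias: float
    dendrites: List[Dendrite] = field(default_factory=list)

def develop(neurons: List[Neuron],
            soma_program: Callable[[Neuron], Tuple[float, float, float]],
            dendrite_program: Callable[[Neuron, Dendrite], Tuple[float, float, float]],
            steps: int) -> List[Neuron]:
    """Apply the evolved pair of programs for a fixed number of developmental steps."""
    for _ in range(steps):
        next_generation: List[Neuron] = []
        for n in neurons:
            # The soma program rewrites the neuron's own state.
            n.position, n.health, n.bias = soma_program(n)
            # The dendrite program rewrites the state of each connection.
            for d in n.dendrites:
                d.weight, d.health, d.position = dendrite_program(n, d)
            # Unhealthy dendrites are pruned (illustrative rule).
            n.dendrites = [d for d in n.dendrites if d.health > 0.0]
            # Very healthy neurons replicate; unhealthy neurons die
            # (the thresholds below are assumptions, not the paper's values).
            if n.health > 0.8:
                next_generation.append(
                    Neuron(n.position + 0.1, 0.5, n.bias,
                           [Dendrite(d.weight, d.health, d.position)
                            for d in n.dendrites]))
            if n.health > 0.0:
                next_generation.append(n)
        neurons = next_generation
    return neurons
```

In the same spirit, one conventional ANN per classification task could be extracted after development by selecting, for each task's output neurons, the neurons reachable through their dendrites; the paper's actual extraction procedure is not reproduced here.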
