Symbiotic Coevolution of Artificial Neural Networks and Training Data Sets

Among the most important design issues to be addressed in optimizing the generalization abilities of trained artificial neural networks (ANNs) are the specific architecture and the composition of the training data set (TDS). Recent work has focused on investigating each of these prerequisites separately. However, some researchers have pointed out the interacting dependencies between ANN topology and the information contained in the TDS. In order to generate coadapted ANNs and TDSs without human intervention, we investigate the use of symbiotic (cooperative) coevolution. Independent populations of ANNs and TDSs are evolved by a genetic algorithm (GA), where the fitness of an ANN is equally credited to the TDS it has been trained with. The parallel netGEN system, which generates generalized multi-layer perceptrons trained by error back-propagation, has been extended to coevolve TDSs. Empirical results on a simple pattern recognition problem are presented.
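To make the credit-assignment scheme concrete, the following minimal Python sketch pairs an ANN population with a TDS population and credits each trained ANN's fitness equally to the TDS it was trained with. The genome encodings, GA operators, and population parameters are illustrative assumptions, and train_and_evaluate is a placeholder for netGEN's back-propagation training and generalization test, not the actual system.

```python
import random

# Sketch of symbiotic (cooperative) coevolution of ANN architectures and
# training data sets (TDSs). Assumed encodings: an ANN genome is a list of
# hidden-layer sizes; a TDS genome is a subset of indices into a fixed
# pattern pool. Backprop training and the generalization test are replaced
# by a deterministic placeholder score.

POOL_SIZE = 100    # size of the full pattern pool (assumed)
TDS_SIZE = 20      # patterns per evolved training set (assumed)
POP_SIZE = 10
GENERATIONS = 30

def random_ann():
    """Hypothetical ANN genome: one or two hidden layers of 2..8 units."""
    return [random.randint(2, 8) for _ in range(random.randint(1, 2))]

def random_tds():
    """Hypothetical TDS genome: a subset of pattern indices."""
    return random.sample(range(POOL_SIZE), TDS_SIZE)

def train_and_evaluate(ann, tds):
    """Placeholder for training the ANN on the TDS by back-propagation and
    measuring generalization on an independent test set (score in [0, 1])."""
    rng = random.Random(hash((tuple(ann), tuple(sorted(tds)))))
    return rng.random()

def mutate_ann(ann):
    ann = list(ann)
    k = random.randrange(len(ann))
    ann[k] = max(2, ann[k] + random.choice((-1, 1)))
    return ann

def mutate_tds(tds):
    tds = list(tds)
    k = random.randrange(len(tds))
    tds[k] = random.choice([i for i in range(POOL_SIZE) if i not in tds])
    return tds

def evolve():
    anns = [random_ann() for _ in range(POP_SIZE)]
    tdss = [random_tds() for _ in range(POP_SIZE)]
    best_pair, best_fit = None, -1.0
    for gen in range(GENERATIONS):
        # The fitness obtained by the trained ANN is credited equally to the
        # TDS it was paired with (symbiotic credit assignment).
        fitness = [train_and_evaluate(a, t) for a, t in zip(anns, tdss)]
        top = max(range(POP_SIZE), key=fitness.__getitem__)
        if fitness[top] > best_fit:
            best_fit, best_pair = fitness[top], (anns[top], tdss[top])

        # Tournament selection within each population; mutation only
        # (crossover omitted for brevity).
        def select(pop):
            i, j = random.randrange(POP_SIZE), random.randrange(POP_SIZE)
            return pop[i] if fitness[i] >= fitness[j] else pop[j]

        anns = [mutate_ann(select(anns)) for _ in range(POP_SIZE)]
        tdss = [mutate_tds(select(tdss)) for _ in range(POP_SIZE)]
    return best_pair, best_fit

if __name__ == "__main__":
    (best_ann, best_tds), score = evolve()
    print("best ANN genome:", best_ann)
    print("best TDS indices:", sorted(best_tds), "fitness:", round(score, 3))
```

In the actual system the expensive step is the inner training loop; the coevolutionary layer shown here only decides which architectures and which training patterns get that effort.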
