Genetic Search for Optimal Representations in Neural Networks

An approach to learning in feed-forward neural networks is put forward that combines gradual synaptic modification at the output layer with genetic adaptation in the lower layer(s). In this “GA-delta” technique, each allele is a linear threshold unit (a set of weights and a threshold); a chromosome is a collection of such units, and hence defines a mapping from the input layer to a hidden layer. Fitness is evaluated by measuring the error after a small number of delta rule iterations on the hidden-to-output weights. Genetic operators are defined on these chromosomes to facilitate the search for a mapping that renders the task solvable by a single layer of weights. The performance of GA-delta is presented on several tasks, and the effects of the various operators are analyzed.
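
As a rough illustration of the fitness evaluation described above, the sketch below encodes a chromosome as a list of (weights, threshold) pairs, maps inputs through those linear threshold units, runs a few delta rule updates on the hidden-to-output weights, and scores the chromosome by the resulting error. This is a minimal interpretation, not the paper's implementation; names such as `ltu_forward`, `fitness`, `delta_rule_epochs`, and `eta`, and the choice of a single sigmoid output unit with binary targets, are illustrative assumptions.

```python
import numpy as np

def ltu_forward(chromosome, X):
    """Map inputs to hidden activations via a set of linear threshold units.

    chromosome: list of (weights, threshold) pairs, one allele per hidden unit.
    X: (n_samples, n_inputs) input matrix.
    Returns a (n_samples, n_hidden) binary hidden representation.
    """
    return np.column_stack(
        [(X @ w > theta).astype(float) for w, theta in chromosome]
    )

def fitness(chromosome, X, targets, delta_rule_epochs=5, eta=0.1):
    """Score a chromosome: briefly train the hidden-to-output weights with
    the delta rule, then return the negative summed squared error
    (higher is fitter). Assumes binary targets and one sigmoid output unit."""
    H = ltu_forward(chromosome, X)
    v = np.zeros(H.shape[1])   # hidden-to-output weights
    b = 0.0                    # output bias
    for _ in range(delta_rule_epochs):
        for h, t in zip(H, targets):
            y = 1.0 / (1.0 + np.exp(-(h @ v + b)))  # output activation
            v += eta * (t - y) * h                  # delta rule update
            b += eta * (t - y)
    y_all = 1.0 / (1.0 + np.exp(-(H @ v + b)))
    return -np.sum((targets - y_all) ** 2)
```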