In this paper, two novel methods, BP/ES and ES/LMS, for simultaneous gradient-based and evolutionary adaptation of neural network weights are proposed. In BP/ES, an evolution strategy optimizes the last layer of a multilayer-perceptron network, while the back-propagation algorithm trains the rest of the network. The main idea of ES/LMS is to employ the least-mean-square algorithm to adapt the last layer of the network and an evolution strategy to optimize the rest of the network. Hybrid approaches to neural network learning, based on combinations of gradient and evolutionary techniques, aim to unite the advantages of both: the modest computational requirements of gradient techniques and the global search capabilities of evolutionary approaches. In general, hybrid neural network learning approaches are "sequential" rather than simultaneous: in the first step, the evolutionary technique locates a promising region of the search space, and then the gradient technique fine-tunes the network parameters within that region. The proposed BP/ES and ES/LMS methods investigate a different approach. They perform a "spatial" synthesis of gradient and evolutionary techniques, in which the neural network is partitioned into two parts - the output layer versus the remaining layers - that are adapted simultaneously, each by one of the two methods. Experimental results with the error back-propagation algorithm, evolution strategies with and without covariances, and the BP/ES and ES/LMS methods on the benchmark "XOR" and "circle in the square" data are provided.
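The "spatial" partition described above can be illustrated with a minimal sketch. This is not the paper's exact algorithm, only an assumed simple instance of the ES/LMS idea: a (1+1)-evolution strategy mutates the hidden-layer weights, while the least-mean-square rule adapts a linear output layer, applied here to the XOR benchmark. The network shape (2 inputs, 2 tanh hidden units, 1 linear output), step sizes, and iteration count are illustrative assumptions.

```python
import numpy as np

# ES/LMS sketch on XOR (illustrative assumptions, not the paper's setup):
# hidden layer adapted by a (1+1)-ES, output layer adapted by LMS.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def hidden(W, b):
    return np.tanh(X @ W + b)                 # hidden-layer activations

def mse(W, b, v, c):
    out = hidden(W, b) @ v + c                # linear output layer
    return np.mean((out - y) ** 2)

W = rng.normal(size=(2, 2)); b = rng.normal(size=2)   # ES-adapted part
v = np.zeros(2); c = 0.0                              # LMS-adapted part
sigma, eta = 0.5, 0.1                                 # assumed step sizes

for _ in range(2000):
    # LMS step: output layer sees hidden activations as its input signal
    H = hidden(W, b)
    err = H @ v + c - y
    v -= eta * H.T @ err / len(X)
    c -= eta * err.mean()
    # (1+1)-ES step: keep a Gaussian mutation of the hidden layer
    # only if it lowers the training error
    Wn = W + sigma * rng.normal(size=W.shape)
    bn = b + sigma * rng.normal(size=b.shape)
    if mse(Wn, bn, v, c) < mse(W, b, v, c):
        W, b = Wn, bn

print("final MSE:", mse(W, b, v, c))
```

Both parts are updated in every iteration, which is the simultaneous character that distinguishes this scheme from sequential hybrids where evolution finishes before gradient fine-tuning begins.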