Simultaneous gradient and evolutionary neural network weights adaptation methods

In this paper, two novel methods, BP/ES and ES/LMS, for simultaneous gradient and evolutionary adaptation of neural network weights are proposed. In BP/ES, an evolution strategy is used to optimize the last layer of a multilayer perceptron, while the back-propagation algorithm trains the rest of the network. The main idea of ES/LMS is to employ the least mean square algorithm to adapt the last layer of the network and an evolution strategy to optimize the rest of the network. Hybrid approaches to neural network learning, based on combinations of gradient and evolutionary techniques, aim to combine the advantages of both: the modest computational requirements of gradient techniques and the global search capabilities of evolutionary approaches. In general, hybrid neural network learning approaches are usually "sequential" rather than simultaneous: in the first step, the evolutionary technique is used to locate a promising region of the search space, and then the gradient technique is employed for fine tuning of the network parameters within this region. The proposed BP/ES and ES/LMS methods take a different approach. They perform a "spatial" synthesis of gradient and evolutionary techniques, in which the neural network is partitioned into two parts - the output layer versus the remaining layers - that are adapted simultaneously, each by a different method. Experimental results with the error back-propagation algorithm, evolution strategies with and without covariances, and the BP/ES and ES/LMS methods on the benchmark "XOR" and "circle in square" data are provided.
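The "spatial" partitioning described above can be illustrated with a minimal sketch of the ES/LMS idea on the XOR benchmark: a simple (1+1)-evolution strategy mutates the hidden-layer weights, and for each candidate the linear output layer is adapted by the LMS rule. The network sizes, learning rate, and mutation step are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])  # XOR targets

def hidden(W, b):
    # Hidden-layer activations (the part optimized by the ES)
    return np.tanh(X @ W + b)

def lms_fit(H, targets, lr=0.1, epochs=200):
    # Adapt the linear output layer with the LMS (delta) rule
    w, c = np.zeros(H.shape[1]), 0.0
    for _ in range(epochs):
        for h, t in zip(H, targets):
            e = t - (h @ w + c)
            w += lr * e * h
            c += lr * e
    return w, c

def fitness(W, b):
    # Fitness of a hidden-weight candidate = MSE after LMS output training
    H = hidden(W, b)
    w, c = lms_fit(H, y)
    return np.mean((H @ w + c - y) ** 2)

# (1+1)-ES over the hidden-layer parameters with a fixed mutation step
W, b = rng.normal(0.0, 1.0, (2, 4)), np.zeros(4)
best = fitness(W, b)
sigma = 0.5
for _ in range(300):
    Wc = W + sigma * rng.normal(size=W.shape)
    bc = b + sigma * rng.normal(size=b.shape)
    f = fitness(Wc, bc)
    if f <= best:  # keep the better of parent and offspring
        W, b, best = Wc, bc, f

print(f"final MSE on XOR: {best:.4f}")
```

In this sketch the two adaptation processes run together: every ES generation re-adapts the output layer by LMS, so the gradient-based and evolutionary updates proceed simultaneously on their respective parts of the network.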