Variation of activation functions for accelerating the learning speed of the multilayer neural network

In this paper, an enhanced learning method is proposed to improve the learning speed of the error back-propagation algorithm. To cope with the premature saturation phenomenon at the initial learning stage, a variation scheme for the activation functions is introduced using higher-order functions, which requires little additional computational load. The scheme naturally increases the effective learning rate of the interconnection weights when the derivative of the sigmoid function shrinks to an abnormally small value during a learning epoch. We also propose a hybrid learning method that incorporates the proposed scheme into the momentum training algorithm. Computer simulation results show that the proposed learning algorithm outperforms conventional methods such as the momentum and delta-bar-delta algorithms.
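To make the underlying mechanism concrete, the sketch below illustrates one possible realization of the general idea, not the paper's exact higher-order scheme: the standard sigmoid derivative y(1-y) in back-propagation is replaced by its fractional power (y(1-y))^p with 0 < p < 1, which amplifies abnormally small derivative values and thus enlarges the effective weight updates of saturated units. The network architecture, the XOR task, and all parameter values are assumptions made for illustration.

```python
# Minimal sketch: back-propagation with a modified sigmoid derivative that
# counteracts premature saturation. Illustrative only; not the paper's
# exact activation-variation scheme.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(y, p=1.0):
    # y = sigmoid(x); the standard derivative is y * (1 - y).
    # Choosing p < 1 pushes near-zero derivative values upward, which acts
    # like a larger effective learning rate for saturated units.
    return (y * (1.0 - y)) ** p

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # XOR inputs
T = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)
eta, p = 0.5, 0.5  # learning rate; fractional power for the derivative

for epoch in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # backward pass using the modified derivative
    delta2 = (y - T) * sigmoid_deriv(y, p)
    delta1 = (delta2 @ W2.T) * sigmoid_deriv(h, p)
    # batch gradient-descent weight updates
    W2 -= eta * h.T @ delta2; b2 -= eta * delta2.sum(axis=0)
    W1 -= eta * X.T @ delta1; b1 -= eta * delta1.sum(axis=0)

print("final MSE:", float(np.mean((y - T) ** 2)))
```

Setting p = 1.0 recovers plain back-propagation; comparing the two settings on the same initial weights shows how boosting small derivatives can shorten the flat early phase of training that the premature-saturation phenomenon produces.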