An activation function adapting training algorithm for sigmoidal feedforward networks

The universal approximation results for sigmoidal feedforward artificial neural networks do not single out a preferred activation function. In this paper, a new activation function adapting algorithm is proposed for training sigmoidal feedforward neural networks. The algorithm is compared against the backpropagation algorithm on four function approximation tasks. The results demonstrate that the proposed algorithm can be an order of magnitude faster than backpropagation.
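To illustrate the general idea of activation function adaptation (not the paper's specific algorithm), the following minimal sketch trains a one-hidden-layer sigmoidal network in which each hidden unit's sigmoid slope is treated as a trainable parameter and updated by gradient descent alongside the weights. The network size, learning rates, and the toy sine approximation task are illustrative assumptions.

```python
# Sketch: jointly adapting sigmoid slope parameters and weights by gradient descent.
# This is an assumed formulation for illustration, not the algorithm from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy function approximation task: y = sin(x) on [-pi, pi]
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
Y = np.sin(X)

n_in, n_hid, n_out = 1, 10, 1
W1 = rng.normal(0, 0.5, (n_in, n_hid))
b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.5, (n_hid, n_out))
b2 = np.zeros(n_out)
lam = np.ones(n_hid)            # per-unit sigmoid slopes, adapted during training

def sigmoid(z, lam):
    return 1.0 / (1.0 + np.exp(-lam * z))

eta_w, eta_lam = 0.1, 0.01      # assumed learning rates
for epoch in range(5000):
    # Forward pass
    z1 = X @ W1 + b1            # net input to hidden units
    h = sigmoid(z1, lam)        # hidden activations with adaptive slopes
    y_hat = h @ W2 + b2         # linear output layer
    err = y_hat - Y

    # Backward pass for mean squared error
    d_out = err / len(X)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_h = d_out @ W2.T
    s_prime = h * (1 - h)       # derivative of the sigmoid w.r.t. its argument lam*z
    d_z1 = d_h * s_prime * lam  # chain rule through the slope
    dW1 = X.T @ d_z1
    db1 = d_z1.sum(axis=0)
    d_lam = (d_h * s_prime * z1).sum(axis=0)  # gradient w.r.t. the slope parameters

    # Weights and slope parameters are updated jointly
    W1 -= eta_w * dW1;  b1 -= eta_w * db1
    W2 -= eta_w * dW2;  b2 -= eta_w * db2
    lam -= eta_lam * d_lam

print("final MSE:", float(np.mean((y_hat - Y) ** 2)))
```

Standard backpropagation corresponds to freezing `lam` at a fixed value; adapting it gives each hidden unit an extra degree of freedom in the shape of its activation.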