A new method for faster neural network learning by introducing functions of synaptic weights

In this paper, a new method for faster neural network learning is proposed. The key idea is to equip neural networks with synaptic weight functions, rather than scalar synaptic weights, in order to increase the sensitivity of the criterion function with respect to the synaptic weights. By constructing the synaptic weight functions appropriately, the learning process can be accelerated significantly.
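The abstract gives no concrete construction, so the following is only a minimal sketch of the reparameterization idea under stated assumptions: each scalar weight w is replaced by a function w = g(v) of a trainable parameter v, and gradient descent is performed on v, so that the chain-rule factor g'(v) scales the gradient. The specific choice g(v) = v + αv³, the toy regression task, and all names (g, ALPHA) are illustrative assumptions, not the authors' construction.

```python
import numpy as np

# Hypothetical synaptic weight function: w = g(v) = v + ALPHA * v**3.
# Since g'(v) = 1 + 3*ALPHA*v**2 >= 1, the gradient with respect to v is
# never smaller than the gradient with respect to w, which is one simple
# way to raise the sensitivity of the criterion function to the trainable
# parameters. This is an illustrative choice, not the paper's.
ALPHA = 0.1

def g(v):
    return v + ALPHA * v**3

def g_prime(v):
    return 1.0 + 3.0 * ALPHA * v**2

# Toy linear regression task, trained by gradient descent on v (the
# argument of the weight function) instead of on w directly.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w

v = rng.normal(scale=0.1, size=3)    # trainable parameters
lr = 0.05
for step in range(500):
    w = g(v)                          # effective synaptic weights
    err = X @ w - y
    grad_w = X.T @ err / len(X)       # dL/dw for the MSE criterion
    grad_v = grad_w * g_prime(v)      # chain rule: dL/dv = dL/dw * g'(v)
    v -= lr * grad_v

print("learned effective weights:", g(v))   # should approach true_w
```

Any smooth, monotone weight function whose derivative exceeds one away from the origin would serve equally well in this sketch; the paper's contribution lies in how such functions are constructed, which the abstract does not specify.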
