Gain elimination from backpropagation neural networks

It is shown that the gain of the sigmoidal activation function, as used in backpropagation neural networks, can be eliminated, since there exists a well-defined relationship between the gain, the learning rate, and the set of initial weights. Similarly, the learning rate can be eliminated by adjusting the gain and the initial weights. This relationship is proven, extended to several variants of the backpropagation learning rule, and applied to hardware implementations of neural networks.
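
As a concrete illustration of this relationship, the sketch below checks numerically that a sigmoid network with gain gamma, learning rate eta, and initial weights W follows the same training trajectory as a gain-1 network with learning rate gamma**2 * eta and initial weights gamma * W, with the two weight sets staying related by the constant factor gamma at every step. This is a minimal NumPy example written for this summary, not code from the paper; the two-layer architecture, the variable names, and the train_step helper are all illustrative.

# Numeric check: a gain-gamma sigmoid network trained with learning rate eta
# is equivalent to a gain-1 network trained with learning rate gamma**2 * eta
# from initial weights scaled by gamma (plain gradient-descent backprop,
# sum-of-squares error). All names here are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x, gain):
    return 1.0 / (1.0 + np.exp(-gain * x))

def train_step(W1, W2, gain, eta, x, t):
    """One backprop step for a one-hidden-layer network."""
    # Forward pass.
    h = sigmoid(W1 @ x, gain)
    y = sigmoid(W2 @ h, gain)
    # Backward pass; for the gained sigmoid, f'(net) = gain * f * (1 - f).
    delta2 = (y - t) * gain * y * (1.0 - y)
    delta1 = (W2.T @ delta2) * gain * h * (1.0 - h)
    # Gradient-descent weight update.
    return W1 - eta * np.outer(delta1, x), W2 - eta * np.outer(delta2, h)

gamma, eta = 2.5, 0.1
W1 = rng.normal(size=(3, 4))
W2 = rng.normal(size=(2, 3))
x = rng.normal(size=4)
t = np.array([0.0, 1.0])

# Network A: gain gamma, learning rate eta, weights W.
A1, A2 = W1.copy(), W2.copy()
# Network B: gain 1, learning rate gamma**2 * eta, weights gamma * W.
B1, B2 = gamma * W1, gamma * W2

for _ in range(50):
    A1, A2 = train_step(A1, A2, gamma, eta, x, t)
    B1, B2 = train_step(B1, B2, 1.0, gamma**2 * eta, x, t)

# The trajectories stay related by the factor gamma throughout training.
print(np.max(np.abs(B1 - gamma * A1)))  # on the order of machine precision
print(np.max(np.abs(B2 - gamma * A2)))  # on the order of machine precision

Both printed differences are on the order of machine precision, confirming that the gain-gamma trajectory and the rescaled gain-1 trajectory coincide up to the factor gamma, which is why the gain can be fixed to 1 without loss of generality.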
