Self-adaptive learning rates in backpropagation algorithm improve its function approximation performance
[1] V. Tikhomirov. On the Representation of Continuous Functions of Several Variables as Superpositions of Continuous Functions of One Variable and Addition, 1991.
[2] Robert A. Jacobs et al. Increased rates of convergence through learning rate adaptation. Neural Networks, 1987.
[3] F. Girosi et al. Networks for approximation and learning. Proc. IEEE, 1990.
[4] Kurt Hornik et al. Multilayer feedforward networks are universal approximators. Neural Networks, 1989.
[5] Ramesh C. Jain et al. A robust backpropagation learning algorithm for function approximation. IEEE Trans. Neural Networks, 1994.
[6] R. P. Lippmann. An introduction to computing with neural nets. IEEE ASSP Magazine, 1987.
[7] Christopher J. Corbally et al. The calibration of MK spectral classes using spectral synthesis. 1: The effective temperature calibration of dwarf stars, 1994.
[8] Ujjwal Bhattacharya et al. On the rate of convergence of perceptron learning. Pattern Recognition Letters, 1995.
[9] Robert Hecht-Nielsen. Theory of the backpropagation neural network. International Joint Conference on Neural Networks, 1989.
[10] Geoffrey E. Hinton et al. Learning internal representations by error propagation, 1986.