A method of training multi-layer networks with Heaviside characteristics using internal representations
[1] Fernando J. Pineda, et al. Generalization of back-propagation to recurrent neural networks, 1987, Physical Review Letters.
[2] Esther Levin, et al. Accelerated Learning in Layered Neural Networks, 1988, Complex Systems.
[3] Stephen Jose Hanson, et al. Meiosis Networks, 1989, NIPS.
[4] Eduardo D. Sontag. Sigmoids Distinguish More Efficiently Than Heavisides, 1989, Neural Computation.
[5] Peter L. Bartlett, et al. Using random weights to train multilayer networks of hard-limiting units, 1992, IEEE Transactions on Neural Networks.
[6] Anders Krogh, et al. A Cost Function for Internal Representations, 1989, NIPS.
[7] Bernard Widrow, et al. Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights, 1990, IJCNN International Joint Conference on Neural Networks.
[8] David Saad. Training Recurrent Neural Networks - The Minimal Trajectory Algorithm, 1992, International Journal of Neural Systems.
[9] Ronald J. Williams, et al. A Learning Algorithm for Continually Running Fully Recurrent Neural Networks, 1989, Neural Computation.