Network information criterion: determining the number of hidden units for an artificial neural network model
Shun-ichi Amari | Noboru Murata | Shuji Yoshizawa
[1] Shun-ichi Amari, et al. A Theory of Adaptive Pattern Classifiers, 1967, IEEE Trans. Electron. Comput.
[2] H. Akaike. A new look at the statistical model identification, 1974.
[3] Geoffrey E. Hinton, et al. Learning internal representations by error propagation, 1986.
[4] J. Rissanen. Stochastic Complexity and Modeling, 1986.
[5] Grace Wahba, et al. Three Topics in Ill-Posed Problems, 1987.
[6] Halbert White, et al. Learning in Artificial Neural Networks: A Statistical Perspective, 1989, Neural Computation.
[7] D. B. Fogel, et al. An Information Criterion for Optimal Neural Network Selection, 1990, Conference Record of the Twenty-Fourth Asilomar Conference on Signals, Systems and Computers.
[8] M. Kawato, et al. Estimation of generalization capability by combination of new information criterion and cross validation, 1991, IJCNN-91-Seattle International Joint Conference on Neural Networks.
[9] John E. Moody, et al. The Effective Number of Parameters: An Analysis of Generalization and Regularization in Nonlinear Learning Systems, 1991, NIPS.
[10] Shun-ichi Amari, et al. Learning Curves, Model Selection and Complexity of Neural Networks, 1992, NIPS.
[11] Shun-ichi Amari, et al. Statistical Theory of Learning Curves under Entropic Loss Criterion, 1993, Neural Computation.