An information criterion for optimal neural network selection
[1] P. J. Huber. The behavior of maximum likelihood estimates under nonstandard conditions, 1967.
[2] H. Akaike. A Bayesian extension of the minimum AIC procedure of autoregressive model fitting, 1979.
[3] J. Rissanen et al. Modeling by shortest data description, 1978, Automatica.
[4] Solomon Kullback et al. Information Theory and Statistics, 1960.
[5] R. Shibata. Selection of the order of an autoregressive model by Akaike's information criterion, 1976.
[6] D. Fraser. Nonparametric Methods in Statistics, 1957.
[7] H. Cramér. Random Variables and Probability Distributions (Cambridge Tracts in Mathematics and Mathematical Physics, No. 36), Cambridge, 1937. Reviewed by E. Kamke, 1938.
[8] James P. Reilly et al. Statistical analysis of the performance of information theoretic criteria in the detection of the number of signals in array processing, 1989, IEEE Trans. Acoust. Speech Signal Process.
[9] G. Schwarz. Estimating the dimension of a model, 1978.
[10] M. Gutierrez et al. Estimating hidden unit number for two-layer perceptrons, 1989, International 1989 Joint Conference on Neural Networks.
[11] Solomon Kullback et al. Information Theory and Statistics, 1970, The Mathematical Gazette.
[12] S. Y. Kung et al. An algebraic projection analysis for optimal hidden units size and learning rates in back-propagation learning, 1988, IEEE 1988 International Conference on Neural Networks.