AN INFORMATION CRITERION FOR OPTIMAL NEURAL NETWORK SELECTION

Neural networks have been used to resolve a variety of classification problems. The computational properties of many of the possible network designs have been analyzed, but the decision as to which of several competing network architectures is "best" for a given problem remains subjective. A relationship between optimal network design and statistical model identification is described. A derivative of Akaike's information criterion (AIC) is given. This modification yields an information statistic which can be used to objectively select a "best" network for binary classification problems. The technique can be extended to problems with an arbitrary number of classes.
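The abstract does not reproduce the statistic itself. As background, Akaike's (1974) criterion is AIC = -2 ln(L_hat) + 2k, where L_hat is the maximized likelihood of the fitted model and k is the number of independently adjusted parameters; the model with the smallest AIC is preferred. The sketch below is a minimal, hypothetical illustration of AIC-based architecture selection for binary classification, assuming a Bernoulli (cross-entropy) likelihood over the training targets and counting every weight and bias as a free parameter. The function names and the candidate-selection loop are illustrative and are not the paper's own implementation.

```python
import numpy as np

def aic_binary(y_true, y_prob, n_params):
    """AIC = -2 * ln(maximized likelihood) + 2 * (free parameters).

    Assumes a Bernoulli likelihood: each binary training target in {0, 1}
    is modeled by the trained network's output probability y_prob.
    """
    eps = 1e-12  # guard against log(0)
    y_prob = np.clip(y_prob, eps, 1.0 - eps)
    log_likelihood = np.sum(
        y_true * np.log(y_prob) + (1.0 - y_true) * np.log(1.0 - y_prob)
    )
    return -2.0 * log_likelihood + 2.0 * n_params

def count_parameters(layer_sizes):
    """Weights plus biases of a fully connected feedforward network,
    e.g. layer_sizes = (4, 3, 1) -> 4*3 + 3 + 3*1 + 1 = 19."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]))

# Illustrative selection: the candidate architecture with the lowest AIC
# is preferred.  Here the trained networks' output probabilities on the
# training set are replaced by stand-in random arrays.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    targets = rng.integers(0, 2, size=100).astype(float)
    candidates = {
        (4, 2, 1): rng.uniform(0.05, 0.95, size=100),
        (4, 5, 1): rng.uniform(0.05, 0.95, size=100),
    }
    scores = {arch: aic_binary(targets, probs, count_parameters(arch))
              for arch, probs in candidates.items()}
    best = min(scores, key=scores.get)
    print(scores, "-> selected architecture:", best)
```

The penalty term 2k is what lets the criterion trade off goodness of fit against network size: a larger hidden layer is selected only if the improvement in log-likelihood outweighs the cost of its additional weights.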

[1] M. Gutierrez et al., "Estimating hidden unit number for two-layer perceptrons," International 1989 Joint Conference on Neural Networks, 1989.

[2] H. Akaike, "A new look at the statistical model identification," 1974.

[3] S. Y. Kung et al., "An algebraic projection analysis for optimal hidden units size and learning rates in back-propagation learning," IEEE 1988 International Conference on Neural Networks, 1988.