Regularization effect of weight initialization in back propagation networks
Complexity control of a learning method is critical for obtaining good generalization from finite training data. We discuss complexity control in multilayer perceptron (MLP) networks trained via backpropagation. For such networks, the number of hidden units and/or network weights is usually used as the complexity parameter. However, backpropagation training introduces additional mechanisms for complexity control. These mechanisms are implicit in the implementation of the optimization procedure and cannot be easily quantified (in contrast to the number of weights or hidden units). We suggest using the framework of statistical learning theory to explain this phenomenon and, within this framework, demonstrate the effect of weight initialization on complexity control in MLP networks.
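The following is a minimal sketch, not the authors' code, of the kind of experiment the abstract describes: a single-hidden-layer MLP trained by plain backpropagation, where only the scale of the random weight initialization is varied. The toy regression task, network size, learning rate, and number of epochs are all assumptions chosen for illustration; the point is only that, with everything else fixed, a larger initialization scale tends to yield a more flexible (higher-complexity) fitted function.

```python
# Hypothetical illustration: weight-initialization scale as implicit complexity control
# in a backprop-trained MLP. All hyperparameters below are assumptions for this sketch.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, noise=0.2):
    """Noisy sine-wave regression problem (assumed toy task)."""
    x = rng.uniform(-3, 3, size=(n, 1))
    y = np.sin(x) + noise * rng.normal(size=(n, 1))
    return x, y

def train_mlp(x, y, hidden=50, init_scale=0.1, lr=0.01, epochs=2000):
    """Single-hidden-layer tanh MLP trained by batch gradient descent on squared error."""
    W1 = init_scale * rng.normal(size=(1, hidden))
    b1 = np.zeros(hidden)
    W2 = init_scale * rng.normal(size=(hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x @ W1 + b1)            # forward pass
        pred = h @ W2 + b2
        err = pred - y
        # backward pass (gradients of mean squared error)
        gW2 = h.T @ err / len(x)
        gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1 - h ** 2)    # backprop through tanh
        gW1 = x.T @ dh / len(x)
        gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda xs: np.tanh(xs @ W1 + b1) @ W2 + b2

x_tr, y_tr = make_data(30)
x_te, y_te = make_data(500)
for scale in (0.01, 0.1, 1.0, 5.0):
    f = train_mlp(x_tr, y_tr, init_scale=scale)
    mse = lambda xs, ys: float(np.mean((f(xs) - ys) ** 2))
    print(f"init_scale={scale:<4}  train MSE={mse(x_tr, y_tr):.4f}  test MSE={mse(x_te, y_te):.4f}")
```

Under these assumed settings, small initial weights keep the hidden units near their linear regime for a fixed training budget, so the fitted function stays smooth, whereas large initial weights typically allow the network to track the noise in the small training set, which shows up as a widening gap between training and test error.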