Regularization of hidden layer unit response for neural networks
In this paper, we looked into two issues in pattern recognition using neural networks trained by back propagation (BP), namely inefficient learning and insufficient generalization. We observed that these phenomena are partly caused by the way the hidden layer units respond to the inputs. To address these issues, we introduce regularization of the hidden layer unit response, which amounts to suppressing the correlation among the responses of the hidden layer units, together with pruning of redundant units by unit fusion. The results obtained with the proposed technique were compared with those of the conventional technique on pattern recognition problems. The experiments show that the rate of correct recognition increases when regularization of the hidden layer unit response is applied, and that the required number of training epochs also decreases.
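The sketch below illustrates the general idea described in the abstract: adding a penalty that suppresses correlation among hidden layer unit responses to the usual BP loss, and merging redundant hidden units afterwards. The network sizes, the penalty form, the weight lam, and the fuse_units helper are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch, assuming a one-hidden-layer MLP trained by back propagation.
# The decorrelation penalty and the unit-fusion step are illustrative guesses
# at the technique, not the authors' exact formulation.
import torch
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self, n_in, n_hidden, n_out):
        super().__init__()
        self.hidden = nn.Linear(n_in, n_hidden)
        self.out = nn.Linear(n_hidden, n_out)

    def forward(self, x):
        h = torch.sigmoid(self.hidden(x))   # hidden layer unit responses
        return self.out(h), h

def decorrelation_penalty(h):
    """Sum of squared off-diagonal entries of the correlation matrix of the
    hidden responses over a batch (penalizes correlated units)."""
    h_centered = h - h.mean(dim=0, keepdim=True)
    h_norm = h_centered / (h_centered.std(dim=0, keepdim=True) + 1e-8)
    corr = (h_norm.T @ h_norm) / (h.shape[0] - 1)
    off_diag = corr - torch.diag(torch.diag(corr))
    return (off_diag ** 2).sum()

def fuse_units(model, i, j):
    """Hypothetical unit fusion: fold hidden unit j into unit i by adding its
    outgoing weights, then zeroing unit j.  Units with strongly correlated
    responses would be natural candidates for fusion."""
    with torch.no_grad():
        model.out.weight[:, i] += model.out.weight[:, j]
        model.out.weight[:, j] = 0.0

# One training step on dummy data, combining the BP loss with the regularizer.
model = MLP(n_in=64, n_hidden=32, n_out=10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()
lam = 0.01  # assumed regularization weight

x = torch.randn(128, 64)
y = torch.randint(0, 10, (128,))

logits, h = model(x)
loss = criterion(logits, y) + lam * decorrelation_penalty(h)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

In this reading, the regularizer encourages hidden units to carry less redundant information, and any units that remain strongly correlated after training can be removed by fusion with little effect on the output.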