Connection pruning with static and adaptive pruning schedules
[1] Martin A. Riedmiller, et al. A direct adaptive method for faster backpropagation learning: the RPROP algorithm, 1993, IEEE International Conference on Neural Networks.
[2] David E. Rumelhart, et al. Generalization by Weight-Elimination with Application to Forecasting, 1990, NIPS.
[3] Yann LeCun, et al. Optimal Brain Damage, 1989, NIPS.
[4] Gregory J. Wolff, et al. Optimal Brain Surgeon: Extensions and performance comparisons, 1993, NIPS.
[5] Hervé Bourlard, et al. Generalization and Parameter Estimation in Feedforward Nets: Some Experiments, 1989, NIPS.
[6] Michael I. Jordan, et al. Advances in Neural Information Processing Systems, 1995.
[7] Geoffrey E. Hinton, et al. Simplifying Neural Networks by Soft Weight-Sharing, 1992, Neural Computation.
[8] Ferdinand Hergert, et al. Improving model selection by nonconvergent methods, 1993, Neural Networks.
[9] Lutz Prechelt, et al. PROBEN 1 - a set of benchmarks and benchmarking rules for neural network training algorithms, 1994.
[10] Christian Lebiere, et al. The Cascade-Correlation Learning Architecture, 1989, NIPS.
[11] Lutz Prechelt, et al. A quantitative study of experimental evaluations of neural network learning algorithms: Current research practice, 1996, Neural Networks.
[12] Peter M. Williams, et al. Bayesian Regularization and Pruning Using a Laplace Prior, 1995, Neural Computation.
[13] Scott E. Fahlman, et al. An empirical study of learning speed in back-propagation networks, 1988.
[14] Babak Hassibi, et al. Second Order Derivatives for Network Pruning: Optimal Brain Surgeon, 1992, NIPS.
[15] Elie Bienenstock, et al. Neural Networks and the Bias/Variance Dilemma, 1992, Neural Computation.