Fault tolerance of pruned multilayer networks
Techniques for dynamically reducing the size of a neural network during learning have been found by some investigators to speed up learning convergence and improve network generalization. However, this raises concern about the fault sensitivity of the pruned network relative to that of its parent. The authors assess the tolerance of multilayer feedforward networks to the zeroing of individual weights, and determine whether pruning the network during learning affects this tolerance. Multilayer networks having a single input and a single output were trained to produce the sine of the input value on the interval (−π, π). Identical networks with identical initial weights were then trained using the skeletonization technique of Mozer and Smolensky (1989). Each weight in these networks was zeroed in turn, and the effect on the RMS approximation error was noted. Surprisingly, the unpruned networks, which had considerably more free parameters, were found to be no more tolerant to weight zeroing than the pruned networks, and maintaining a separate relevance estimate for each node was found to be unnecessary.
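The fault-sensitivity measurement described in the abstract amounts to a simple sweep: zero one weight, re-evaluate the RMS error of the sine approximation, restore the weight, and repeat for every weight. Below is a minimal sketch of that procedure in NumPy. The network size, the tanh hidden units, and the helper names (`forward`, `rms_error`) are illustrative assumptions rather than the authors' implementation, and the random weights stand in for a trained network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical trained 1-H-1 network approximating sin(x) on (-pi, pi);
# random values here stand in for weights produced by training.
H = 10
W1, b1 = rng.normal(size=(H, 1)), rng.normal(size=H)
W2, b2 = rng.normal(size=(1, H)), rng.normal(size=1)

def forward(x, W1, b1, W2, b2):
    """Single-hidden-layer feedforward pass with tanh hidden units."""
    h = np.tanh(x[:, None] @ W1.T + b1)   # hidden activations, shape (N, H)
    return (h @ W2.T + b2).ravel()        # linear output, shape (N,)

def rms_error(W1, b1, W2, b2, n=200):
    """RMS approximation error against sin(x) sampled on (-pi, pi)."""
    x = np.linspace(-np.pi, np.pi, n)
    y = forward(x, W1, b1, W2, b2)
    return np.sqrt(np.mean((y - np.sin(x)) ** 2))

baseline = rms_error(W1, b1, W2, b2)

# Zero each weight in turn, record the increase in RMS error, then restore it.
sensitivities = []
for W in (W1, W2):
    for idx in np.ndindex(W.shape):
        saved = W[idx]
        W[idx] = 0.0                      # simulate a stuck-at-zero weight fault
        sensitivities.append(rms_error(W1, b1, W2, b2) - baseline)
        W[idx] = saved                    # restore before testing the next weight

print(f"baseline RMS: {baseline:.4f}, worst single-fault impact: {max(sensitivities):.4f}")
```

Running the same sweep on a pruned and an unpruned network trained from identical initial weights, and comparing the resulting sensitivity distributions, reproduces the structure of the comparison the abstract reports.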
[1] M. C. Mozer and P. Smolensky, "Skeletonization: A Technique for Trimming the Fat from a Network via Relevance Assessment," NIPS, 1988.
[2] M. J. Carter et al., "Operational Fault Tolerance of CMAC Networks," NIPS, 1989.
[3] Y. LeCun et al., "Optimal Brain Damage," NIPS, 1989.