The effect of weight precision and range on neural network classifier performance

Abstract

This study investigates the precision and range requirements for weights in feed-forward neural network classifiers trained with backpropagation, using simulated signals whose difficulty we could control. We found that a precision of five bits and a range equal to four times the mean weight magnitude produced the same performance as continuous weights. However, the required precision and range depended strongly on the difficulty of the problem, the complexity of the network, and the interaction between the two. We also found that uniformly distributed discrete weight levels produced better performance than non-uniformly distributed ones.
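The paper presents no code; the following minimal NumPy sketch illustrates the kind of uniform weight quantization the abstract describes. The function name quantize_weights and its parameters are our own illustration under stated assumptions, not the authors' implementation: a symmetric range of four times the mean weight magnitude is divided into 2^5 = 32 uniformly spaced levels, and each weight is clipped to the range and snapped to the nearest level.

    import numpy as np

    def quantize_weights(w, bits=5, range_mult=4.0):
        # Hypothetical sketch: uniform quantization over a symmetric range
        # equal to range_mult times the mean weight magnitude.
        limit = range_mult * np.mean(np.abs(w))     # range = 4 * mean |w|
        levels = 2 ** bits                          # 5 bits -> 32 discrete levels
        step = 2.0 * limit / (levels - 1)           # uniform spacing over [-limit, +limit]
        w_clipped = np.clip(w, -limit, limit)       # saturate out-of-range weights
        idx = np.round((w_clipped + limit) / step)  # nearest level index, 0 .. levels-1
        return idx * step - limit                   # map indices back to weight values

Applying such a function to a trained network's weights and re-measuring classification accuracy would mimic the experiment the abstract summarizes: at five bits with this range, performance matched the continuous-weight baseline.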
