Sensitivity analysis of single hidden-layer neural networks with threshold functions

An important consideration when applying neural networks to pattern recognition is their sensitivity to weight perturbation or to input errors. In this paper, we analyze the sensitivity of single hidden-layer networks with threshold functions. In the case of weight perturbation or input errors, the probability of inversion error for an output neuron is derived as a function of the trained weights, the input pattern, and the variance of the weight perturbation or the bit error probability of the input pattern. The derived results are verified by simulation of a Madaline recognizing handwritten digits. The results show that the sensitivity of trained networks differs markedly from that of networks with random weights.
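To illustrate the kind of quantity derived in the paper, the following is a minimal sketch for a single sign-activation neuron, assuming i.i.d. Gaussian weight perturbations of variance σ². Under that assumption the perturbed weighted sum is Gaussian, so the inversion probability reduces to the tail probability Q(|w·x| / (σ‖x‖)). The weights, input pattern, and variance below are hypothetical toy values, not taken from the paper:

```python
import numpy as np
from math import erf, sqrt

def inversion_probability(w, x, sigma):
    """Analytic probability that i.i.d. N(0, sigma^2) perturbations of the
    weights flip the sign of a threshold (sign-activation) neuron's output."""
    s = float(np.dot(w, x))                    # unperturbed weighted sum
    scale = sigma * float(np.linalg.norm(x))   # std. dev. of the sum's shift
    z = abs(s) / scale
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))    # Gaussian tail Q(z)

# Monte Carlo check of the closed form on toy values.
rng = np.random.default_rng(0)
w = np.array([0.5, -0.3, 0.8])      # hypothetical trained weights
x = np.array([1.0, 1.0, -1.0])      # hypothetical bipolar input pattern
sigma = 0.5                         # std. dev. of weight perturbation
n = 200_000
dw = rng.normal(0.0, sigma, size=(n, w.size))
s0 = np.sign(w @ x)
p_mc = float(np.mean(np.sign((w + dw) @ x) != s0))
p_analytic = inversion_probability(w, x, sigma)
```

For these values the analytic probability is about 0.24, and the Monte Carlo estimate agrees to within sampling error; the paper's contribution is extending this kind of per-neuron analysis through a trained hidden layer.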
