Improving the performance of probabilistic neural networks

A methodology is presented for selecting appropriate widths, or covariance matrices, of the Gaussian kernels in probabilistic neural network (PNN) classifiers. The Gram-Schmidt orthogonalization process is employed to find these matrices. The proposed technique is shown to improve the generalization ability of PNN classifiers over the standard fixed-width approach. The result can also be applied to other Gaussian-based classifiers, such as radial basis function networks.
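As a minimal sketch of the setting the abstract describes, the Python code below implements a standard spherical-kernel PNN (the baseline being improved upon) alongside a classical Gram-Schmidt orthonormalization routine of the kind the paper employs. The width `sigma`, the function names, and the toy data are illustrative assumptions; the paper's actual construction of per-pattern covariance matrices from the orthogonalized directions is not reproduced here.

```python
import numpy as np


def gram_schmidt(V):
    """Orthonormalize the rows of V via classical Gram-Schmidt.

    The paper uses this process to derive covariance matrices for the
    Gaussian kernels; here it is shown only as a standalone routine.
    """
    basis = []
    for v in V:
        # Subtract projections onto the basis vectors found so far.
        w = v - sum(np.dot(v, b) * b for b in basis)
        if np.linalg.norm(w) > 1e-10:      # skip linearly dependent rows
            basis.append(w / np.linalg.norm(w))
    return np.array(basis)


def pnn_classify(X_train, y_train, x, sigma=0.5):
    """Classify a single point x with a basic PNN.

    Every training pattern contributes a spherical Gaussian kernel of
    width sigma; the class with the largest summed kernel response
    (a Parzen density estimate) wins. The paper's contribution is to
    replace the single scalar sigma with per-pattern covariance
    matrices obtained via Gram-Schmidt orthogonalization.
    """
    scores = {}
    for c in np.unique(y_train):
        P = X_train[y_train == c]               # patterns of class c
        d2 = np.sum((P - x) ** 2, axis=1)       # squared distances to x
        scores[c] = np.mean(np.exp(-d2 / (2.0 * sigma ** 2)))
    return max(scores, key=scores.get)


# Toy usage: two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print(pnn_classify(X, y, np.array([3.5, 3.8])))  # expected: 1
```

Note the key design point: the baseline kernel shape is identical for every training pattern, so a single poorly chosen `sigma` degrades the whole density estimate; letting each kernel's covariance adapt to the local geometry of the data is what the proposed method addresses.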
