Probabilistic Neural Networks with Rotated Kernel Functions

To automatically determine the “smoothing parameter” and to enhance the generalization ability of the standard Probabilistic Neural Network (PNN), a method is proposed for constructing the covariance matrix of the Gaussian kernel function of each training pattern. Based on minimization of the local error, the constant-potential surface of the Gaussian function yields two matrices, a rotation matrix and a matrix of variances, which are combined to form the desired covariance matrix. The new approach was applied to the two-spiral problem, where training was performed on a reduced pattern set. Its effectiveness is demonstrated by comparing the standard PNN and the new model on generalization to the entire set.
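The idea sketched in the abstract can be illustrated in a few lines. The snippet below is a minimal, hypothetical sketch (not the authors' implementation): each training pattern's covariance is built as Σ = R·diag(σ₁², σ₂²)·Rᵀ from a rotation matrix and a matrix of variances, and classification follows the standard PNN/Parzen scheme of summing per-class Gaussian kernels and taking the class with the largest density. The 2-D rotation angle and variance values are illustrative assumptions; the paper derives them from local-error minimization.

```python
import numpy as np

def rotation_matrix(theta):
    # 2-D rotation matrix; the paper's construction generalizes this idea
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def covariance(theta, variances):
    # Sigma = R * diag(sigma_1^2, sigma_2^2) * R^T: axis-aligned variances
    # rotated so the kernel's constant-potential ellipse is oriented by theta
    R = rotation_matrix(theta)
    return R @ np.diag(variances) @ R.T

def pnn_classify(x, patterns, labels, covs):
    # Parzen-style class density: average of Gaussian kernels centered on
    # each class's training patterns, each with its own covariance matrix
    scores = {}
    for c in set(labels):
        idx = [i for i, l in enumerate(labels) if l == c]
        density = 0.0
        for i in idx:
            d = x - patterns[i]
            inv = np.linalg.inv(covs[i])
            norm = 1.0 / np.sqrt((2 * np.pi) ** len(x) * np.linalg.det(covs[i]))
            density += norm * np.exp(-0.5 * d @ inv @ d)
        scores[c] = density / len(idx)
    # Decision: class with the highest estimated density at x
    return max(scores, key=scores.get)
```

With one rotated covariance per pattern (here shared for brevity), a query point is assigned to whichever class's kernel mixture evaluates highest at that point; the standard PNN is recovered when every covariance is the same isotropic matrix σ²I.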
