G-PNN: A genetically engineered probabilistic neural network

Abstract The probabilistic neural network (PNN) is a neural network architecture that approximates the functionality of the Bayesian classifier, the optimal classifier. Designing the optimal Bayesian classifier is infeasible in practice because the class-conditional distributions of the data are unknown; the PNN approximates it by estimating these distributions with the Parzen window approach. One criticism of the PNN classifier is that it can require a large amount of training data for its design, since the network retains training examples as kernel centers. Furthermore, the PNN requires the user to specify certain network parameters, called the smoothing (spread) parameters, in order to approximate the class-conditional distributions, which is not an easy task. A number of approaches have been reported in the literature for addressing each of these issues separately, that is, reducing the amount of training data needed to build the PNN model and finding good values for the smoothing parameters. In this work, genetic algorithms are used to achieve both goals at once, and some promising results are reported.
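To make the idea concrete, the following Python sketch (an illustrative assumption, not the authors' implementation) builds a Parzen-window PNN whose decision rule scores each class by summing Gaussian kernels centered on that class's stored examples, and wraps it in a toy genetic algorithm whose chromosome is a boolean mask over the training points (which examples to keep as kernel centers) together with a shared log-spread gene. The fitness function, size penalty, population size, and mutation rates are placeholder choices.

import numpy as np

rng = np.random.default_rng(0)

def pnn_predict(centers_X, centers_y, sigma, X):
    """Parzen-window PNN decision: score each class by the sum of Gaussian
    kernels placed on that class's stored centers and take the argmax
    (equivalent to prior-weighted class-conditional density estimates)."""
    classes = np.unique(centers_y)
    preds = []
    for x in X:
        k = np.exp(-np.sum((centers_X - x) ** 2, axis=1) / (2.0 * sigma ** 2))
        preds.append(classes[np.argmax([k[centers_y == c].sum() for c in classes])])
    return np.array(preds)

def fitness(mask, sigma, X_tr, y_tr, X_val, y_val):
    """Validation accuracy of the PNN built from the selected centers,
    minus a small (placeholder) penalty on the fraction of points retained."""
    if mask.sum() == 0:
        return 0.0
    acc = np.mean(pnn_predict(X_tr[mask], y_tr[mask], sigma, X_val) == y_val)
    return acc - 0.05 * mask.mean()

def evolve(X_tr, y_tr, X_val, y_val, pop=30, gens=40):
    """Toy GA: each chromosome is a boolean center mask plus a log-spread gene."""
    n = len(X_tr)
    masks = rng.random((pop, n)) < 0.3          # start with roughly 30% of centers kept
    log_sigmas = rng.uniform(-2, 1, size=pop)   # spread roughly in [0.14, 2.7]
    for _ in range(gens):
        fits = np.array([fitness(m, np.exp(s), X_tr, y_tr, X_val, y_val)
                         for m, s in zip(masks, log_sigmas)])
        def pick():
            # Binary tournament selection.
            i, j = rng.integers(0, pop, 2)
            return i if fits[i] >= fits[j] else j
        new_masks, new_sigmas = [], []
        for _ in range(pop):
            a, b = pick(), pick()
            cross = rng.random(n) < 0.5                  # uniform crossover of the masks
            child = np.where(cross, masks[a], masks[b])
            child ^= rng.random(n) < 0.01                # bit-flip mutation
            new_masks.append(child)
            new_sigmas.append((log_sigmas[a] + log_sigmas[b]) / 2
                              + rng.normal(0, 0.1))      # blend the spread genes, add jitter
        masks, log_sigmas = np.array(new_masks), np.array(new_sigmas)
    best = np.argmax([fitness(m, np.exp(s), X_tr, y_tr, X_val, y_val)
                      for m, s in zip(masks, log_sigmas)])
    return masks[best], np.exp(log_sigmas[best])

The returned mask and spread define a pruned PNN; the paper's actual GA may use a different chromosome encoding, fitness measure, and per-class or per-feature spreads.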
