Neuromorphic features of probabilistic neural networks

We summarize the main results on probabilistic neural networks recently published in a series of papers. Within the framework of statistical pattern recognition, the class-conditional distributions are approximated by finite mixtures of product components. The probabilistic neurons correspond to the mixture components and can be interpreted in neurophysiological terms, which suggests a possible theoretical background for the functional properties of neurons. For example, the general formula for the synaptic weights provides a statistical justification of the well-known Hebbian principle of learning. Similarly, the mean effect of lateral inhibition can be expressed by means of a formula proposed by Perez as a measure of the dependence tightness of the variables involved.
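
To make the mixture model concrete, the sketch below illustrates (as an assumption for illustration, not an excerpt from the cited papers) how one class-conditional distribution can be approximated by a finite mixture of product components, here with Bernoulli factors for binary features; the function name em_product_mixture and all numerical settings are hypothetical choices for this example only. Each mixture component plays the role of a probabilistic neuron, and the EM responsibilities q(m | x) can be read as its normalized response to an input pattern.

```python
import numpy as np

# Minimal sketch (an assumption, not the authors' code): EM estimation of a
# finite mixture of product components P(x) = sum_m w_m prod_n f(x_n | theta_mn)
# with Bernoulli factors. Each component m acts as one probabilistic neuron;
# theta[m, n] is the parameter the abstract relates to synaptic weights.

def em_product_mixture(X, n_components, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    N, D = X.shape
    w = np.full(n_components, 1.0 / n_components)             # mixture weights
    theta = rng.uniform(0.25, 0.75, size=(n_components, D))   # Bernoulli parameters

    for _ in range(n_iter):
        # E-step: responsibilities q(m | x), i.e. the normalized "neuron responses"
        log_p = (X @ np.log(theta).T
                 + (1.0 - X) @ np.log(1.0 - theta).T
                 + np.log(w))
        log_p -= log_p.max(axis=1, keepdims=True)
        q = np.exp(log_p)
        q /= q.sum(axis=1, keepdims=True)

        # M-step: re-estimate mixture weights and component parameters
        Nm = q.sum(axis=0)
        w = Nm / N
        theta = (q.T @ X + 1e-6) / (Nm[:, None] + 2e-6)       # smoothed update
    return w, theta

if __name__ == "__main__":
    # Toy usage: fit a 3-component mixture to random binary data and report
    # the average log-likelihood of the fitted class-conditional model.
    rng = np.random.default_rng(1)
    X = (rng.random((200, 10)) < 0.3).astype(float)
    w, theta = em_product_mixture(X, n_components=3)
    comp_log_p = X @ np.log(theta).T + (1.0 - X) @ np.log(1.0 - theta).T
    print("mean log-likelihood:", np.log(np.exp(comp_log_p) @ w).mean())
```

In a classifier, one such mixture would be fitted per class and patterns assigned by comparing the class-conditional values weighted by class priors; the product form is what allows each component (neuron) to be evaluated from its inputs independently.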

[1] Michael I. Jordan et al., On Convergence Properties of the EM Algorithm for Gaussian Mixtures, 1996, Neural Computation.

[2] P. Deb, Finite Mixture Models, 2008.

[3] Roy L. Streit et al., Maximum likelihood training of probabilistic neural networks, 1994, IEEE Trans. Neural Networks.

[4] Pavel Pudil et al., Boosting in probabilistic neural networks, 2002, Proceedings of the 16th International Conference on Pattern Recognition.

[5] Jan T. Bialasiewicz, Statistical data reduction via construction of sample space partitions, 1970, Kybernetika.

[6] D. F. Specht et al., Probabilistic neural networks for classification, mapping, or associative memory, 1988, IEEE 1988 International Conference on Neural Networks.

[7] Josef Kittler et al., Combining Multiple Classifiers in Probabilistic Neural Networks, 2000, Multiple Classifier Systems.

[8] Jirí Grim, Maximum-likelihood design of layered neural networks, 1996, Proceedings of 13th International Conference on Pattern Recognition.

[9] Kenji Fukumizu et al., Probabilistic design of layered neural networks based on their unified framework, 1995, IEEE Trans. Neural Networks.

[10] Albert Perez, ε-admissible Simplifications of the Dependence Structure of a Set of Random Variables, 1977, Kybernetika.

[11] Pavel Pudil et al., Recognition of Handwritten Numerals by Structural Probabilistic Neural Networks, 2008.

[12] J. Grim, A Sequential Modification of EM Algorithm, 1999.

[13] Albert Perez et al., Information, ε-sufficiency and data reduction problems, 1965, Kybernetika.

[14] S. Haykin, Neural Networks: A Comprehensive Foundation, 1994.

[15] Josef Kittler et al., Information Analysis of Multiple Classifier Fusion, 2001, Multiple Classifier Systems.

[16] Pavel Pudil et al., Probabilistic neural network playing and learning Tic-Tac-Toe, 2005, Pattern Recognition Letters.

[17] Jirí Grim et al., Multivariate statistical pattern recognition with nonreduced dimensionality, 1986, Kybernetika.

[18] Design of Multilayer Neural Networks by Information Preserving Transforms, 2008.

[19] Jirí Grim et al., On numerical evaluation of maximum-likelihood estimates for finite mixtures of distributions, 1982, Kybernetika.

[20] D. Rubin et al., Maximum likelihood from incomplete data via the EM algorithm (with discussion), 1977, Journal of the Royal Statistical Society, Series B.

[21] Donald F. Specht et al., Probabilistic neural networks, 1990, Neural Networks.

[22] Pavel Pudil et al., Probabilistic Neural Network Playing a Simple Game, 2022.

[23] Igor Vajda et al., About the maximum information and maximum likelihood principles, 1998, Kybernetika.

[24] Josef Kittler et al., Multiple Classifier Fusion in Probabilistic Neural Networks, 2002, Pattern Analysis & Applications.