Classification, Association and Pattern Completion using Neural Similarity Based Methods

A framework for Similarity-Based Methods (SBMs) includes many classification models as special cases: neural networks of the Radial Basis Function (RBF) type, Feature Space Mapping neurofuzzy networks based on separable transfer functions, Learning Vector Quantization, variants of the k-nearest-neighbor method, and several new models that may be presented in network form. Multilayer Perceptrons (MLPs) use scalar products to compute the weighted activations of neurons, combining soft hyperplanes to provide decision borders. Distance-based multilayer perceptrons (D-MLPs) instead evaluate the similarity of inputs to weights, offering a natural generalization of standard MLPs. A cluster-based initialization procedure that determines the network architecture and the values of all adaptive parameters is described. Networks implementing SBM methods are useful not only for classification and approximation but also as associative memories in problems requiring pattern completion, offering an efficient way to deal with missing values. Non-Euclidean distance functions may also be introduced by normalizing the input vectors in an extended feature space. Both approaches dramatically change the shapes of the decision borders. An illustrative example showing these changes is provided.
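The Python sketch below is a minimal illustration of the two ideas summarized above: replacing the scalar-product activation of an MLP neuron with a distance-based one, and normalizing inputs in an extended feature space so that scalar products become functions of distances; a small routine also illustrates pattern completion over stored prototypes. The function names (mlp_activation, dmlp_activation, extend_and_normalize, complete_pattern), the sigmoidal transfer function, and the Euclidean default are illustrative assumptions, not the implementation used in the paper.

import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def mlp_activation(x, w, bias):
    # Standard MLP neuron: sigmoidal output of a scalar-product activation.
    return sigmoid(np.dot(w, x) + bias)


def dmlp_activation(x, w, bias, dist=None):
    # D-MLP-style neuron (illustrative): the scalar product is replaced by a
    # distance between the input and the weight vector, Euclidean by default.
    if dist is None:
        dist = lambda a, b: np.linalg.norm(a - b)
    return sigmoid(bias - dist(x, w))


def extend_and_normalize(X, R=None):
    # Embed inputs in an extended feature space by adding one coordinate and
    # renormalizing every vector to a common norm R.  For ||x|| = ||w|| = R the
    # scalar product becomes a function of the Euclidean distance,
    # ||x - w||^2 = 2 * (R^2 - x.w), so dot-product neurons behave like
    # distance-based ones.
    norms = np.linalg.norm(X, axis=1)
    if R is None:
        R = norms.max()  # any R >= max ||x|| keeps the added coordinate real
    extra = np.sqrt(np.maximum(R ** 2 - norms ** 2, 0.0))
    return np.hstack([X, extra[:, None]])


def complete_pattern(x_partial, prototypes):
    # Associative-memory-style pattern completion: compare the observed
    # features to stored prototypes and fill the missing entries (NaNs)
    # from the most similar prototype.
    observed = ~np.isnan(x_partial)
    d = np.linalg.norm(prototypes[:, observed] - x_partial[observed], axis=1)
    return np.where(observed, x_partial, prototypes[np.argmin(d)])

In this sketch, training a dot-product network on extend_and_normalize(training_data) makes it act like a distance-based classifier, while other (non-Euclidean) metrics can be explored by passing a different dist function to dmlp_activation.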
