Noise-tolerant dendritic lattice associative memories

Linear classifiers based on computation over the real numbers with the operations of addition and multiplication, denoted (R, +, ×), have been treated extensively in the pattern recognition literature. A different approach to pattern classification uses the addition, maximum, and minimum operations over the reals in the algebra (R, +, max, min). These pattern classifiers, based on lattice algebra, have been shown to exhibit superior information storage capacity, fast training with short convergence times, high pattern classification accuracy, and low computational cost. Such attributes are not always found, for example, in classical neural networks based on the linear inner product. In a special type of lattice associative memory (LAM), called a dendritic LAM or DLAM, noise-tolerant pattern classification can be achieved by varying the design of the noise (error) acceptance bounds. This paper presents theory and algorithmic approaches for the computation of noise-tolerant lattice associative memories (LAMs) under a variety of input constraints. Of particular interest is the classification of nonergodic data in noise regimes with time-varying statistics. DLAMs, which are specializations of LAMs derived from concepts of biological neural networks, have been successfully applied to pattern classification of hyperspectral remote sensing data as well as spatial object recognition in digital imagery. The authors' recent research on the development of DLAMs is reviewed, with experimental results that show utility for a wide variety of pattern classification applications. Performance results are presented in terms of measured computational cost, noise tolerance, classification accuracy, and throughput for a variety of input data and noise levels.
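To make the (R, +, max, min) approach concrete, the sketch below builds the canonical min (W) and max (M) lattice auto-associative memories of the lattice-algebra literature and recalls a stored pattern corrupted by erosive noise. This is a minimal illustration of the general LAM technique, not the paper's DLAM with dendritic noise-acceptance bounds; the function names, the toy pattern matrix, and the noise vector are assumptions introduced for illustration only.

```python
import numpy as np

def lam_train(X):
    """Build the min (W) and max (M) lattice auto-associative memories from the
    k patterns stored as the columns of X (shape n x k):
    W[i, j] = min over patterns of (x_i - x_j), M[i, j] = max over patterns of (x_i - x_j)."""
    diffs = X[:, None, :] - X[None, :, :]          # shape (n, n, k): x_i - x_j for each pattern
    return diffs.min(axis=2), diffs.max(axis=2)    # W tolerates erosive noise, M dilative noise

def maxplus(A, x):
    """Max-plus (lattice) matrix-vector product: y_i = max_j (A[i, j] + x[j])."""
    return (A + x[None, :]).max(axis=1)

def minplus(A, x):
    """Min-plus (lattice) matrix-vector product: y_i = min_j (A[i, j] + x[j])."""
    return (A + x[None, :]).min(axis=1)

# Toy usage (made-up data): store two patterns, then recall the first one after
# erosive noise (values only decrease) hits two of its three components.
X = np.array([[4.0, 1.0],
              [2.0, 5.0],
              [7.0, 3.0]])
W, M = lam_train(X)
noisy = X[:, 0] - np.array([1.5, 1.0, 0.0])   # third component left intact
print(maxplus(W, noisy))                      # prints [4. 2. 7.]: the stored pattern is recovered
```

Recall with W uses the max-plus product and recovers eroded inputs as long as, for each output component, at least one uncorrupted input component attains the corresponding maximum; the dual memory M with the min-plus product plays the symmetric role for dilative noise.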
