Storing Object-Dependent Sparse Codes in a Willshaw Associative Network

Willshaw networks are single-layer neural networks that store associations between binary vectors. Because they use only binary weights, they can be implemented efficiently, store large numbers of patterns, and recover those patterns fault-tolerantly from noisy cues. However, this holds only when the stored codes are sparse and randomly distributed. In this letter, we use a recently proposed approach that maps visual patterns into informative binary features. With it, we transform MNIST handwritten digits into well-distributed sparse codes, which we then store autoassociatively in a Willshaw network. Experiments with both noisy and noiseless cues show only a slight loss of the recovered patterns' relevant information. More specifically, retrieval remained possible after storing several times as many patterns as the network has units, while preserving the information about the class to which each pattern belongs.
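To make the storage and retrieval steps concrete, below is a minimal sketch of a Willshaw autoassociative memory with clipped Hebbian (binary OR) learning and one-step threshold retrieval. It uses randomly generated sparse codes and illustrative parameters (n, k, m); it does not reproduce the paper's binary feature mapping of MNIST digits or its exact experimental setup.

```python
# Minimal Willshaw autoassociative memory sketch (NumPy), assuming random
# sparse binary codes rather than the paper's object-dependent codes.
import numpy as np

rng = np.random.default_rng(0)

n = 256   # number of units (code length) -- illustrative value
k = 8     # active bits per pattern (sparsity) -- illustrative value
m = 500   # number of stored patterns -- illustrative value

# Generate sparse random binary patterns, one per row.
patterns = np.zeros((m, n), dtype=np.uint8)
for row in patterns:
    row[rng.choice(n, size=k, replace=False)] = 1

# Storage: binary weights formed by OR-ing the outer product of each pattern
# with itself (clipped Hebbian learning).
W = np.zeros((n, n), dtype=np.uint8)
for p in patterns:
    W |= np.outer(p, p)

def retrieve(cue, theta=None):
    """One-step retrieval: threshold the dendritic sums at the cue activity."""
    if theta is None:
        theta = int(cue.sum())  # Willshaw threshold = number of active cue bits
    sums = W.astype(np.int32) @ cue.astype(np.int32)
    return (sums >= theta).astype(np.uint8)

# Noisy cue: drop half of the active bits of a stored pattern.
target = patterns[0]
cue = target.copy()
active = np.flatnonzero(cue)
cue[rng.choice(active, size=k // 2, replace=False)] = 0

out = retrieve(cue)
print("recovered target bits:", int(np.sum((out == 1) & (target == 1))), "of", k)
print("spurious bits:", int(np.sum((out == 1) & (target == 0))))
```

With sparse, well-distributed codes the thresholded sums recover most of the target's active units while adding few spurious ones; as the memory is filled well beyond its number of units, spurious activations grow, which is the regime the experiments in this letter probe.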
