High capacity pattern recognition associative processors

We distinguish between many:1 (distortion-invariant) and 1:1 (large-class) pattern recognition associative processors: in the first, many different input keys are associated with the same output recollection vector; in the second, each key is associated with a different recollection vector. We compare a variety of associative processor synthesis algorithms and show that one can store M vector pairs (where M > N, and N is the dimension of the keys) in fewer memory elements than standard digital storage requires, handle linearly dependent key vectors, and achieve robust noise and quantization performance by design. We show that new recollection vector encoding techniques must be employed to improve storage density; otherwise the standard direct-storage nearest-neighbor processor is preferable. We find Ho-Kashyap associative processors and L-max recollection vector encoding to be preferable, and we suggest new and more appropriate performance measures for associative processors.
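
The synthesis-and-recall structure shared by the processors compared here can be illustrated with a small sketch. The code below is an illustrative example, not the paper's exact algorithm: it synthesizes a heteroassociative memory matrix from a key matrix K and a recollection matrix R by least squares (Moore-Penrose pseudoinverse, which also accommodates linearly dependent keys), and recalls by keeping the L largest outputs as a simple stand-in for an L-max recollection vector encoding. The function names and the specific dimensions are assumptions for the example only.

```python
import numpy as np

def synthesize_pseudoinverse(K, R):
    """Heteroassociative memory matrix M such that M @ K approximates R.

    K: N x M matrix whose columns are the key vectors.
    R: P x M matrix whose columns are the associated recollection vectors.
    The Moore-Penrose pseudoinverse gives the least-squares solution and
    remains defined when the key vectors are linearly dependent.
    """
    return R @ np.linalg.pinv(K)

def recall_lmax(M, key, L):
    """Recall by setting the L largest memory outputs to 1 (L-max style code)."""
    out = M @ key
    code = np.zeros_like(out)
    code[np.argsort(out)[-L:]] = 1.0
    return code

# Small example: 6-dimensional keys, 8-element recollection codes with L = 2.
rng = np.random.default_rng(0)
K = rng.standard_normal((6, 4))                 # 4 stored key vectors
R = np.zeros((8, 4))
for m in range(4):                              # assign each pair a 2-of-8 code
    R[rng.choice(8, size=2, replace=False), m] = 1.0

M = synthesize_pseudoinverse(K, R)
print(recall_lmax(M, K[:, 1], L=2))             # recalled code
print(R[:, 1])                                  # stored code for comparison
```

Because each recollection vector here carries an L-of-P code rather than a one-of-M unit vector, the number of output lines grows much more slowly than the number of stored pairs, which is the kind of encoding gain the abstract refers to; Ho-Kashyap synthesis would replace the single pseudoinverse solve with an iterative margin-adjusting procedure.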
