An Exponential Response Neural Net

By using artificial neurons with exponential transfer functions, one can design perfect autoassociative and heteroassociative memory networks, with virtually unlimited storage capacity, for real-valued or binary-valued input and output. The autoassociative network has two layers, input and memory, with feedback between them; the exponential-response neurons sit in the memory layer. Adding an encoding layer of conventional neurons turns the network into a heteroassociator and classifier. Because for real-valued input vectors the dot product with the weight vector is no longer a measure of similarity, we also consider a neuron excitation based on Euclidean distance, and we present Lyapunov functions for both cases. The network has energy minima corresponding only to the stored prototype vectors. The exponential neurons make it simpler to build fast adaptive learning directly into classification networks that map real-valued input to an arbitrary class structure at the output.
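
As a concrete illustration of the recall dynamics described above, here is a minimal NumPy sketch, assuming bipolar prototypes stored as the rows of a weight matrix W; the function names, the exponential base a, and the width parameter beta are illustrative assumptions, not the paper's notation. The first routine implements exponential dot-product feedback recall for binary patterns; the second computes the Euclidean-distance excitation suggested for real-valued input.

    import numpy as np

    def recall_dot(W, x, a=2.0, max_iter=50):
        # Autoassociative recall for bipolar (+1/-1) patterns: each memory
        # neuron fires e_i = a**(w_i . x), and the input layer is updated
        # by feedback as sign(sum_i e_i * w_i) until a fixed point.
        x = x.copy()
        for _ in range(max_iter):
            e = a ** (W @ x)           # exponential memory-layer response
            x_new = np.sign(e @ W)     # feedback onto the input layer
            if np.array_equal(x_new, x):
                break
            x = x_new
        return x

    def excite_euclidean(W, x, beta=1.0):
        # For real-valued input the dot product is no longer a similarity
        # measure, so excite on squared Euclidean distance instead:
        # e_i = exp(-beta * ||x - w_i||^2).
        return np.exp(-beta * np.sum((W - x) ** 2, axis=1))

    # Usage: store three bipolar prototypes, then recover one from a
    # corrupted probe.
    rng = np.random.default_rng(0)
    W = rng.choice([-1.0, 1.0], size=(3, 32))
    probe = W[0].copy()
    probe[:4] *= -1.0                  # flip a few components
    print(np.array_equal(recall_dot(W, probe), W[0]))

Because the exponential sharply amplifies the best-matching prototype's response, the feedback step is dominated by that prototype, which is what drives convergence to the stored pattern rather than to spurious mixtures.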
