The recent wave of interest in the cellular structures known as neural networks is due, in large part, to the advent of a new model [1], which turned out to be amenable to analysis with the tools of statistical mechanics [2]. In a surprisingly short time, the field has moved far beyond the initial model; in the present volume, H. Gutfreund presents a review of the basic concepts and of the most recent developments in this very fast-growing field. Apart from the theoretical interest of modeling the brain, or at least some of its functions, there is also considerable interest from the point of view of applications to data processing. The present paper will focus essentially on this latter aspect of artificial neural networks. Associative memory is the basic function that these systems can perform; we shall therefore present a general discussion of the concepts related to associative memory, as applied to pattern recognition and error correction, illustrate them with various examples, and discuss the basic issues in this context. We shall also mention recent developments in the storage and retrieval of sequences of pieces of information.
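To make the notions of associative memory and error correction concrete, the following is a minimal numerical sketch of Hebbian (outer-product) storage and threshold retrieval in a Hopfield-type network of the kind introduced in [1]; the pattern size, the number of stored patterns and the synchronous update schedule are illustrative assumptions, not choices prescribed by the discussion above.

import numpy as np

def store(patterns):
    # Hebbian (outer-product) rule: W_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with zero diagonal.
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def retrieve(w, probe, max_steps=20):
    # Deterministic threshold dynamics, s <- sign(W s), iterated until a fixed point is reached.
    s = probe.copy()
    for _ in range(max_steps):
        s_new = np.where(w @ s >= 0, 1, -1)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 64))     # three random 64-neuron patterns to be stored
w = store(patterns)

noisy = patterns[0].copy()
flipped = rng.choice(64, size=8, replace=False)  # corrupt the first pattern on 8 of its 64 bits
noisy[flipped] *= -1

recovered = retrieve(w, noisy)
print("overlap with the stored pattern:", patterns[0] @ recovered / 64)

With few stored patterns relative to the number of neurons, the corrupted probe typically relaxes back onto the stored pattern (overlap close to 1), which is precisely the error-correction behaviour referred to above.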
[1] J. J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," Proceedings of the National Academy of Sciences of the United States of America, 1982.
[2] I. Guyon et al., "Information storage and retrieval in spin-glass like neural networks," 1985.
[3] L. Personnaz et al., "Collective computational properties of neural networks: New learning mechanisms," Physical Review A, 1986.
[4] W. Hubbard et al., "Building a hierarchy with neural networks: an example-image vector quantization," Applied Optics, 1987.
[5] I. Morgenstern et al., Heidelberg Colloquium on Glassy Dynamics, 1987.
[6] Sompolinsky et al., "Information storage in neural networks with low levels of activity," Physical Review A, 1987.