Silicon Integration of Learning Algorithms and Other Auto-Adaptive Properties in a Digital Feedback Neural Network

In the past few years, much effort has been devoted to the silicon integration of neural networks [1,2], with most of the emphasis on the implementation of the network itself, leaving the burden of learning to a host, possibly parallel, computer [3]. However, implementing training on the chip itself is attractive for two reasons: (i) the learning phase is usually very time-consuming; (ii) on-chip learning makes the network more autonomous and opens the way to building elaborate assemblies of networks. The present paper discusses the capabilities of a fully connected feedback neural network chip, using binary neurons with parallel synchronous dynamics and intended to be used as an associative memory [4]. The chip integrates a learning algorithm together with additional, potentially useful features such as the self-identification of correct relaxation onto a stored vector (a prototype). The main silicon implementation issues are also discussed.
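The associative-memory principle described above can be illustrated in software. The sketch below is a minimal, hypothetical model only, not the chip's actual learning algorithm: it assumes a simple Hebbian storage rule, sign-threshold binary (±1) neurons updated in parallel (synchronous dynamics), and a fixed-point check that plays the role of the chip's self-identification of correct relaxation on a prototype.

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian storage: W = (1/N) * sum of outer products, zero diagonal.
    (A sketch only; the chip's on-board learning rule may differ.)"""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)          # no self-connections
    return w

def relax(w, state, max_steps=50):
    """Parallel synchronous dynamics: all neurons update at once.
    Returns the final state and a flag telling whether a fixed point
    was reached (the software analogue of self-identified relaxation)."""
    s = state.copy()
    for _ in range(max_steps):
        nxt = np.where(w @ s >= 0, 1, -1)
        if np.array_equal(nxt, s):    # fixed point: relaxation complete
            return s, True
        s = nxt
    return s, False                   # no fixed point within max_steps

rng = np.random.default_rng(0)
prototypes = rng.choice([-1, 1], size=(3, 64))  # 3 stored vectors, 64 neurons
w = train_hebbian(prototypes)

probe = prototypes[0].copy()
probe[:6] *= -1                        # corrupt 6 of 64 bits
recalled, stable = relax(w, probe)
print("stabilized:", stable)
```

With few stored patterns relative to the number of neurons, relaxation from a moderately corrupted probe typically settles on the nearest prototype; synchronous dynamics can in general also fall into short cycles, which is why the fixed-point flag is reported rather than assumed.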

[1] K. Goser et al., "VLSI-Design of Associative Networks," 1989.

[2] S. Y. Kung et al., "Parallel architectures for artificial neural nets," IEEE International Conference on Neural Networks, 1988.

[3] R. S. Withers et al., "An artificial neural network integrated circuit based on MNOS/CCD principles," 1987.

[4] G. Dreyfus et al., "Single-layer learning revisited: a stepwise procedure for building and training a neural network," NATO Neurocomputing, 1989.

[5] S. Tam et al., "An electrically trainable artificial neural network (ETANN) with 10240 'floating gate' synapses," 1989 International Joint Conference on Neural Networks, 1990.

[6] A. F. Murray et al., "Bit-Serial Neural Networks," NIPS, 1987.

[7] L. Personnaz et al., "Collective computational properties of neural networks: New learning mechanisms," Physical Review A, 1986.

[8] R. B. Allen et al., "Stochastic Learning Networks and their Electronic Implementation," NIPS, 1987.

[9] P. Peretto et al., "Stochastic Dynamics of Neural Networks," IEEE Transactions on Systems, Man, and Cybernetics, 1986.

[10] C. A. Mead et al., "A novel associative memory implemented using collective computation," 1990.

[11] Sompolinsky et al., "Neural networks with nonlinear synapses and a static noise," Physical Review A, 1986.

[12] D. Anastassiou, "Switched-capacitor neural networks," 1987.

[13] D. B. Schwartz et al., "Dynamics of microfabricated electronic neural networks," 1987.

[14] I. Guyon et al., "Engineering applications of spin glass concepts," 1987.

[15] G. W. Bruyn, review of "Neural networks: Biological computers or electronic brains," R. Moreau et al. (eds.), Les Entretiens de Lyon, Springer-Verlag, 1990, in Journal of the Neurological Sciences, 1991.

[16] Opper, "Learning of correlated patterns in spin-glass networks by local learning rules," Physical Review Letters, 1987.

[17] M. Weinfeld, "Integrated artificial neural networks: components for higher level architectures with new properties," NATO Neurocomputing, 1989.

[18] J. J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," Proceedings of the National Academy of Sciences of the United States of America, 1982.