Hebbian plasticity in MOS synapses

Hebbian learning in analogue CMOS synapses is achieved by exploiting the transistor characteristics to approximate the multiplicative correlation of neural signals. Learning is performed in situ and in analogue form: synaptic weight changes are computed continuously during normal operation of the artificial neural network. The transistor count per synapse is minimised by departing from strict adherence to classical multiplicative rules, while learning remains consistent with Hebb's original qualitative statement. Circuit simulations with three transistors per synapse, in the case of unipolar weights, suggest that appropriate learning and forgetting behaviour is obtained at the synaptic level when these area-efficient MOS learning rules are adopted in place of classical analytical formulations; an idealised sketch of such behaviour is given below. A systems-level theory corresponding to these learning rules has not yet been developed.
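The following is a minimal behavioural sketch, not the transistor-level rule described in the paper: it assumes an idealised multiplicative Hebbian update with a passive decay term standing in for "forgetting", and a unipolar weight clamped to a hypothetical range [0, w_max]. The parameter names (eta, decay, w_max) are illustrative assumptions, chosen only to show the qualitative learn-then-forget dynamics at a single synapse.

```python
import numpy as np

def hebbian_update(w, x, y, eta=0.01, decay=0.001, w_max=1.0):
    """One idealised Hebbian step for a unipolar weight.

    dw = eta * x * y  (multiplicative correlation of pre/post activity)
         - decay * w  (passive decay, i.e. 'forgetting')
    The weight is clamped to [0, w_max] to mimic a unipolar synapse.
    """
    dw = eta * x * y - decay * w
    return float(np.clip(w + dw, 0.0, w_max))

# Correlated pre/post activity strengthens the synapse...
w = 0.1
for _ in range(200):
    w = hebbian_update(w, x=1.0, y=1.0)
print(f"after learning:   w = {w:.3f}")

# ...and in the absence of activity the weight decays back toward zero.
for _ in range(2000):
    w = hebbian_update(w, x=0.0, y=0.0)
print(f"after forgetting: w = {w:.3f}")
```

In an analogue MOS implementation the product term and the decay would emerge from device characteristics rather than being computed explicitly; this sketch only illustrates the qualitative synaptic behaviour the abstract refers to.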
