Analog Storage of Adjustable Synaptic Weights

The central problem in the analog implementation of neural networks is the storage of synaptic weights. Each storage cell must be very dense, since one is needed for every synapse. The weight must be adjustable to provide an on-chip learning capability. Absolute precision is usually not required, since most learning algorithms rely on successive corrections of the weight values in a closed-loop system; see the sketch below.
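A minimal sketch of this last point, not taken from the paper: a delta-rule learner whose weights are re-stored after every update with limited precision. The `store` helper and its `lsb` resolution are hypothetical stand-ins for an imprecise analog storage cell; the closed-loop error signal keeps correcting the stored values, so learning succeeds up to the storage resolution.

```python
# Sketch (illustrative assumptions): delta-rule learning with imprecise
# weight storage. `store` models an analog cell that can only hold weight
# values on a coarse grid of spacing `lsb` (a hypothetical resolution).
import numpy as np

rng = np.random.default_rng(0)

def store(w, lsb=0.05):
    """Round each weight to the nearest storable level, modeling the
    limited absolute precision of an analog storage cell."""
    return np.round(w / lsb) * lsb

w_true = rng.normal(size=4)   # target linear mapping to be learned
w = np.zeros(4)               # stored (quantized) weights
eta = 0.05                    # learning rate

for step in range(2000):
    x = rng.normal(size=4)            # input pattern
    y = w_true @ x                    # desired output
    err = y - w @ x                   # closed-loop error signal
    w = store(w + eta * err * x)      # adjust, then re-store imprecisely

print("residual weight error:", np.linalg.norm(w - w_true))
# The residual settles near the storage resolution `lsb` rather than at
# zero: absolute precision is unnecessary because each correction is
# computed from the previously stored (imprecise) value.
```

The design point the sketch makes explicit: because the error is measured after the weights have been stored, storage nonidealities enter the loop and are compensated on the next update, which is why dense but imprecise analog cells are acceptable for on-chip learning.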
