A VLSI neural network with capacitive synapses

A VLSI architecture for artificial neural networks is described. The example presented is a fully interconnected Hopfield network, but the same design can be adapted to any neural architecture. In comparison with designs that use resistors or current sources to realize the synapses, this architecture offers several advantages: the accuracy attainable with capacitors is higher, and the number of synapses that can be connected to the same neuron is greater. Furthermore, only a few transistors are embedded in each synapse, and since only the relative values of the capacitors matter, their size can be reduced to very small values. The total area occupied by one synapse is therefore also small, and a great number of synapses can be integrated on the same chip. While the minimum capacitor values needed to avoid interference from parasitic capacitances remain to be determined, estimates indicate that a density of about 40-50 synapses/mm² in a 3-micron technology can be achieved.
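The Hopfield dynamics that such a chip would evaluate in analog hardware can be illustrated in software. The following is a minimal, idealized Python sketch; the function names, the ±1 binary states, and the discrete sign-update rule are illustrative assumptions, not the paper's actual analog implementation:

```python
import numpy as np

def train_hopfield(patterns):
    # Hebbian outer-product rule; the diagonal is zeroed (no self-synapse),
    # mirroring a fully interconnected network where a neuron does not
    # feed back onto itself.
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0.0)
    return W / n

def recall(W, state, max_steps=20):
    # Iterated sign updates until a fixed point. The analog chip would
    # settle continuously; this discrete loop is only an idealization.
    s = state.copy()
    for _ in range(max_steps):
        new = np.sign(W @ s)
        new[new == 0] = 1
        if np.array_equal(new, s):
            break
        s = new
    return s

pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train_hopfield(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -noisy[0]        # corrupt one bit
restored = recall(W, noisy)  # the network settles back to the stored pattern
```

In the capacitive design, each weight W[i, j] would be realized as a capacitor ratio rather than a resistance, which is why only relative capacitor values matter.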