Analog CMOS deterministic Boltzmann circuits

CMOS circuits implementing an analog neural network (ANN) with on-chip deterministic Boltzmann learning (DBL) and capacitive synaptic weight storage have been designed, fabricated, and tested. Weights are refreshed by periodic repetition of the training data. The circuits were used to build a 12-neuron, 132-synapse ANN that performed well in a variety of learning experiments, including a 36-input to 4-output mapping problem. Adaptive systems such as those described here can compensate for imperfections in the components from which they are constructed and can therefore be built from simple, silicon-area-efficient analog circuits. The test results indicate that deterministic Boltzmann ANNs can be implemented efficiently in analog CMOS circuitry.
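To make the learning rule concrete, the following is a minimal software sketch of mean-field deterministic Boltzmann learning: each unit relaxes to s_i = tanh(Σ_j w_ij s_j) under symmetric weights, and a contrastive Hebbian update subtracts free-phase correlations from clamped-phase correlations. The network sizes, unit indices, learning rate, and settling schedule here are illustrative assumptions, not parameters of the chip described above.

```python
import numpy as np

# Illustrative sizes (assumed, not from the chip): 2 inputs + 1 output
# visible units, plus 4 hidden units, fully connected.
rng = np.random.default_rng(0)
n_vis, n_hid = 3, 4
n = n_vis + n_hid
W = rng.normal(0.0, 0.1, (n, n))
W = (W + W.T) / 2.0              # DBL assumes symmetric connectivity
np.fill_diagonal(W, 0.0)         # no self-connections

def settle(W, clamp_idx, clamp_val, steps=50):
    """Relax free units to a mean-field fixed point s_i = tanh(sum_j W_ij s_j)."""
    s = np.zeros(len(W))
    s[clamp_idx] = clamp_val
    free = [i for i in range(len(W)) if i not in clamp_idx]
    for _ in range(steps):
        s[free] = np.tanh(W[free] @ s)
    return s

def dbl_update(W, x, t, lr=0.05):
    """One contrastive Hebbian step: clamped-phase minus free-phase correlations."""
    in_idx, out_idx = [0, 1], [2]
    # Clamped (+) phase: inputs and target output both held fixed.
    s_plus = settle(W, in_idx + out_idx, np.concatenate([x, t]))
    # Free (-) phase: only inputs held fixed; output unit relaxes.
    s_minus = settle(W, in_idx, x)
    dW = lr * (np.outer(s_plus, s_plus) - np.outer(s_minus, s_minus))
    np.fill_diagonal(dW, 0.0)    # keep self-connections at zero
    return W + dW

# One training presentation for a hypothetical pattern.
x = np.array([1.0, -1.0])
t = np.array([1.0])
W_new = dbl_update(W, x, t)
```

Because the update is built from outer products of a state vector with itself, the weight matrix stays symmetric after each step, which is the property that keeps the mean-field relaxation well-behaved; on the chip this same locality (the update depends only on the two connected units' activities) is what makes the rule amenable to a compact analog synapse circuit.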
