Basic VLSI Circuits for Neural Networks

The most important basic circuits of neural networks are the memory cells for the adaptive weights, the complex connections between the neurons, and special circuits to support learning. We present new circuits that implement these functions and show concepts for their functional integration in silicon. Additional circuits improve learning; for example, an input layer with fuzzy logic or associative memory cells preprocesses the information. An evaluation of these concepts reveals better system performance and a higher integration level than those of purely digital concepts. A short outlook on new silicon-based technologies, such as cooled circuits, optoelectronics, and molecular electronics, closes the overview.