Analog hardware implementation issues in deterministic Boltzmann machines

The deterministic Boltzmann machine (DBM) is a neural network architecture with a simple, local learning rule, making it an attractive candidate for implementation in massively parallel digital or analog VLSI hardware. Because the learning rule can compensate for many circuit deficiencies, it simplifies hardware design. This paper explores the effects of various nonideal analog hardware characteristics, identifying which are automatically compensated for by the learning algorithm and which must be carefully avoided during hardware design. Moderate levels of most nonidealities are found to be well tolerated. Under continuous learning with capacitive weight storage, however, the DBM cannot tolerate charge decay on the weight storage capacitors or offsets in the learning circuitry: either can cause the synaptic weights to drift, resulting in oscillation. These effects can be alleviated by incorporating learning thresholds into the adaptation circuitry, at the expense of increased residual errors.
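To make the failure mode concrete, the following is a minimal sketch (not code from the paper) of the DBM's local contrastive learning rule applied to a single capacitive weight, with an update offset and capacitor charge decay as the modeled nonidealities and a dead-zone learning threshold as the remedy. The function name and all parameter values (`ETA`, `LEAK`, `OFFSET`, `THRESHOLD`, `dbm_update`) are illustrative assumptions, not quantities taken from the paper.

```python
# Minimal sketch (not from the paper): a single DBM synapse under
# continuous learning with two of the nonidealities discussed above.
# All parameter values are illustrative assumptions.

ETA = 0.01        # learning rate
LEAK = 1e-4       # fractional charge decay of the storage capacitor per step
OFFSET = 2e-4     # constant offset introduced by the learning circuitry
THRESHOLD = 5e-4  # learning threshold (dead zone) on the computed update


def dbm_update(w, s_clamped, s_free, use_threshold):
    """One continuous-learning step for a single capacitive weight.

    s_clamped and s_free are the products s_i * s_j of the connected
    units' activations in the clamped (+) and free (-) phases.
    """
    delta = ETA * (s_clamped - s_free) + OFFSET  # offset corrupts every update
    if use_threshold and abs(delta) < THRESHOLD:
        delta = 0.0  # dead zone: ignore updates too small to be real error
    return (1.0 - LEAK) * w + delta  # capacitor leakage decays the stored weight


# Open-loop demonstration at convergence: the clamped and free phases
# agree, so the ideal update is zero.  Without a threshold the offset is
# integrated on every step and the weight drifts toward the spurious
# fixed point OFFSET / LEAK; with the threshold those offset-dominated
# updates are zeroed.  (In the closed-loop network the weight would then
# leak until the contrastive error exceeds the threshold and learning
# re-engages, leaving the increased residual error the abstract notes.)
w_plain = w_thresh = 0.5
for _ in range(20_000):
    w_plain = dbm_update(w_plain, 0.8, 0.8, use_threshold=False)
    w_thresh = dbm_update(w_thresh, 0.8, 0.8, use_threshold=True)

print(f"no threshold:   w -> {w_plain:.3f} (drifting toward {OFFSET / LEAK:.1f})")
print(f"with threshold: w -> {w_thresh:.3f} (offset blocked; only leakage acts)")
```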