A Binarized Neural Network Accelerator with Differential Crosspoint Memristor Array for Energy-Efficient MAC Operations
Binarized Neural Networks (BNNs) significantly reduce computational complexity and relax memory requirements by binarizing both weights and activations. We propose a differential crosspoint (DX) memristor array that enables parallel multiply-and-accumulate (MAC) operations in BNNs to further improve efficiency. Each synapse is composed of two differential memristors. The synapses on the same column form a voltage divider whose output voltage corresponds linearly to the digital summation. This analog output voltage is then quantized to a 4-bit output by a voltage sense amplifier. A small 64×64 DX array in every DX unit (DXU) minimizes parasitic resistance and capacitance for faster MAC operations. We introduce a system architecture that uses DXUs for BNN acceleration; a wide range of BNN models can be mapped onto an array of DXUs. To further reduce the energy spent on data movement, a neighbor-shifting scheme increases input data reusability. The effects of quantization and bit errors are investigated on the MNIST and CIFAR-10 datasets. A DXU achieves an estimated energy efficiency of 160 TMAC/s/W.
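The column-wise voltage-divider MAC described above can be sketched numerically. The model below is a simplified behavioral illustration, not the paper's circuit equations: inputs and weights are ±1, each synapse's XNOR-style product decides which memristor of the differential pair pulls the column node high, the ideal column voltage is linear in the number of matching bits, and the sense amplifier quantizes it to 4 bits. The function name and parameters are hypothetical.

```python
import numpy as np

def dx_column_mac(inputs, weights, vdd=1.0, bits=4):
    """Behavioral model of one DX-array column MAC (simplified sketch).

    inputs, weights: 1-D arrays of +1/-1 values. Each synapse is a
    differential memristor pair; the product input*weight (an XNOR in
    +/-1 arithmetic) determines whether that synapse pulls the shared
    column node toward VDD or toward ground. Under the ideal-divider
    assumption, the column voltage is linear in the popcount of
    matching bits, as the abstract describes.
    """
    products = inputs * weights                  # XNOR in +/-1 encoding
    matches = int(np.count_nonzero(products == 1))
    n = len(inputs)
    v_out = vdd * matches / n                    # ideal divider: linear in matches
    # Voltage sense amplifier: quantize the analog voltage to 4 bits.
    levels = 2 ** bits - 1
    code = int(round(v_out / vdd * levels))
    # The signed digital MAC value recoverable from the popcount.
    mac = 2 * matches - n
    return code, mac
```

For a 64-input column with all bits matching, the divider sits at VDD and the quantizer outputs the full-scale code 15; with no bits matching, it outputs code 0. Error injection (bit flips in `weights`) can reuse this same model to study the bit-error sensitivity mentioned in the abstract.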