Offline Training for Memristor-based Neural Networks

Neuromorphic systems based on Hardware Neural Networks (HNNs) are expected to provide an energy-efficient computing architecture for solving complex tasks. Because of the variability common to all nano-electronic devices, the success of HNNs depends on the development of reliable weight storage or of mitigation techniques against weight variation. In this manuscript, we propose a neural network training technique that mitigates the impact of device-to-device variation caused by conductance imperfections at weight import in offline learning. To that end, we inject this variation into the weights during training, forcing the network to learn computations that are robust against it. We then experiment with a neural network architecture using quantized weights, adapted to the design constraints imposed by memristive devices. Finally, we validate our proposal on real-world road traffic data and the MNIST image data set, showing improvements in the classification metrics.
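
The following is a minimal sketch of what such variation-aware training of quantized weights could look like in PyTorch. The layer name, the three-level quantization grid, and the 10% multiplicative Gaussian conductance spread are illustrative assumptions, not the authors' exact implementation.

```python
# Sketch of variation-aware training for memristive weight import.
# Hypothetical layer; noise level and quantization levels are assumed values.
import torch
import torch.nn as nn

class NoisyQuantLinear(nn.Linear):
    """Linear layer that quantizes weights to a few conductance levels and
    injects device-to-device variation (multiplicative Gaussian noise)
    during training, so the learned mapping tolerates weight import errors."""

    def __init__(self, in_features, out_features, levels=(-1.0, 0.0, 1.0), sigma=0.1):
        super().__init__(in_features, out_features, bias=True)
        self.register_buffer("levels", torch.tensor(levels))
        self.sigma = sigma  # relative conductance spread (assumed value)

    def quantize(self, w):
        # Snap each weight to the nearest allowed level, with a
        # straight-through estimator so gradients reach the real-valued weights.
        idx = torch.argmin((w.unsqueeze(-1) - self.levels).abs(), dim=-1)
        w_q = self.levels[idx]
        return w + (w_q - w).detach()

    def forward(self, x):
        w_q = self.quantize(self.weight)
        if self.training:
            # Fresh noise on every forward pass emulates a different device instance.
            w_q = w_q * (1.0 + self.sigma * torch.randn_like(w_q))
        return nn.functional.linear(x, w_q, self.bias)

# Usage: drop-in replacement for nn.Linear in a small MNIST-style classifier.
model = nn.Sequential(NoisyQuantLinear(784, 128), nn.ReLU(), NoisyQuantLinear(128, 10))
```

Because a new noise sample is drawn at each forward pass, the optimizer is pushed toward weight configurations whose outputs change little under conductance perturbations, which is the robustness property sought at weight import time.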
