Improving the Robustness of Neural Networks to Noisy Multi-Level Non-Volatile Memory-based Synapses

The implementation of Artificial Neural Networks (ANNs) using analog Non-Volatile Memories (NVMs) for synaptic weight storage promises improved energy efficiency and higher density compared to fully digital implementations. However, NVMs are prone to variability, which degrades the accuracy of ANNs. In this paper, a general methodology to evaluate and enhance the accuracy of neural networks implemented with non-ideal multi-level NVMs is presented. A hardware fault model that captures NVM variability by distinguishing two types of errors, static and dynamic, is proposed. Across several neural networks, error-aware training is shown to increase robustness to errors substantially compared to standard, error-agnostic training. Moreover, Recurrent NNs (RNNs) and Spiking NNs (SNNs) are found to be inherently more robust to dynamic errors than Convolutional NNs (CNNs). In addition, new insights into the adaptability of neural networks to noisy multi-level NVMs are presented, which could further improve their robustness in this context. The methodology aims to provide tools for hardware-software co-design, paving the way for broader use of multi-level NVM-based synapses.
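As a rough illustration of the error-aware training idea described above, the sketch below injects a two-component perturbation (a static offset drawn once per device plus a dynamic offset re-sampled at every read) into quantized multi-level weights during the forward pass. All names, the Gaussian noise shape, the noise magnitudes (`sigma_static`, `sigma_dynamic`), and the number of conductance levels are illustrative assumptions, not the paper's exact fault model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def quantize_to_levels(w: torch.Tensor, n_levels: int = 8) -> torch.Tensor:
    """Map weights onto n_levels equally spaced conductance levels."""
    w_max = w.abs().max().clamp_min(1e-8)
    step = 2 * w_max / (n_levels - 1)
    return torch.round(w / step) * step


def apply_nvm_noise(w, sigma_static=0.02, sigma_dynamic=0.01, static_offset=None):
    """Perturb weights with the two error types of the fault model.

    Static errors (e.g. programming inaccuracy) are drawn once per device
    and reused across inferences; dynamic errors (e.g. read fluctuations)
    are re-sampled at every inference.
    """
    if static_offset is None:
        static_offset = sigma_static * torch.randn_like(w)  # fixed per chip
    dynamic_offset = sigma_dynamic * torch.randn_like(w)    # fresh each read
    return w + static_offset + dynamic_offset, static_offset


class NoisyLinear(nn.Linear):
    """Linear layer that injects the fault model during training.

    Both offsets are re-sampled at each forward pass here, a common
    simplification for noise-aware training; at deployment the static
    offset would instead be fixed per chip.
    """
    def forward(self, x):
        w_q = quantize_to_levels(self.weight)
        w_noisy, _ = apply_nvm_noise(w_q)
        # Straight-through estimator: gradients bypass quantization/noise.
        w_eff = self.weight + (w_noisy - self.weight).detach()
        return F.linear(x, w_eff, self.bias)
```

Substituting `NoisyLinear` for `nn.Linear` during training exposes the network to the same class of perturbations it will face on hardware, which is the mechanism by which error-aware training improves robustness over error-agnostic training.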
