Bayesian neural network enhancing reliability against conductance drift for memristor neural networks

The hardware implementation of neural networks on memristor crossbar arrays offers a promising paradigm for neuromorphic computing. However, memristor conductance drift degrades the reliability of the deployed network, which seriously hinders the practical application of memristor-based neuromorphic computing. In this paper, the impact of different types of conductance drift on the weights realized by memristors is investigated and analyzed. Then, by exploiting the weight uncertainty introduced by conductance drift, we propose a weight optimization method based on the Bayesian neural network, which substantially improves network performance under drift. Furthermore, an ensemble approach is proposed to further enhance network reliability without increasing training cost or crossbar array resources. Finally, the effectiveness of the proposed scheme is verified through a series of experiments. The scheme can be readily integrated into existing neuromorphic computing implementations, providing a stronger guarantee for their large-scale application.
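As a rough illustration of the idea (a minimal sketch, not the authors' implementation), the snippet below shows a Bayes-by-Backprop-style Bayesian linear layer in PyTorch: each weight is a Gaussian with a learned mean and standard deviation, so the trained distribution absorbs weight uncertainty such as that caused by conductance drift. The additive-Gaussian drift model and all hyper-parameters (e.g. `drift_std`) are illustrative assumptions; how the paper's ensemble avoids extra training cost and crossbar resources is specific to the main text.

```python
# Sketch of a Bayesian linear layer with weight uncertainty (Bayes-by-Backprop style).
# Not the paper's code; drift is modeled here as a simple additive Gaussian perturbation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BayesianLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        # Variational posterior parameters: mean and rho, with std = softplus(rho).
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features).normal_(0, 0.1))
        self.w_rho = nn.Parameter(torch.full((out_features, in_features), -4.0))
        self.b_mu = nn.Parameter(torch.zeros(out_features))
        self.b_rho = nn.Parameter(torch.full((out_features,), -4.0))

    def forward(self, x, sample=True):
        if sample:
            # Reparameterization trick: w = mu + softplus(rho) * eps, eps ~ N(0, 1).
            w_std = F.softplus(self.w_rho)
            b_std = F.softplus(self.b_rho)
            w = self.w_mu + w_std * torch.randn_like(w_std)
            b = self.b_mu + b_std * torch.randn_like(b_std)
        else:
            w, b = self.w_mu, self.b_mu
        return F.linear(x, w, b)


def drifted_forward(layer, x, drift_std=0.02):
    """Evaluate the layer with mean weights perturbed by an assumed
    zero-mean Gaussian drift of magnitude drift_std."""
    w = layer.w_mu + drift_std * torch.randn_like(layer.w_mu)
    return F.linear(x, w, layer.b_mu)


if __name__ == "__main__":
    layer = BayesianLinear(784, 10)
    x = torch.randn(4, 784)
    # Averaging several stochastic forward passes gives a simple Monte-Carlo ensemble,
    # which tends to be more robust to weight perturbations than a single deterministic pass.
    preds = torch.stack([layer(x) for _ in range(10)]).mean(0)
    drifted = drifted_forward(layer, x)
```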
