Generalized Convolution Simulation Stack for RRAM Device based Deep Learning Neural Network

In this work, a generalized simulation software stack, written in a hardware description language (HDL), is constructed and simulated for the convolution operation, allowing the performance of any RRAM device to be analyzed when it is plugged into a deep learning network. The stack is a hardware-abstracted matrix-convolution implementation into which RRAM device development engineers can quickly plug device parameters to generate application-level simulated predictions. The design output from this framework can be used to compute the area and power impact for a given end application based on the configuration, material, and structure of the RRAM. A Verilog HDL program is developed to build a gate-level 32-bit floating-point adder and multiplier; these two arithmetic modules are then connected hierarchically to perform configurable 1 × 1 to 11 × 11 (32-bit) parallel matrix computations used in the deep learning network, as sketched below.
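
To make the hierarchical composition concrete, the following is a minimal Verilog sketch (not the paper's actual code) of a parameterized K × K multiply-accumulate tile; the fp32_mult and fp32_add submodule names and their port lists are assumptions standing in for the gate-level 32-bit floating-point multiplier and adder described above.

// Minimal sketch, assuming fp32_mult / fp32_add submodules exist elsewhere:
// a parameterized K x K multiply-accumulate tile built hierarchically from
// one fp32 multiplier per kernel tap and a chain of fp32 adders.
module conv_mac #(
    parameter K = 3                       // kernel size, configurable 1..11
) (
    input  wire [K*K*32-1:0] window,      // flattened K x K input window (fp32 words)
    input  wire [K*K*32-1:0] kernel,      // flattened K x K weight kernel (fp32 words)
    output wire [31:0]       result       // accumulated dot product (fp32)
);
    wire [31:0] prod [0:K*K-1];           // per-tap products
    wire [31:0] acc  [0:K*K-1];           // running accumulation

    genvar i;
    generate
        for (i = 0; i < K*K; i = i + 1) begin : mac
            // one 32-bit floating-point multiplier per kernel tap (assumed module)
            fp32_mult u_mult (
                .a(window[32*i +: 32]),
                .b(kernel[32*i +: 32]),
                .p(prod[i])
            );
            if (i == 0) begin : first
                assign acc[0] = prod[0];
            end else begin : rest
                // 32-bit floating-point adder chained onto the previous partial sum
                fp32_add u_add (
                    .a(acc[i-1]),
                    .b(prod[i]),
                    .s(acc[i])
                );
            end
        end
    endgenerate

    assign result = acc[K*K-1];
endmodule

For K = 3 this instantiates nine multipliers and eight adders in a linear accumulation chain; a balanced adder tree would shorten the critical path, but the linear chain keeps the sketch closest to the simple hierarchical connection of the two arithmetic modules described in the abstract.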
