Deep-DFR: A Memristive Deep Delayed Feedback Reservoir Computing System with Hybrid Neural Network Topology

Deep neural networks (DNNs), a brain-inspired machine learning architecture, have achieved immense success in data-intensive applications. In this work, a hybrid-structured deep delayed feedback reservoir (Deep-DFR) computing model is proposed and fabricated. The Deep-DFR employs memristive synapses operating in a hierarchical information-processing fashion, with DFR modules serving as the readout layer, making the proposed deep learning structure both depth-in-space and depth-in-time. Our fabricated prototype, together with experimental results, demonstrates high energy efficiency at low hardware implementation cost. On the MNIST and SVHN image classification benchmarks, the Deep-DFR yields a 1.26 $\sim$ 7.69$\times$ reduction in testing error compared to state-of-the-art DNN designs.
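The core mechanism behind a DFR layer is a single nonlinear node with time-delayed feedback, whose delay line is sampled at a set of "virtual nodes" to obtain a high-dimensional state. The sketch below is a minimal software illustration of that idea, not the paper's memristive implementation; the mask scheme, the saturating nonlinearity, and the parameter names (`n_virtual`, `eta`, `gamma`) are illustrative assumptions.

```python
import numpy as np

def dfr_layer(u, n_virtual=50, eta=0.5, gamma=0.05, seed=0):
    """Sketch of one delayed-feedback reservoir layer: a single nonlinear
    node whose delay line is read out at n_virtual points (virtual nodes).

    u: 1-D array of scalar input samples.
    Returns an array of shape (len(u), n_virtual) of reservoir states.
    """
    rng = np.random.default_rng(seed)
    # Random binary input mask that multiplexes the scalar input
    # across the virtual nodes (illustrative choice).
    mask = rng.choice([-1.0, 1.0], size=n_virtual)
    delay = np.zeros(n_virtual)          # current contents of the delay line
    states = np.empty((len(u), n_virtual))
    for t, u_t in enumerate(u):
        for i in range(n_virtual):
            # Masked input plus feedback from one full delay-loop earlier,
            # passed through a bounded saturating nonlinearity.
            pre = gamma * mask[i] * u_t + eta * delay[i]
            delay[i] = pre / (1.0 + abs(pre))
        states[t] = delay
    return states
```

Depth-in-time comes from the delayed feedback inside each layer; a depth-in-space hierarchy, as in the Deep-DFR, would stack such layers so that one layer's state sequence feeds the next, with a trained readout on the final states.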
