Nitin Rathi | Kaushik Roy
[1] Timothy P. Lillicrap et al. Deep Learning with Dynamic Spiking Neurons and Fixed Feedback Weights, 2017, Neural Computation.
[2] Terrence J. Sejnowski et al. Gradient Descent for Spiking Neural Networks, 2017, NeurIPS.
[3] Jian Sun et al. Deep Residual Learning for Image Recognition, 2016, IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[4] Jim D. Garside et al. SpiNNaker: A 1-W 18-Core System-on-Chip for Massively-Parallel Neural Network Simulation, 2013, IEEE Journal of Solid-State Circuits.
[5] Firas AlBalas et al. A comparative study on spiking neural network encoding schema: implemented with cloud computing, 2018, Cluster Computing.
[6] Emre Neftci et al. Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks, 2019, IEEE Signal Processing Magazine.
[7] Tara N. Sainath et al. Deep Neural Networks for Acoustic Modeling in Speech Recognition: The Shared Views of Four Research Groups, 2012, IEEE Signal Processing Magazine.
[8] Kaushik Roy et al. RMP-SNN: Residual Membrane Potential Neuron for Enabling Deeper High-Accuracy and Low-Latency Spiking Neural Network, 2020, IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[9] Deepak Khosla et al. Spiking Deep Convolutional Neural Networks for Energy-Efficient Object Recognition, 2014, International Journal of Computer Vision.
[10] Garrick Orchard et al. SLAYER: Spike Layer Error Reassignment in Time, 2018, NeurIPS.
[11] Wei Fang et al. Leaky Integrate-and-Fire Spiking Neuron with Learnable Membrane Time Parameter, 2020, arXiv.
[12] Kaushik Roy et al. Spike-FlowNet: Event-based Optical Flow Estimation with Energy-Efficient Hybrid Neural Networks, 2020, ECCV.
[13] Federico Corradi et al. Effective and Efficient Computation with Multiple-timescale Spiking Recurrent Neural Networks, 2020, ICONS.
[14] Li Fei-Fei et al. ImageNet: A large-scale hierarchical image database, 2009, CVPR.
[15] Colin J. Akerman et al. Random synaptic feedback weights support error backpropagation for deep learning, 2016, Nature Communications.
[16] Xinbo Chen et al. Evaluating the Energy Efficiency of Deep Convolutional Neural Networks on CPUs and GPUs, 2016, IEEE International Conferences on Big Data and Cloud Computing, Social Computing and Networking, Sustainable Computing and Communications (BDCloud-SocialCom-SustainCom).
[17] Andrew Zisserman et al. Very Deep Convolutional Networks for Large-Scale Image Recognition, 2014, ICLR.
[18] Song Han et al. Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding, 2015, ICLR.
[19] Jyrki Alakuijala et al. Temporal Coding in Spiking Neural Networks with Alpha Synaptic Function, 2020, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[20] Saeed Reza Kheradpisheh et al. S4NN: temporal backpropagation for spiking neural networks with one spike per neuron, 2020, International Journal of Neural Systems.
[21] Song Han et al. AMC: AutoML for Model Compression and Acceleration on Mobile Devices, 2018, ECCV.
[22] Ali Farhadi et al. XNOR-Net: ImageNet Classification Using Binary Convolutional Neural Networks, 2016, ECCV.
[23] Michael Ferdman et al. Escher: A CNN Accelerator with Flexible Buffering to Minimize Off-Chip Transfer, 2017, IEEE International Symposium on Field-Programmable Custom Computing Machines (FCCM).
[24] Kaushik Roy et al. Enabling Spike-based Backpropagation in State-of-the-art Deep Neural Network Architectures, 2019.
[25] Garrick Orchard et al. Neuromorphic Nearest Neighbor Search Using Intel's Pohoiki Springs, 2020, NICE.
[26] Kay Chen Tan et al. A Tandem Learning Rule for Efficient and Rapid Inference on Deep Spiking Neural Networks, 2019.
[27] Tianqi Chen et al. Training Deep Nets with Sublinear Memory Cost, 2016, arXiv.
[28] Robert A. Legenstein et al. Long short-term memory and Learning-to-learn in networks of spiking neurons, 2018, NeurIPS.
[29] Lei Deng et al. Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks, 2017, Frontiers in Neuroscience.
[30] Shih-Chii Liu et al. Conversion of Continuous-Valued Deep Networks to Efficient Event-Driven Networks for Image Classification, 2017, Frontiers in Neuroscience.
[31] Sander M. Bohte et al. SpikeProp: backpropagation for networks of spiking neurons, 2000, ESANN.
[32] Joel Emer et al. Eyeriss: A Spatial Architecture for Energy-Efficient Dataflow for Convolutional Neural Networks, 2016, ISCA.
[33] Mark Horowitz et al. 1.1 Computing's energy problem (and what we can do about it), 2014, IEEE International Solid-State Circuits Conference (ISSCC).
[34] Sen Lu et al. Exploring the Connection Between Binary and Spiking Neural Networks, 2020, Frontiers in Neuroscience.
[35] Kaushik Roy et al. Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures, 2019, Frontiers in Neuroscience.
[36] Kaushik Roy et al. Constructing energy-efficient mixed-precision neural networks through principal component analysis for edge intelligence, 2019, Nature Machine Intelligence.
[37] Kaushik Roy et al. Going Deeper in Spiking Neural Networks: VGG and Residual Architectures, 2018, Frontiers in Neuroscience.
[38] Peng Li et al. Temporal Spike Sequence Learning via Backpropagation for Deep Spiking Neural Networks, 2020, NeurIPS.
[39] Kaushik Roy et al. Enabling Deep Spiking Neural Networks with Hybrid Conversion and Spike Timing Dependent Backpropagation, 2020, ICLR.
[40] Matthew Cook et al. Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing, 2015, International Joint Conference on Neural Networks (IJCNN).
[41] Hong Wang et al. Loihi: A Neuromorphic Manycore Processor with On-Chip Learning, 2018, IEEE Micro.
[42] Wulfram Gerstner et al. Spiking Neuron Models: Single Neurons, Populations, Plasticity, 2002, Cambridge University Press.
[43] Lei Deng et al. Direct Training for Spiking Neural Networks: Faster, Larger, Better, 2018, AAAI.