Filter Level Pruning Based on Similar Feature Extraction for Convolutional Neural Networks
Yuhui Xu | Jie Zhu | Lianqiang Li