Pruning CNN filters via quantifying the importance of deep visual representations
