Knowledge Distillation with Attention for Deep Transfer Learning of Convolutional Networks
Dejing Dou | Haoyi Xiong | Ji Liu | Cheng-Zhong Xu | Zeyu Chen | Jun Huan | Xingjian Li
[1] Rongrong Ji, et al. Holistic CNN Compression via Low-Rank Decomposition with Knowledge Transfer, 2019, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[2] Bo Zhao, et al. Diversified Visual Attention Networks for Fine-Grained Object Classification, 2016, IEEE Transactions on Multimedia.
[3] Yongdong Zhang, et al. STAT: Spatial-Temporal Attention Mechanism for Video Captioning, 2020, IEEE Transactions on Multimedia.
[4] Rich Caruana, et al. Model Compression, 2006, KDD '06.
[5] Yee Whye Teh, et al. A Fast Learning Algorithm for Deep Belief Nets, 2006, Neural Computation.
[6] Xiangui Kang, et al. Audio Recapture Detection With Convolutional Neural Networks, 2016, IEEE Transactions on Multimedia.
[7] Yu Zhang, et al. Parameter Transfer Unit for Deep Neural Networks, 2018, PAKDD.
[8] Ronald A. Rensink. The Dynamic Representation of Scenes, 2000.
[9] Zhanxing Zhu, et al. Towards Making Deep Transfer Learning Never Hurt, 2019, IEEE International Conference on Data Mining (ICDM).