Empirical Study of Data-Free Iterative Knowledge Distillation
Het Shah | Ashwin Vaswani | Tirtharaj Dash | Ramya Hebbalaguppe | Ashwin Srinivasan