Knowledge Distillation with Teacher Multi-task Model for Biomedical Named Entity Recognition
Tahir Mehmood | Alfonso Gerevini | Alberto Lavelli | Ivan Serina