[1] Christopher Potts, et al. A large annotated corpus for learning natural language inference, 2015, EMNLP.
[2] Richard Socher, et al. Efficient and Robust Question Answering from Minimal Context over Documents, 2018, ACL.
[3] Wanxiang Che, et al. Recall and Learn: Fine-tuning Deep Pretrained Language Models with Less Forgetting, 2020, EMNLP.
[4] Fatma Oezdemir-Zaech, et al. Semantically Corroborating Neural Attention for Biomedical Question Answering, 2019, PKDD/ECML Workshops.
[5] Jian Zhang, et al. SQuAD: 100,000+ Questions for Machine Comprehension of Text, 2016, EMNLP.
[6] William W. Cohen, et al. Probing Biomedical Embeddings from Language Models, 2019, Proceedings of the 3rd Workshop on Evaluating Vector Space Representations for NLP.
[7] Colin Raffel, et al. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer, 2019, J. Mach. Learn. Res.
[8] Kevin Gimpel, et al. ALBERT: A Lite BERT for Self-supervised Learning of Language Representations, 2019, ICLR.
[9] Jonathan Berant, et al. MultiQA: An Empirical Investigation of Generalization and Transfer in Reading Comprehension, 2019, ACL.
[10] Alexey Romanov, et al. Lessons from Natural Language Inference in the Clinical Domain, 2018, EMNLP.
[11] Sebastian Ruder, et al. Neural transfer learning for natural language processing, 2019.
[12] Zhiyong Lu, et al. Transfer Learning in Biomedical Natural Language Processing: An Evaluation of BERT and ELMo on Ten Benchmarking Datasets, 2019, BioNLP@ACL.
[13] Alex Wang, et al. Probing What Different NLP Tasks Teach Machines about Function Word Comprehension, 2019, *SEM.
[14] Samuel R. Bowman, et al. A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference, 2017, NAACL.
[15] Rui Yan, et al. How Transferable are Neural Networks in NLP Applications?, 2016, EMNLP.
[16] Subhransu Maji, et al. Exploring and Predicting Transferability across NLP Tasks, 2020, EMNLP.
[17] Yonatan Belinkov, et al. Linguistic Knowledge and Transferability of Contextual Representations, 2019, NAACL.
[18] Georgios Balikas, et al. An overview of the BIOASQ large-scale biomedical semantic indexing and question answering competition, 2015, BMC Bioinformatics.
[19] Ryan T. McDonald, et al. Measuring Domain Portability and Error Propagation in Biomedical QA, 2019, PKDD/ECML Workshops.
[20] Mariana L. Neves, et al. Neural Domain Adaptation for Biomedical Question Answering, 2017, CoNLL.
[21] Yoshua Bengio, et al. How transferable are features in deep neural networks?, 2014, NIPS.
[22] Iz Beltagy, et al. SciBERT: A Pretrained Language Model for Scientific Text, 2019, EMNLP.
[23] Yonghwa Choi, et al. A Neural Named Entity Recognition and Multi-Type Normalization Tool for Biomedical Text Mining, 2019, IEEE Access.
[24] Giuseppe Attardi, et al. Transformer Models for Question Answering at BioASQ 2019, 2019, PKDD/ECML Workshops.
[25] Ming-Wei Chang, et al. BoolQ: Exploring the Surprising Difficulty of Natural Yes/No Questions, 2019, NAACL.
[26] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[27] Hector J. Levesque, et al. The Winograd Schema Challenge, 2011, AAAI Spring Symposium: Logical Formalizations of Commonsense Reasoning.
[28] Jaewoo Kang, et al. Pre-trained Language Model for Biomedical Question Answering, 2019, PKDD/ECML Workshops.
[29] Grigorios Tsoumakas, et al. Yes/No Question Answering in BioASQ 2019, 2019, PKDD/ECML Workshops.
[30] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[31] Sebastian Ruder, et al. Universal Language Model Fine-tuning for Text Classification, 2018, ACL.
[32] Michael I. Jordan, et al. Learning Transferable Features with Deep Adaptation Networks, 2015, ICML.
[33] Jaewoo Kang, et al. BioBERT: a pre-trained biomedical language representation model for biomedical text mining, 2019, Bioinformatics.
[34] Wlodek Zadrozny, et al. UNCC Biomedical Semantic Question Answering Systems. BioASQ: Task-7B, Phase-B, 2019, PKDD/ECML Workshops.
[35] Wei-Hung Weng, et al. Publicly Available Clinical BERT Embeddings, 2019, Proceedings of the 2nd Clinical Natural Language Processing Workshop.
[36] Samuel R. Bowman, et al. Sentence Encoders on STILTs: Supplementary Training on Intermediate Labeled-data Tasks, 2018, arXiv.