暂无分享,去创建一个
[1] Yiming Yang,et al. XLNet: Generalized Autoregressive Pretraining for Language Understanding , 2019, NeurIPS.
[2] Jeffrey Dean,et al. Distributed Representations of Words and Phrases and their Compositionality , 2013, NIPS.
[3] Iz Beltagy,et al. SciBERT: A Pretrained Language Model for Scientific Text , 2019, EMNLP.
[4] Mark Chen,et al. Language Models are Few-Shot Learners , 2020, NeurIPS.
[5] Hongfang Liu,et al. MedSTS: a resource for clinical semantic textual similarity , 2018, Language Resources and Evaluation.
[6] Alexey Romanov,et al. Lessons from Natural Language Inference in the Clinical Domain , 2018, EMNLP.
[7] Ilya Sutskever,et al. Language Models are Unsupervised Multitask Learners , 2019 .
[8] Jaewoo Kang,et al. BioBERT: a pre-trained biomedical language representation model for biomedical text mining , 2019, Bioinform..
[9] Luke S. Zettlemoyer,et al. Deep Contextualized Word Representations , 2018, NAACL.
[10] Walter Daelemans,et al. Scalable Few-Shot Learning of Robust Biomedical Name Representations , 2021, BIONLP.
[11] Kexin Huang,et al. Clinical XLNet: Modeling Sequential Clinical Notes and Predicting Prolonged Mechanical Ventilation , 2019, CLINICALNLP.
[12] Barbara Rosario,et al. Classifying Semantic Relations in Bioscience Texts , 2004, ACL.