Ablations over transformer models for biomedical relationship extraction
Richard J. Jackson | Martin Johansson | Erik Jansson | Elliot Ford | Mats Axelsson | Eliseo Papa | Aron Lagerberg | Vladimir Poroshin | Timothy Scrivener | Lesly Arun Franco