Coreference Resolution without Span Representations
Yuval Kirstain | Ori Ram | Omer Levy