[1] Omer Levy, et al. SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems, 2019, NeurIPS.
[2] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[3] Natalia Gimelshein, et al. PyTorch: An Imperative Style, High-Performance Deep Learning Library, 2019, NeurIPS.
[4] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[5] Roberto Navigli, et al. Word Sense Disambiguation: A Survey, 2009, CSUR.
[6] Tomas Mikolov, et al. Enriching Word Vectors with Subword Information, 2016, TACL.
[7] José Camacho-Collados, et al. WiC: the Word-in-Context Dataset for Evaluating Context-Sensitive Meaning Representations, 2018, NAACL.
[8] Lysandre Debut, et al. HuggingFace's Transformers: State-of-the-art Natural Language Processing, 2019, ArXiv.
[9] Federico Martelli, et al. SemEval-2021 Task 2: Multilingual and Cross-lingual Word-in-Context Disambiguation (MCL-WiC), 2021, SemEval.
[10] Jeffrey Dean, et al. Distributed Representations of Words and Phrases and their Compositionality, 2013, NIPS.
[11] Veselin Stoyanov, et al. Unsupervised Cross-lingual Representation Learning at Scale, 2019, ACL.
[12] Sebastian Ruder, et al. Fine-tuned Language Models for Text Classification, 2018, ArXiv.