Unsupervised Cross-lingual Representation Learning at Scale
Alexis Conneau | Kartikay Khandelwal | Naman Goyal | Vishrav Chaudhary | Guillaume Wenzek | Francisco Guzmán | Edouard Grave | Myle Ott | Luke Zettlemoyer | Veselin Stoyanov