Multilingual Translation from Denoising Pre-Training
Vishrav Chaudhary | Angela Fan | Naman Goyal | Chau Tran | Peng-Jen Chen | Xian Li | Jiatao Gu | Yuqing Tang
[1] Dianhai Yu, et al. Multi-Task Learning for Multiple Language Translation, 2015, ACL.
[2] Philipp Koehn, et al. Findings of the 2014 Workshop on Statistical Machine Translation, 2014, WMT@ACL.
[3] Ankur Bapna, et al. Massively Multilingual Neural Machine Translation in the Wild: Findings and Challenges, 2019, arXiv.
[4] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[5] Xu Tan, et al. MASS: Masked Sequence to Sequence Pre-training for Language Generation, 2019, ICML.
[6] Yuqing Tang, et al. Cross-lingual Retrieval for Iterative Self-Supervised Training, 2020, NeurIPS.
[7] Marta R. Costa-jussà, et al. Findings of the 2019 Conference on Machine Translation (WMT19), 2019, WMT.
[8] Miguel Ballesteros, et al. Multilingual Neural Machine Translation with Task-Specific Attention, 2018, COLING.
[9] Graham Neubig, et al. XTREME: A Massively Multilingual Multi-task Benchmark for Evaluating Cross-lingual Generalization, 2020, ICML.
[10] Omer Levy, et al. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension, 2019, ACL.
[11] Philipp Koehn, et al. Findings of the 2017 Conference on Machine Translation (WMT17), 2017, WMT.
[12] Philipp Koehn, et al. Findings of the 2013 Workshop on Statistical Machine Translation, 2013, WMT@ACL.
[13] Orhan Firat, et al. Massively Multilingual Neural Machine Translation, 2019, NAACL.
[14] Marcello Federico, et al. Multilingual Neural Machine Translation for Low Resource Languages, 2017, CLiC-it.
[15] Laura Martinus, et al. Benchmarking Neural Machine Translation for Southern African Languages, 2019, WNLP@ACL.
[16] Philipp Koehn, et al. Findings of the 2018 Conference on Machine Translation (WMT18), 2018, WMT.
[17] Karin M. Verspoor, et al. Findings of the 2016 Conference on Machine Translation, 2016, WMT.
[18] Tom M. Mitchell, et al. Contextual Parameter Generation for Universal Neural Machine Translation, 2018, EMNLP.
[19] Martin Wattenberg, et al. Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation, 2016, TACL.
[20] Yong Wang, et al. Improved Zero-shot Neural Machine Translation via Ignoring Spurious Correlations, 2019, ACL.
[21] Jörg Tiedemann, et al. Parallel Data, Tools and Interfaces in OPUS, 2012, LREC.
[22] Marjan Ghazvininejad, et al. Recipes for Adapting Pre-trained Monolingual and Multilingual Models to Machine Translation, 2020, arXiv.
[23] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[24] Omer Levy, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach, 2019, arXiv.
[25] Ankur Bapna, et al. Leveraging Monolingual Data with Self-Supervision for Multilingual Neural Machine Translation, 2020, ACL.
[26] Mark Chen, et al. Language Models are Few-Shot Learners, 2020, NeurIPS.
[27] Graham Neubig, et al. When and Why Are Pre-Trained Word Embeddings Useful for Neural Machine Translation?, 2018, NAACL.
[28] Jan Niehues, et al. Toward Multilingual Neural Machine Translation with Universal Encoder and Decoder, 2016, IWSLT.
[29] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[30] Philipp Koehn, et al. Two New Evaluation Datasets for Low-Resource Machine Translation: Nepali-English and Sinhala-English, 2019, arXiv.
[31] Veselin Stoyanov, et al. Unsupervised Cross-lingual Representation Learning at Scale, 2019, ACL.
[32] Guillaume Lample, et al. XNLI: Evaluating Cross-lingual Sentence Representations, 2018, EMNLP.
[33] Myle Ott, et al. fairseq: A Fast, Extensible Toolkit for Sequence Modeling, 2019, NAACL.
[34] Marjan Ghazvininejad, et al. Multilingual Denoising Pre-training for Neural Machine Translation, 2020, Transactions of the Association for Computational Linguistics.
[35] Tao Qin, et al. Multilingual Neural Machine Translation with Language Clustering, 2019, EMNLP.
[36] Raj Dabre, et al. Exploiting Multilingualism through Multistage Fine-Tuning for Low-Resource Neural Machine Translation, 2019, EMNLP.
[37] Philipp Koehn, et al. Findings of the 2020 Conference on Machine Translation (WMT20), 2020, WMT.
[38] Yichao Lu, et al. A neural interlingua for multilingual machine translation, 2018, WMT.
[39] Tomas Mikolov, et al. Bag of Tricks for Efficient Text Classification, 2016, EACL.
[40] Graham Neubig, et al. Parameter Sharing Methods for Multilingual Self-Attentional Translation Models, 2018, WMT.
[41] Yoshua Bengio, et al. Multi-Way, Multilingual Neural Machine Translation with a Shared Attention Mechanism, 2016, NAACL.
[42] Mauro Cettolo, et al. Overview of the IWSLT 2017 Evaluation Campaign, 2017, IWSLT.
[43] Christopher D. Manning, et al. Stanford Neural Machine Translation Systems for Spoken Language Domains, 2015, IWSLT.
[44] Holger Schwenk, et al. Beyond English-Centric Multilingual Machine Translation, 2020, J. Mach. Learn. Res.
[45] Sebastian Riedel, et al. MLQA: Evaluating Cross-lingual Extractive Question Answering, 2019, ACL.
[46] Guillaume Lample, et al. Cross-lingual Language Model Pretraining, 2019, NeurIPS.
[47] Rico Sennrich, et al. Improving Massively Multilingual Neural Machine Translation and Zero-Shot Translation, 2020, ACL.
[48] Taku Kudo, et al. SentencePiece: A simple and language independent subword tokenizer and detokenizer for Neural Text Processing, 2018, EMNLP.