[1] Lukasz Kaiser, et al. Attention Is All You Need. NIPS, 2017.
[2] Ilya Sutskever, et al. Language Models are Unsupervised Multitask Learners. 2019.
[3] Marcin Junczys-Dowmunt, et al. Dual Conditional Cross-Entropy Filtering of Noisy Parallel Corpora. WMT, 2018.
[4] Myle Ott, et al. Understanding Back-Translation at Scale. EMNLP, 2018.
[5] Yoshua Bengio, et al. Understanding the difficulty of training deep feedforward neural networks. AISTATS, 2010.
[6] Matt Post. A Call for Clarity in Reporting BLEU Scores. WMT, 2018.
[7] Marta R. Costa-jussà, et al. Findings of the 2019 Conference on Machine Translation (WMT19). WMT, 2019.
[8] Jörg Tiedemann, et al. Neural Machine Translation with Extended Context. DiscoMT@EMNLP, 2017.
[9] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. NAACL, 2019.
[10] Marcin Junczys-Dowmunt, et al. MS-UEdin Submission to the WMT2018 APE Shared Task: Dual-Source Transformer for Automatic Post-Editing. WMT, 2018.
[11] Taku Kudo, et al. SentencePiece: A simple and language independent subword tokenizer and detokenizer for Neural Text Processing. EMNLP, 2018.
[12] Marcin Junczys-Dowmunt, et al. Microsoft's Submission to the WMT2018 News Translation Task: How I Learned to Stop Worrying and Love the Data. WMT, 2018.