[1] Taku Kudo, et al. SentencePiece: A Simple and Language Independent Subword Tokenizer and Detokenizer for Neural Text Processing, 2018, EMNLP.
[2] Arseny Tolmachev, et al. Juman++: A Morphological Analysis Toolkit for Scriptio Continua, 2018, EMNLP.
[3] Ulf Hermjakob, et al. Out-of-the-box Universal Romanization Tool uroman, 2018, ACL.
[4] Dzmitry Bahdanau, et al. Neural Machine Translation by Jointly Learning to Align and Translate, 2015, ICLR.
[5] Barret Zoph, et al. Transfer Learning for Low-Resource Neural Machine Translation, 2016, EMNLP.
[6] Raj Dabre, et al. An Empirical Study of Language Relatedness for Transfer Learning in Neural Machine Translation, 2017, PACLIC.
[7] Rudra Murthy, et al. Addressing Word-Order Divergence in Multilingual Neural Machine Translation for Extremely Low Resource Languages, 2019, NAACL.
[8] Amittai Axelrod, et al. Domain Adaptation via Pseudo In-Domain Data Selection, 2011, EMNLP.
[9] Ye Qi, et al. When and Why Are Pre-Trained Word Embeddings Useful for Neural Machine Translation?, 2018, NAACL.
[10] Xinyi Wang, et al. Target Conditioned Sampling: Optimizing Data Selection for Multilingual Neural Machine Translation, 2019, ACL.
[11] Rico Sennrich, et al. Improving Neural Machine Translation Models with Monolingual Data, 2016, ACL.
[12] Raj Dabre, et al. Exploiting Multilingualism through Multistage Fine-Tuning for Low-Resource Neural Machine Translation, 2019, EMNLP.
[13] Ilya Sutskever, et al. Sequence to Sequence Learning with Neural Networks, 2014, NIPS.
[14] Francisco Guzmán, et al. Two New Evaluation Datasets for Low-Resource Machine Translation: Nepali-English and Sinhala-English, 2019, arXiv.
[15] Kenneth Heafield. KenLM: Faster and Smaller Language Model Queries, 2011, WMT@EMNLP.
[16] Kaitao Song, et al. MASS: Masked Sequence to Sequence Pre-training for Language Generation, 2019, ICML.
[17] Rico Sennrich, et al. Neural Machine Translation of Rare Words with Subword Units, 2016, ACL.
[18] Jacob Devlin, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.