Shujian Huang | Weihua Luo | Rongxiang Weng | Heng Yu | Shanbo Cheng
[1] Yoshua Bengio, et al. On Using Monolingual Corpora in Neural Machine Translation, 2015, arXiv.
[2] Quoc V. Le, et al. Unsupervised Pretraining for Sequence to Sequence Learning, 2017, EMNLP.
[3] Rico Sennrich, et al. Neural Machine Translation of Rare Words with Subword Units, 2016, ACL.
[4] Guodong Zhou, et al. Modeling Coherence for Neural Machine Translation with Dynamic and Topic Caches, 2018, COLING.
[5] Jiajun Zhang, et al. Exploiting Source-side Monolingual Data in Neural Machine Translation, 2016, EMNLP.
[6] Salim Roukos, et al. Bleu: a Method for Automatic Evaluation of Machine Translation, 2002, ACL.
[7] Yiming Yang, et al. XLNet: Generalized Autoregressive Pretraining for Language Understanding, 2019, NeurIPS.
[8] Yoshua Bengio, et al. Neural Machine Translation by Jointly Learning to Align and Translate, 2015, ICLR.
[9] Lei Li, et al. Towards Making the Most of BERT in Neural Machine Translation, 2020, AAAI.
[10] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[11] Andy Way, et al. Investigating Backtranslation in Neural Machine Translation, 2018, EAMT.
[12] Yann Dauphin, et al. A Convolutional Encoder Model for Neural Machine Translation, 2017, ACL.
[13] Christopher D. Manning, et al. Effective Approaches to Attention-based Neural Machine Translation, 2015, EMNLP.
[14] Jason Lee, et al. Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement, 2018, EMNLP.
[15] Meng Sun, et al. Baidu Neural Machine Translation Systems for WMT19, 2019, WMT.
[16] Yoshua Bengio, et al. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation, 2014, EMNLP.
[17] Rico Sennrich, et al. Improving Neural Machine Translation Models with Monolingual Data, 2016, ACL.
[18] Guillaume Lample, et al. Cross-lingual Language Model Pretraining, 2019, NeurIPS.
[19] Lijun Wu, et al. Beyond Error Propagation in Neural Machine Translation: Characteristics of Language Also Matter, 2018, EMNLP.
[20] Peng Wu, et al. Learning Representation Mapping for Relation Detection in Knowledge Base Question Answering, 2019, ACL.
[21] Zhaopeng Tu, et al. Dynamic Past and Future for Neural Machine Translation, 2019, EMNLP.
[22] Marcello Federico, et al. Can Monolingual Embeddings Improve Neural Machine Translation?, 2017, CLiC-it.
[23] Kenneth Heafield, et al. Copied Monolingual Data Improves Low-Resource Neural Machine Translation, 2017, WMT.
[24] Quoc V. Le, et al. Exploiting Similarities among Languages for Machine Translation, 2013, arXiv.
[25] Di He, et al. Layer-Wise Coordination between Encoder and Decoder for Neural Machine Translation, 2018, NeurIPS.
[26] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[27] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2015, ICLR.
[28] Xu Tan, et al. MASS: Masked Sequence to Sequence Pre-training for Language Generation, 2019, ICML.
[29] Jingbo Zhu, et al. Multi-layer Representation Fusion for Neural Machine Translation, 2018, COLING.
[30] Quoc V. Le, et al. Sequence to Sequence Learning with Neural Networks, 2014, NIPS.
[31] Alec Radford, et al. Improving Language Understanding by Generative Pre-Training, 2018.
[32] Shuming Shi, et al. Exploiting Deep Representations for Neural Machine Translation, 2018, EMNLP.
[33] Kai Song, et al. Alibaba's Neural Machine Translation Systems for WMT18, 2018, WMT.
[34] Enhong Chen, et al. Joint Training for Neural Machine Translation Models with Monolingual Data, 2018, AAAI.
[35] Yang Liu, et al. A Teacher-Student Framework for Zero-Resource Neural Machine Translation, 2017, ACL.
[36] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[37] Geoffrey E. Hinton, et al. Layer Normalization, 2016, arXiv.