Neural Machine Translation with Recurrent Attention Modeling
Zichao Yang | Zhiting Hu | Yuntian Deng | Chris Dyer | Alexander J. Smola
[1] Yoshua Bengio, et al. A Character-level Decoder without Explicit Segmentation for Neural Machine Translation, 2016, ACL.
[2] Jason Weston, et al. End-To-End Memory Networks, 2015, NIPS.
[3] Gholamreza Haffari, et al. Incorporating Structural Alignment Biases into an Attentional Neural Translation Model, 2016, NAACL.
[4] Geoffrey E. Hinton, et al. Grammar as a Foreign Language, 2014, NIPS.
[5] Christopher D. Manning, et al. Effective Approaches to Attention-based Neural Machine Translation, 2015, EMNLP.
[6] Shi Feng, et al. Improving Attention Modeling with Implicit Distortion and Fertility for Machine Translation, 2016, COLING.
[7] Yoshua Bengio, et al. On Using Very Large Target Vocabulary for Neural Machine Translation, 2014, ACL.
[8] Quoc V. Le, et al. Addressing the Rare Word Problem in Neural Machine Translation, 2014, ACL.
[9] Yang Liu, et al. Modeling Coverage for Neural Machine Translation, 2016, ACL.
[10] Yoshua Bengio, et al. Neural Machine Translation by Jointly Learning to Align and Translate, 2014, ICLR.
[11] Shi Feng, et al. Implicit Distortion and Fertility Models for Attention-based Encoder-Decoder NMT Model, 2016, ArXiv.
[12] Salim Roukos, et al. Bleu: a Method for Automatic Evaluation of Machine Translation, 2002, ACL.
[13] Phil Blunsom, et al. Recurrent Continuous Translation Models, 2013, EMNLP.
[14] José A. R. Fonollosa, et al. Character-based Neural Machine Translation, 2016, ACL.
[15] Quoc V. Le, et al. Sequence to Sequence Learning with Neural Networks, 2014, NIPS.