Guillaume Klein | Yoon Kim | Yuntian Deng | Jean Senellart | Alexander M. Rush
[1] Jürgen Schmidhuber,et al. Long Short-Term Memory , 1997, Neural Computation.
[2] Samy Bengio,et al. Generating Sentences from a Continuous Space , 2015, CoNLL.
[3] Diyi Yang,et al. Hierarchical Attention Networks for Document Classification , 2016, NAACL.
[4] Yoshua Bengio,et al. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling , 2014, ArXiv.
[5] Rico Sennrich,et al. Neural Machine Translation of Rare Words with Subword Units , 2015, ACL.
[6] Jason Weston,et al. Memory Networks , 2014, ICLR.
[7] Marc'Aurelio Ranzato,et al. Large Scale Distributed Deep Networks , 2012, NIPS.
[8] Quoc V. Le,et al. A Neural Conversational Model , 2015, ArXiv.
[9] Yoshua Bengio,et al. Neural Machine Translation by Jointly Learning to Align and Translate , 2014, ICLR.
[10] George Kurian,et al. Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation , 2016, ArXiv.
[11] Quoc V. Le,et al. Listen, Attend and Spell , 2015, ArXiv.
[12] Alexander M. Rush,et al. What You Get Is What You See: A Visual Markup Decompiler , 2016, ArXiv.
[13] Alexander M. Rush,et al. Abstractive Sentence Summarization with Attentive Recurrent Neural Networks , 2016, NAACL.
[14] Philipp Koehn,et al. Moses: Open Source Toolkit for Statistical Machine Translation , 2007, ACL.
[15] Bo Wang,et al. SYSTRAN's Pure Neural Machine Translation Systems , 2016, ArXiv.
[16] Yoshua Bengio,et al. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation , 2014, EMNLP.
[17] Quoc V. Le,et al. Sequence to Sequence Learning with Neural Networks , 2014, NIPS.
[18] Vladimir Eidelman,et al. cdec: A Decoder, Alignment, and Learning Framework for Finite-State and Context-Free Translation Models , 2010, ACL.
[19] Rico Sennrich,et al. Linguistic Input Features Improve Neural Machine Translation , 2016, WMT.
[20] Ramón Fernández Astudillo,et al. From Softmax to Sparsemax: A Sparse Model of Attention and Multi-Label Classification , 2016, ICML.
[21] Christopher D. Manning,et al. Effective Approaches to Attention-based Neural Machine Translation , 2015, EMNLP.
[22] Graham Neubig,et al. Neural Machine Translation and Sequence-to-sequence Models: A Tutorial , 2017, ArXiv.
[23] Graham Neubig,et al. Travatar: A Forest-to-String Machine Translation Engine based on Tree Transducers , 2013, ACL.
[24] Martin Wattenberg,et al. Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation , 2016, TACL.
[25] Yang Wang,et al. rnn: Recurrent Library for Torch , 2015, ArXiv.
[26] Yoshua Bengio,et al. Show, Attend and Tell: Neural Image Caption Generation with Visual Attention , 2015, ICML.
[27] Bowen Zhou,et al. Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond , 2016, CoNLL.