A surprisingly effective out-of-the-box char2char model on the E2E NLG Challenge dataset
[1] José A. R. Fonollosa,et al. Character-based Neural Machine Translation , 2016, ACL.
[2] David Vandyke,et al. Semantically Conditioned LSTM-based Natural Language Generation for Spoken Dialogue Systems , 2015, EMNLP.
[3] Yoshua Bengio,et al. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation , 2014, EMNLP.
[4] Martín Abadi,et al. TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems , 2016, ArXiv.
[5] Hang Li,et al. Incorporating Copying Mechanism in Sequence-to-Sequence Learning , 2016, ACL.
[6] Yoshua Bengio,et al. Neural Machine Translation by Jointly Learning to Align and Translate , 2014, ICLR.
[7] Quoc V. Le,et al. Sequence to Sequence Learning with Neural Networks , 2014, NIPS.
[8] Quoc V. Le,et al. Massive Exploration of Neural Machine Translation Architectures , 2017, EMNLP.
[9] Oliver Lemon,et al. Crowd-sourcing NLG Data: Pictures Elicit Better Data , 2016, INLG.
[10] Will Radford,et al. Learning to generate one-sentence biographies from Wikidata , 2017, EACL.
[11] George Kurian,et al. Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation , 2016, ArXiv.
[12] Marc Dymetman,et al. Natural Language Generation through Character-based RNNs with Finite-state Prior Knowledge , 2016, COLING.
[13] Christopher D. Manning,et al. Effective Approaches to Attention-based Neural Machine Translation , 2015, EMNLP.
[14] Jürgen Schmidhuber,et al. Long Short-Term Memory , 1997, Neural Computation.