Feedforward sequential memory networks based encoder-decoder model for machine translation
Shiliang Zhang | Li-Rong Dai | Hui Jiang | Junfeng Hou