Neural Machine Translation with the Transformer and Multi-Source Romance Languages for the Biomedical WMT 2018 task