Debugging Translations of Transformer-based Neural Machine Translation Systems
[1] Mark Fishel,et al. Visualizing Neural Machine Translation Attention and Confidence , 2017, Prague Bull. Math. Linguistics.
[2] Tao Qin,et al. Incorporating BERT into Neural Machine Translation , 2020, ICLR.
[3] Noah A. Smith,et al. A Simple, Fast, and Effective Reparameterization of IBM Model 2 , 2013, NAACL.
[4] Lemao Liu,et al. Neural Machine Translation with Supervised Attention , 2016, COLING.
[5] Inguna Skadiņa, et al. Collecting and Using Comparable Corpora for Statistical Machine Translation , 2012, LREC.
[6] Maja Popović, et al. chrF: character n-gram F-score for automatic MT evaluation , 2015, WMT@EMNLP.
[7] Yann Dauphin,et al. Convolutional Sequence to Sequence Learning , 2017, ICML.
[8] Wenhu Chen,et al. Guided Alignment Training for Topic-Aware Neural Machine Translation , 2016, AMTA.
[9] Mark Fishel,et al. Confidence through Attention , 2017, MTSummit.
[10] Matt Post, et al. Sockeye: A Toolkit for Neural Machine Translation , 2018.
[11] André F. T. Martins,et al. Marian: Fast Neural Machine Translation in C++ , 2018, ACL.
[12] Yoshua Bengio,et al. Neural Machine Translation by Jointly Learning to Align and Translate , 2014, ICLR.
[13] Matthew G. Snover,et al. A Study of Translation Edit Rate with Targeted Human Annotation , 2006, AMTA.
[14] Lukasz Kaiser,et al. Attention is All you Need , 2017, NIPS.
[15] Philipp Koehn,et al. Findings of the 2018 Conference on Machine Translation (WMT18) , 2018, WMT.
[16] Mārcis Pinnis, et al. Integration of Neural Machine Translation Systems for Formatting-Rich Document Translation , 2018, NLDB.