A Survey of Domain Adaptation for Machine Translation
