Can Monolingual Pretrained Models Help Cross-Lingual Classification?
Li Dong, Furu Wei, Zewen Chi, Heyan Huang, Xian-Ling Mao
[1] Claire Cardie,et al. Adversarial Deep Averaging Networks for Cross-Lingual Sentiment Classification , 2016, TACL.
[2] Yichao Lu,et al. Adversarial Learning with Contextual Embeddings for Zero-resource Cross-lingual Classification and NER , 2019, EMNLP/IJCNLP.
[3] Mark Dredze,et al. Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT , 2019, EMNLP.
[4] Veselin Stoyanov,et al. Unsupervised Cross-lingual Representation Learning at Scale , 2019, ACL.
[5] Li Dong,et al. Cross-Lingual Natural Language Generation via Pre-Training , 2020, AAAI.
[6] Ankur Bapna,et al. Massively Multilingual Neural Machine Translation in the Wild: Findings and Challenges , 2019, ArXiv.
[7] Benno Stein,et al. Cross-Language Text Classification Using Structural Correspondence Learning , 2010, ACL.
[8] Eva Schlinger,et al. How Multilingual is Multilingual BERT? , 2019, ACL.
[9] Guillaume Lample,et al. Cross-lingual Language Model Pretraining , 2019, NeurIPS.
[10] Haoran Li,et al. Multilingual Seq2seq Training with Similarity Loss for Cross-Lingual Document Classification , 2018, Rep4NLP@ACL.
[11] Yiming Yang,et al. Cross-lingual Distillation for Text Classification , 2017, ACL.
[12] Guillaume Lample,et al. XNLI: Evaluating Cross-lingual Sentence Representations , 2018, EMNLP.
[13] Lukasz Kaiser,et al. Attention is All you Need , 2017, NIPS.
[14] Sebastian Ruder,et al. MultiFiT: Efficient Multi-lingual Language Model Fine-tuning , 2019, EMNLP/IJCNLP.
[15] Holger Schwenk,et al. Massively Multilingual Sentence Embeddings for Zero-Shot Cross-Lingual Transfer and Beyond , 2018, TACL.
[16] Dong-Hyun Lee,et al. Pseudo-Label: The Simple and Efficient Semi-Supervised Learning Method for Deep Neural Networks , 2013.
[17] Ming Zhou,et al. InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training , 2020, NAACL.
[18] Rico Sennrich,et al. Neural Machine Translation of Rare Words with Subword Units , 2015, ACL.
[19] Ming-Wei Chang,et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding , 2019, NAACL.
[20] Geoffrey E. Hinton,et al. Distilling the Knowledge in a Neural Network , 2015, ArXiv.