[1] Roberto Navigli, et al. Knowledge-enhanced document embeddings for text classification, 2019, Knowl. Based Syst.
[2] Christopher D. Manning, et al. Learning Distributed Representations for Multilingual Text Sequences, 2015, VS@HLT-NAACL.
[3] Lukasz Kaiser, et al. Attention Is All You Need, 2017, NIPS.
[4] Guillaume Lample, et al. Massively Multilingual Word Embeddings, 2016, arXiv.
[5] Holger Schwenk, et al. Margin-based Parallel Corpus Mining with Multilingual Sentence Embeddings, 2018, ACL.
[6] Graham Neubig, et al. Should All Cross-Lingual Embeddings Speak English?, 2020, ACL.
[7] Holger Schwenk, et al. A Corpus for Multilingual Document Classification in Eight Languages, 2018, LREC.
[8] Olivier Pietquin, et al. MultiVec: A Multilingual and Multilevel Representation Learning Toolkit for NLP, 2016, LREC.
[9] Yoshua Bengio, et al. BilBOWA: Fast Bilingual Distributed Representations without Word Alignments, 2014, ICML.
[10] Ivan Titov, et al. Inducing Crosslingual Distributed Representations of Words, 2012, COLING.
[11] Yiming Yang, et al. RCV1: A New Benchmark Collection for Text Categorization Research, 2004, J. Mach. Learn. Res.
[12] Achim Rettinger, et al. Bilingual Word Embeddings from Parallel and Non-parallel Corpora for Cross-Language Text Classification, 2016, NAACL.
[13] Guillaume Wenzek, et al. Trans-gram, Fast Cross-lingual Word-embeddings, 2015, EMNLP.
[14] Phil Blunsom, et al. Multilingual Models for Compositional Distributed Semantics, 2014, ACL.
[15] Holger Schwenk, et al. Massively Multilingual Sentence Embeddings for Zero-Shot Cross-Lingual Transfer and Beyond, 2018, Transactions of the Association for Computational Linguistics.
[16] Philipp Koehn, et al. Europarl: A Parallel Corpus for Statistical Machine Translation, 2005, MT Summit.
[17] Yiming Yang, et al. XLNet: Generalized Autoregressive Pretraining for Language Understanding, 2019, NeurIPS.
[18] Matthijs Douze, et al. Learning Joint Multilingual Sentence Representations with Neural Machine Translation, 2017, Rep4NLP@ACL.
[19] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[20] Hugo Larochelle, et al. An Autoencoder Approach to Learning Bilingual Word Representations, 2014, NIPS.
[21] Myle Ott, et al. fairseq: A Fast, Extensible Toolkit for Sequence Modeling, 2019, NAACL.
[22] Quoc V. Le, et al. Distributed Representations of Sentences and Documents, 2014, ICML.
[23] Mark Dredze, et al. Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT, 2019, EMNLP.
[24] André F. T. Martins, et al. Jointly Learning to Embed and Predict with Multiple Languages, 2016, ACL.