Probing Multilingual Sentence Representations With X-Probe