Finding Universal Grammatical Relations in Multilingual BERT
[1] Guillaume Lample, et al. Word Translation Without Parallel Data, 2018, ICLR.
[2] Sampo Pyysalo, et al. Universal Dependencies v2: An Evergrowing Multilingual Treebank Collection, 2020, LREC.
[3] Samuel R. Bowman, et al. A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference, 2018, NAACL.
[4] Julian Michael, et al. Asking without Telling: Exploring Latent Ontologies in Contextual Representations, 2020, EMNLP.
[5] Martin Wattenberg, et al. How to Use t-SNE Effectively, 2016, Distill.
[6] Veselin Stoyanov, et al. Emerging Cross-lingual Structure in Pretrained Language Models, 2020, ACL.
[7] Dan Klein, et al. Multilingual Alignment of Contextual Word Representations, 2020, ICLR.
[8] Dipanjan Das, et al. BERT Rediscovers the Classical NLP Pipeline, 2019, ACL.
[9] Martin Wattenberg, et al. Visualizing and Measuring the Geometry of BERT, 2019, NeurIPS.
[10] Dan Roth, et al. Cross-Lingual Ability of Multilingual BERT: An Empirical Study, 2020, ICLR.
[11] Yijia Liu, et al. Cross-Lingual BERT Transformation for Zero-Shot Dependency Parsing, 2019, EMNLP.
[12] Alex Wang, et al. What do you learn from context? Probing for sentence structure in contextualized word representations, 2019, ICLR.
[13] Mark Dredze, et al. Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT, 2019, EMNLP.
[14] Alexander M. Fraser, et al. How Language-Neutral is Multilingual BERT?, 2019, arXiv.
[15] Ankur Bapna, et al. Investigating Multilingual NMT Representations at Scale, 2019, EMNLP.
[16] Eva Schlinger, et al. How Multilingual is Multilingual BERT?, 2019, ACL.
[17] Regina Barzilay, et al. Cross-Lingual Alignment of Contextual Word Embeddings, with Applications to Zero-shot Dependency Parsing, 2019, NAACL.
[18] Martin Potthast, et al. CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies, 2018, CoNLL.
[19] Yonatan Belinkov, et al. Linguistic Knowledge and Transferability of Contextual Representations, 2019, NAACL.
[20] Veselin Stoyanov, et al. Unsupervised Cross-lingual Representation Learning at Scale, 2020, ACL.
[21] Ryan Cotterell, et al. A Tale of a Probe and a Parser, 2020, ACL.
[22] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[23] Christopher D. Manning, et al. A Structural Probe for Finding Syntax in Word Representations, 2019, NAACL.
[24] Richard Socher, et al. BERT is Not an Interlingua and the Bias of Tokenization, 2019, EMNLP.
[25] Guillaume Lample, et al. What you can cram into a single $&!#* vector: Probing sentence embeddings for linguistic properties, 2018, ACL.
[26] Beatrice Santorini, et al. Building a Large Annotated Corpus of English: The Penn Treebank, 1993, CL.
[27] Guillaume Lample, et al. Cross-lingual Language Model Pretraining, 2019, NeurIPS.
[28] John Hewitt, et al. Designing and Interpreting Probes with Control Tasks, 2019, EMNLP.
[29] Geoffrey E. Hinton, et al. Visualizing Data using t-SNE, 2008, JMLR.
[30] Jason Eisner, et al. Specializing Word Embeddings (for Parsing) by Information Bottleneck, 2019, EMNLP.