Tapio Salakoski | Filip Ginter | Jenna Kanerva | Samuel Rönnqvist
[1] Benoît Sagot, et al. What Does BERT Learn about the Structure of Language?, 2019, ACL.
[2] Ilya Sutskever, et al. Language Models are Unsupervised Multitask Learners, 2019.
[3] Allyson Ettinger. What BERT Is Not: Lessons from a New Suite of Psycholinguistic Diagnostics for Language Models, 2019, TACL.
[4] Eva Schlinger, et al. How Multilingual is Multilingual BERT?, 2019, ACL.
[5] Robert Frank, et al. Open Sesame: Getting inside BERT's Linguistic Knowledge, 2019, BlackboxNLP@ACL.
[6] Dipanjan Das, et al. BERT Rediscovers the Classical NLP Pipeline, 2019, ACL.
[7] Omer Levy, et al. What Does BERT Look at? An Analysis of BERT's Attention, 2019, BlackboxNLP@ACL.
[8] Alex Wang, et al. BERT has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model, 2019, Proceedings of the Workshop on Methods for Optimizing and Evaluating Neural Language Generation.
[9] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[10] Sampo Pyysalo, et al. Universal Dependencies v1: A Multilingual Treebank Collection, 2016, LREC.
[11] Yoav Goldberg, et al. Assessing BERT's Syntactic Abilities, 2019, arXiv.
[12] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[13] Mark Dredze, et al. Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT, 2019, EMNLP.