LVBERT: Transformer-Based Model for Latvian Language Understanding
[1] Taku Kudo, et al. SentencePiece: A simple and language independent subword tokenizer and detokenizer for Neural Text Processing, 2018, EMNLP.
[2] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[3] Jeffrey Dean, et al. Distributed Representations of Words and Phrases and their Compositionality, 2013, NIPS.
[4] Tapio Salakoski, et al. Multilingual is not enough: BERT for Finnish, 2019, ArXiv.
[5] Arturs Znotins, et al. NLP-PIPE: Latvian NLP Tool Pipeline, 2018, Baltic HLT.
[6] Peteris Paikens, et al. Creation of a Balanced State-of-the-Art Multilayer Corpus for NLU, 2018, LREC.
[7] Tomohide Shibata. Understand in 5 Minutes!? Skimming Famous Papers: Jacob Devlin et al.: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2020.
[8] Veselin Stoyanov, et al. Unsupervised Cross-lingual Representation Learning at Scale, 2019, ACL.
[9] Sebastian Ruder, et al. Universal Language Model Fine-tuning for Text Classification, 2018, ACL.
[10] Peteris Paikens, et al. Morphological analysis with limited resources: Latvian example, 2013, NODALIDA.
[11] Laurent Romary, et al. CamemBERT: a Tasty French Language Model, 2019, ACL.
[12] Peteris Paikens, et al. Deep Neural Learning Approaches for Latvian Morphological Tagging, 2016, Baltic HLT.
[13] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[14] Sanja Fidler, et al. Aligning Books and Movies: Towards Story-Like Visual Explanations by Watching Movies and Reading Books, 2015, ICCV.
[15] Timothy Dozat, et al. Deep Biaffine Attention for Neural Dependency Parsing, 2016, ICLR.