DaCy: A Unified Framework for Danish NLP