HuggingFace's Transformers: State-of-the-art Natural Language Processing
Thomas Wolf | Lysandre Debut | Victor Sanh | Julien Chaumond | Clement Delangue | Anthony Moi | Pierric Cistac | Tim Rault | Rémi Louf | Morgan Funtowicz | Jamie Brew
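The paper describes the open-source Transformers library, which exposes pretrained models through a unified API of "Auto" classes and task pipelines. As a minimal illustrative sketch (not reproduced from the paper; the model identifier, example strings, and the default model a pipeline downloads are assumptions that vary by library version):

    from transformers import AutoTokenizer, AutoModel, pipeline

    # Unified loading by model name: the "Auto" classes resolve the
    # architecture from the identifier ("bert-base-uncased" is one hub model).
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    # Tokenize a sentence into PyTorch tensors and run a forward pass;
    # the outputs contain the model's final hidden states.
    inputs = tokenizer("Hello, Transformers!", return_tensors="pt")
    outputs = model(**inputs)

    # High-level task API: a default pretrained model is downloaded on
    # first use; the result is a list of {"label": ..., "score": ...} dicts.
    classifier = pipeline("sentiment-analysis")
    print(classifier("Transformers makes state-of-the-art NLP easy to use."))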
[1] Arman Cohan, et al. Longformer: The Long-Document Transformer, 2020, ArXiv.
[2] Sebastian Ruder, et al. Universal Language Model Fine-tuning for Text Classification, 2018, ACL.
[3] Eric P. Xing, et al. Texar: A Modularized, Versatile, and Extensible Toolkit for Text Generation, 2018, ACL.
[4] Colin Raffel, et al. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer, 2019, J. Mach. Learn. Res.
[5] Douwe Kiela, et al. Supervised Multimodal Bitransformers for Classifying Images and Text, 2019, ViGIL@NeurIPS.
[6] Kevin Gimpel, et al. ALBERT: A Lite BERT for Self-supervised Learning of Language Representations, 2019, ICLR.
[7] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[8] Haichen Shen, et al. TVM: An Automated End-to-End Optimizing Compiler for Deep Learning, 2018, OSDI.
[9] Omer Levy, et al. SpanBERT: Improving Pre-training by Representing and Predicting Spans, 2019, TACL.
[10] Guillaume Lample, et al. Cross-lingual Language Model Pretraining, 2019, NeurIPS.
[11] Alex Wang, et al. jiant: A Software Toolkit for Research on General-Purpose Text Understanding Models, 2020, ACL.
[12] Lukasz Kaiser, et al. Attention Is All You Need, 2017, NIPS.
[13] Omer Levy, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach, 2019, ArXiv.
[14] Christopher D. Manning, et al. Stanza: A Python Natural Language Processing Toolkit for Many Human Languages, 2020, ACL.
[15] Omer Levy, et al. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding, 2018, BlackboxNLP@EMNLP.
[16] Alexander M. Rush, et al. LSTMVis: A Tool for Visual Analysis of Hidden State Dynamics in Recurrent Neural Networks, 2016, IEEE Transactions on Visualization and Computer Graphics.
[17] Omer Levy, et al. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension, 2019, ACL.
[18] Omer Levy, et al. SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems, 2019, NeurIPS.
[19] Myle Ott, et al. fairseq: A Fast, Extensible Toolkit for Sequence Modeling, 2019, NAACL.
[20] Alexander M. Rush, et al. OpenNMT: Open-Source Toolkit for Neural Machine Translation, 2017, ACL.
[21] Mihai Surdeanu, et al. The Stanford CoreNLP Natural Language Processing Toolkit, 2014, ACL.
[22] Thomas Wolf, et al. DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter, 2019, ArXiv.
[23] Yiming Yang, et al. XLNet: Generalized Autoregressive Pretraining for Language Understanding, 2019, NeurIPS.
[24] Inioluwa Deborah Raji, et al. Model Cards for Model Reporting, 2018, FAT.
[25] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[26] Sebastian Gehrmann, et al. exBERT: A Visual Analysis Tool to Explore Learned Representations in Transformers Models, 2019, ArXiv.
[27] Steven Bird, et al. NLTK: The Natural Language Toolkit, 2002, ACL.
[28] Iz Beltagy, et al. SciBERT: A Pretrained Language Model for Scientific Text, 2019, EMNLP.
[29] Mohammad Shoeybi, et al. Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism, 2019, ArXiv.
[30] Lukasz Kaiser, et al. Reformer: The Efficient Transformer, 2020, ICLR.
[31] Yejin Choi, et al. COMET: Commonsense Transformers for Automatic Knowledge Graph Construction, 2019, ACL.
[32] Quoc V. Le, et al. ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators, 2020, ICLR.
[33] Dipanjan Das, et al. BERT Rediscovers the Classical NLP Pipeline, 2019, ACL.
[34] Richard Socher, et al. Learned in Translation: Contextualized Word Vectors, 2017, NIPS.
[35] Luke S. Zettlemoyer, et al. AllenNLP: A Deep Semantic Natural Language Processing Platform, 2018, ArXiv.
[36] Benjamin Lecouteux, et al. FlauBERT: Unsupervised Language Model Pre-training for French, 2020, LREC.
[37] Roland Vollgraf, et al. FLAIR: An Easy-to-Use Framework for State-of-the-Art NLP, 2019, NAACL.
[38] Yiming Yang, et al. Transformer-XL: Attentive Language Models beyond a Fixed-Length Context, 2019, ACL.
[39] Ilya Sutskever, et al. Language Models are Unsupervised Multitask Learners, 2019.