[1] Lav R. Varshney, et al. CTRL: A Conditional Transformer Language Model for Controllable Generation, 2019, ArXiv.
[2] Alec Radford, et al. Improving Language Understanding by Generative Pre-Training, 2018.
[3] Charles A. Sutton, et al. Autoencoding Variational Inference For Topic Models, 2017, ICLR.
[4] Ilya Sutskever, et al. Language Models are Unsupervised Multitask Learners, 2019.
[5] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[6] Sebastian Ruder, et al. Universal Language Model Fine-tuning for Text Classification, 2018, ACL.
[7] Philipp Koehn, et al. Europarl: A Parallel Corpus for Statistical Machine Translation, 2005, MTSUMMIT.
[8] Jason Yosinski, et al. Plug and Play Language Models: A Simple Approach to Controlled Text Generation, 2020, ICLR.
[9] Michael I. Jordan, et al. Latent Dirichlet Allocation, 2003, J. Mach. Learn. Res.
[10] Dirk Hovy, et al. Pre-training is a Hot Topic: Contextualized Document Embeddings Improve Topic Coherence, 2020, ArXiv.
[11] Malvina Nissim, et al. GePpeTto Carves Italian into a Language Model, 2020, CLiC-it.
[12] Rémi Louf, et al. HuggingFace's Transformers: State-of-the-art Natural Language Processing, 2019, ArXiv.
[13] Mark Chen, et al. Language Models are Few-Shot Learners, 2020, NeurIPS.
[14] Dirk Hovy, et al. Cross-lingual Contextualized Topic Models with Zero-shot Learning, 2020, EACL.
[15] Omer Levy, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach, 2019, ArXiv.
[16] Percy Liang, et al. Delete, Retrieve, Generate: a Simple Approach to Sentiment and Style Transfer, 2018, NAACL.