Predicting Twitter Engagement With Deep Language Models
Maksims Volkovs | Zhaoyue Cheng | Aidan N. Gomez | Nick Frosst | Jin Peng Zhou | Kevin Shen | Stephen Gou | Saba Zuberi | Carol Chen | Mathieu Ravaut | Hojin Yang | Anson Wong | Ivan Zhang | Helen Ngo | Bharat Venkitesh
[1] Quoc V. Le, et al. ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators. ICLR, 2020.
[2] Veselin Stoyanov, et al. Unsupervised Cross-lingual Representation Learning at Scale. ACL, 2020.
[3] Frank Hutter, et al. Decoupled Weight Decay Regularization. ICLR, 2019.
[4] Jessie J. Smith, et al. Privacy-Preserving Recommender Systems Challenge on Twitter's Home Timeline. arXiv, 2020.
[5] Omer Levy, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach. arXiv, 2019.
[6] Kevin Gimpel, et al. ALBERT: A Lite BERT for Self-supervised Learning of Language Representations. ICLR, 2020.
[7] Lukasz Kaiser, et al. Attention Is All You Need. NIPS, 2017.
[8] Kartik Talamadupula, et al. Predicting User Engagement on Twitter with Real-World Events. ICWSM, 2015.
[9] Richard A. Johnson, et al. A new family of power transformations to improve normality or symmetry. 2000.
[10] Ivan Vulic, et al. Unsupervised Cross-Lingual Representation Learning. ACL, 2019.
[11] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. NAACL, 2019.
[12] John Riedl, et al. Item-based collaborative filtering recommendation algorithms. WWW, 2001.