Fighting Filterbubbles with Adversarial Training
[1] Eli Pariser, et al. The Filter Bubble: What the Internet Is Hiding from You , 2011 .
[2] Thomas Wolf, et al. DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter , 2019, ArXiv.
[3] Rémi Louf, et al. HuggingFace's Transformers: State-of-the-art Natural Language Processing , 2019, ArXiv.
[4] Xing Xie, et al. MIND: A Large-scale Dataset for News Recommendation , 2020, ACL.
[5] Lukasz Kaiser, et al. Attention is All you Need , 2017, NIPS.
[6] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding , 2019, NAACL.
[7] Yoshua Bengio, et al. Generative Adversarial Nets , 2014, NIPS.
[8] Andreas Spitz, et al. Exploring Significant Interactions in Live News , 2018, NewsIR@ECIR.