Adjusting BERT's Pooling Layer for Large-Scale Multi-Label Text Classification
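The title refers to modifying BERT's default [CLS]-based pooling when fine-tuning for large-scale multi-label classification. As a rough, illustrative sketch only (not the paper's own method), the snippet below mean-pools the token embeddings under the attention mask and attaches a sigmoid multi-label head; the class name, the bert-base-uncased checkpoint, and the label count are assumptions for illustration, using the Hugging Face transformers API.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class BertMeanPoolClassifier(nn.Module):
    """BERT encoder with masked mean pooling over token embeddings
    and a linear multi-label head (one logit per label)."""

    def __init__(self, model_name: str = "bert-base-uncased", num_labels: int = 100):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        self.classifier = nn.Linear(self.encoder.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        # last_hidden_state: (batch, seq_len, hidden)
        hidden = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        # Zero out padding positions, then average the remaining token vectors
        # instead of using the default [CLS] pooler output.
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (hidden * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
        return self.classifier(pooled)  # raw logits; apply sigmoid outside


# Hypothetical usage: labels are independent, so each logit is squashed separately.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = BertMeanPoolClassifier(num_labels=4)
batch = tokenizer(["EU regulation on data protection"], return_tensors="pt",
                  padding=True, truncation=True)
logits = model(batch["input_ids"], batch["attention_mask"])
probs = torch.sigmoid(logits)
# Training would typically use nn.BCEWithLogitsLoss against multi-hot label vectors.
```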