A Stance Detection Approach Based on Generalized Autoregressive Pretrained Language Model in Chinese Microblogs
Zhizhong Su | Yaoyi Xi | Rong Cao | Huifeng Tang | Hangyu Pan
[1] Yu Zhou, et al. Overview of NLPCC Shared Task 4: Stance Detection in Chinese Microblogs, 2016, NLPCC/ICCPOL.
[2] Zhonglei Lu, et al. Approach of Stance Detection in Micro-blog Based on Transfer Learning and Multi-representation (一种基于迁移学习及多表征的微博立场分析方法), 2018, Computer Science (计算机科学).
[3] Yulan He, et al. Stance Classification with Target-Specific Neural Attention Networks, 2017.
[4] Hermann Ney, et al. LSTM Neural Networks for Language Modeling, 2012, INTERSPEECH.
[5] Yiming Yang, et al. Transformer-XL: Attentive Language Models beyond a Fixed-Length Context, 2019, ACL.
[6] Ruifeng Xu, et al. Stance Classification with Target-specific Neural Attention, 2017, IJCAI.
[7] Douglas Biber, et al. Adverbial stance types in English, 1988.
[8] Lukás Burget, et al. Extensions of recurrent neural network language model, 2011, 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[9] Guodong Zhou, et al. Exploring Various Linguistic Features for Stance Detection, 2016, NLPCC/ICCPOL.
[10] Yann LeCun, et al. Convolutional networks and applications in vision, 2010, Proceedings of 2010 IEEE International Symposium on Circuits and Systems.
[11] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[12] Yu Fan, et al. Survey of Natural Language Processing Pre-training Techniques (面向自然语言处理的预训练技术研究综述), 2020, Computer Science (计算机科学).
[14] Yiming Yang, et al. XLNet: Generalized Autoregressive Pretraining for Language Understanding, 2019, NeurIPS.
[15] Jeffrey Dean, et al. Efficient Estimation of Word Representations in Vector Space, 2013, ICLR.
[16] Wanxiang Che, et al. Revisiting Pre-Trained Models for Chinese Natural Language Processing, 2020, Findings of EMNLP.