Attention Mechanism with BERT for Content Annotation and Categorization of Pregnancy-Related Questions on a Community Q&A Site
Zhan Zhang | Zhe He | Xiao Luo | Haoran Ding | Matthew Tang | Priyanka Gandhi