How Self-Attention Improves Rare Class Performance in a Question-Answering Dialogue Agent