Commonsense Evidence Generation and Injection in Reading Comprehension
[1] Hai Zhao, et al. Dual Co-Matching Network for Multi-choice Reading Comprehension, 2020, AAAI.
[2] Michael McCloskey, et al. Catastrophic Interference in Connectionist Networks: The Sequential Learning Problem, 1989.
[3] Xiang Ren, et al. KagNet: Knowledge-Aware Graph Networks for Commonsense Reasoning, 2019, EMNLP.
[4] Ruize Wang, et al. K-Adapter: Infusing Knowledge into Pre-Trained Models with Adapters, 2020, ArXiv.
[5] Wentao Ma, et al. Convolutional Spatial Attention Model for Reading Comprehension with Multiple-Choice Questions, 2018, AAAI.
[6] Yejin Choi, et al. SWAG: A Large-Scale Adversarial Dataset for Grounded Commonsense Inference, 2018, EMNLP.
[7] Xiang Li, et al. Commonsense Knowledge Base Completion, 2016, ACL.
[8] H. Andersen. Abductive and Deductive Change, 1973.
[9] Omer Levy, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach, 2019, ArXiv.
[10] Junji Tomita, et al. Commonsense Knowledge Base Completion and Generation, 2018, CoNLL.
[11] Jianshu Chen, et al. Teaching Pretrained Models with Commonsense Reasoning: A Preliminary KB-Based Approach, 2019, ArXiv.
[12] Lei Zheng, et al. Texygen: A Benchmarking Platform for Text Generation Models, 2018, SIGIR.
[13] Yejin Choi, et al. COMET: Commonsense Transformers for Automatic Knowledge Graph Construction, 2019, ACL.
[14] Alex Wang, et al. BERT has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model, 2019, Proceedings of the Workshop on Methods for Optimizing and Evaluating Neural Language Generation.
[15] Jonathan Berant, et al. oLMpics-On What Language Model Pre-training Captures, 2019, Transactions of the Association for Computational Linguistics.
[16] Lukasz Kaiser, et al. Generating Wikipedia by Summarizing Long Sequences, 2018, ICLR.
[17] Christopher D. Manning, et al. Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks, 2015, ACL.
[18] Erik T. Mueller, et al. Open Mind Common Sense: Knowledge Acquisition from the General Public, 2002, OTM.
[19] Quoc V. Le, et al. QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension, 2018, ICLR.
[20] Maosong Sun, et al. ERNIE: Enhanced Language Representation with Informative Entities, 2019, ACL.
[21] Anna Korhonen, et al. Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity, 2019, COLING.
[22] Wenhan Xiong, et al. Pretrained Encyclopedia: Weakly Supervised Knowledge-Pretrained Language Model, 2019, ICLR.
[23] Wei Zhao, et al. Yuanfudao at SemEval-2018 Task 11: Three-way Attention and Relational Knowledge for Commonsense Machine Comprehension, 2018, SemEval@NAACL-HLT.
[24] Min Tang, et al. Multi-Matching Network for Multiple Choice Reading Comprehension, 2019, AAAI.
[25] Furu Wei, et al. Hierarchical Attention Flow for Multiple-Choice Reading Comprehension, 2018, AAAI.
[26] Richard Socher, et al. Explain Yourself! Leveraging Language Models for Commonsense Reasoning, 2019, ACL.
[27] Solomon Eyal Shimony, et al. Probabilistic Semantics for Cost Based Abduction, 1990, AAAI.
[28] Yejin Choi, et al. Cosmos QA: Machine Reading Comprehension with Contextual Commonsense Reasoning, 2019, EMNLP.
[29] Jerry R. Hobbs, et al. Interpretation as Abduction, 1993, Artificial Intelligence.
[30] Xiaodong Liu, et al. Unified Language Model Pre-training for Natural Language Understanding and Generation, 2019, NeurIPS.
[31] Goran Glavas, et al. Informing Unsupervised Pretraining with External Linguistic Knowledge, 2019, ArXiv.
[32] Yoav Shoham, et al. SenseBERT: Driving Some Sense into BERT, 2019, ACL.
[33] Catherine Havasi, et al. ConceptNet 5.5: An Open Multilingual Graph of General Knowledge, 2017, AAAI.
[34] Geoffrey E. Hinton, et al. Dynamic Routing Between Capsules, 2017, NIPS.
[35] Shiyu Chang, et al. A Co-Matching Model for Multi-choice Reading Comprehension, 2018, ACL.
[36] Jonathan Berant, et al. CommonsenseQA: A Question Answering Challenge Targeting Commonsense Knowledge, 2019, NAACL.
[37] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[38] Lantao Yu, et al. SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient, 2017, AAAI.
[39] Nan Duan, et al. Graph-Based Reasoning over Heterogeneous External Knowledge for Commonsense Question Answering, 2019, AAAI.