KARNA at COIN Shared Task 1: Bidirectional Encoder Representations from Transformers with relational knowledge for machine comprehension with common sense
[1] Yejin Choi,et al. Event2Mind: Commonsense Inference on Events, Intents, and Reactions , 2018, ACL.
[2] Simon Ostermann,et al. SemEval-2018 Task 11: Machine Comprehension Using Commonsense Knowledge , 2018, SemEval@NAACL-HLT.
[3] Gerhard Weikum,et al. WebChild 2.0: Fine-Grained Commonsense Knowledge Distillation , 2017, ACL.
[4] Alec Radford,et al. Improving Language Understanding by Generative Pre-Training , 2018 .
[5] Xinlei Chen,et al. Never-Ending Learning , 2012, ECAI.
[6] Wei Zhao,et al. Yuanfudao at SemEval-2018 Task 11: Three-way Attention and Relational Knowledge for Commonsense Machine Comprehension , 2018, SemEval@NAACL-HLT.
[7] Vatsal Mahajan. Winograd Schema - Knowledge Extraction Using Narrative Chains , 2018, ArXiv.
[8] Guokun Lai,et al. RACE: Large-scale ReAding Comprehension Dataset From Examinations , 2017, EMNLP.
[9] Yejin Choi,et al. SWAG: A Large-Scale Adversarial Dataset for Grounded Commonsense Inference , 2018, EMNLP.
[10] Ming-Wei Chang,et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding , 2019, NAACL.