Option Attentive Capsule Network for Multi-choice Reading Comprehension