How to Answer Comparison Questions
Xin Chen | Yu Hong | Hongxuan Tang | Kaili Wu | Min Zhang
[1] Phil Blunsom, et al. Teaching Machines to Read and Comprehend, 2015, NIPS.
[2] Ming Zhou, et al. Reinforced Mnemonic Reader for Machine Reading Comprehension, 2017, IJCAI.
[3] Jason Weston, et al. Reading Wikipedia to Answer Open-Domain Questions, 2017, ACL.
[4] Shuohang Wang, et al. Machine Comprehension Using Match-LSTM and Answer Pointer, 2016, ICLR.
[5] Lukás Burget, et al. Recurrent neural network based language model, 2010, INTERSPEECH.
[6] Jian Zhang, et al. SQuAD: 100,000+ Questions for Machine Comprehension of Text, 2016, EMNLP.
[7] Yoshua Bengio, et al. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation, 2014, EMNLP.
[8] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[9] Danqi Chen, et al. CoQA: A Conversational Question Answering Challenge, 2018, TACL.
[10] Navdeep Jaitly, et al. Pointer Networks, 2015, NIPS.
[11] Matthew Richardson, et al. MCTest: A Challenge Dataset for the Open-Domain Machine Comprehension of Text, 2013, EMNLP.
[12] Yoshua Bengio, et al. HotpotQA: A Dataset for Diverse, Explainable Multi-hop Question Answering, 2018, EMNLP.
[13] Jason Weston, et al. The Goldilocks Principle: Reading Children's Books with Explicit Memory Representations, 2015, ICLR.
[14] Eunsol Choi, et al. Conversational Machine Comprehension, 2019.
[15] Hannaneh Hajishirzi, et al. Multi-hop Reading Comprehension through Question Decomposition and Rescoring, 2019, ACL.
[16] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[17] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[18] Ali Farhadi, et al. Bidirectional Attention Flow for Machine Comprehension, 2016, ICLR.
[19] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.