[1] Eunsol Choi, et al. Conversational Machine Comprehension, 2019.
[2] Yelong Shen, et al. FusionNet: Fusing via Fully-Aware Attention with Application to Machine Comprehension, 2017, ICLR.
[3] Jürgen Schmidhuber, et al. Long Short-Term Memory, 1997, Neural Computation.
[4] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[5] Diederik P. Kingma, et al. Variational Dropout and the Local Reparameterization Trick, 2015, NIPS.
[6] Jason Weston, et al. Reading Wikipedia to Answer Open-Domain Questions, 2017, ACL.
[7] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[8] Xiaodong Liu, et al. Stochastic Answer Networks for Machine Reading Comprehension, 2017, ACL.
[9] Christopher D. Manning, et al. Get To The Point: Summarization with Pointer-Generator Networks, 2017, ACL.
[10] Eunsol Choi, et al. QuAC: Question Answering in Context, 2018, EMNLP.
[11] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[12] Mark Yatskar, et al. A Qualitative Comparison of CoQA, SQuAD 2.0 and QuAC, 2018, NAACL.
[13] Danqi Chen, et al. CoQA: A Conversational Question Answering Challenge, 2018, TACL.
[14] Rico Sennrich, et al. Neural Machine Translation of Rare Words with Subword Units, 2015, ACL.
[15] Ahmed Elgohary, et al. A dataset and baselines for sequential open-domain question answering, 2018, EMNLP.
[16] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[17] Ali Farhadi, et al. Bidirectional Attention Flow for Machine Comprehension, 2016, ICLR.