A Reading Comprehension Style Question Answering Model Based On Attention Mechanism