ATNet: Answering Cloze-Style Questions via Intra-attention and Inter-attention