Review of Deep Learning Techniques for Improving the Performance of Machine Reading Comprehension Problem
[1] Richard Socher, et al. Dynamic Coattention Networks For Question Answering, 2016, ICLR.
[2] Yelong Shen, et al. ReasoNet: Learning to Stop Reading in Machine Comprehension, 2016, CoCo@NIPS.
[3] Wang Ling, et al. Finding Function in Form: Compositional Character Models for Open Vocabulary Word Representation, 2015, EMNLP.
[4] Philip Bachman, et al. NewsQA: A Machine Comprehension Dataset, 2016, Rep4NLP@ACL.
[5] Zhiguo Wang, et al. Multi-Perspective Context Matching for Machine Comprehension, 2016, ArXiv.
[6] Debajyoti Chatterjee. Making Neural Machine Reading Comprehension Faster, 2019, ArXiv.
[7] Walter Daelemans, et al. Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2014.
[8] Quoc V. Le, et al. QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension, 2018, ICLR.
[9] Ming Zhou, et al. Gated Self-Matching Networks for Reading Comprehension and Question Answering, 2017, ACL.
[10] Jason Weston, et al. The Goldilocks Principle: Reading Children's Books with Explicit Memory Representations, 2015, ICLR.
[11] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[13] Dirk Weissenborn, et al. Making Neural QA as Simple as Possible but not Simpler, 2017, CoNLL.
[14] Ruslan Salakhutdinov, et al. Gated-Attention Readers for Text Comprehension, 2016, ACL.
[15] Harksoo Kim, et al. GF-Net: Improving machine reading comprehension with feature gates, 2020, Pattern Recognit. Lett.
[16] Xiaodong Liu, et al. Stochastic Answer Networks for Machine Reading Comprehension, 2017, ACL.
[17] Bowen Zhou, et al. End-to-End Answer Chunk Extraction and Ranking for Reading Comprehension, 2016, ArXiv.
[18] Jianfeng Gao, et al. MS MARCO: A Human Generated MAchine Reading COmprehension Dataset, 2018, ArXiv.
[19] Ting Liu, et al. Consensus Attention-based Neural Networks for Chinese Reading Comprehension, 2016, COLING.
[20] Geoffrey E. Hinton, et al. Learning representations by back-propagating errors, 1986, Nature.
[21] Alexander J. Smola, et al. Stacked Attention Networks for Image Question Answering, 2015, CVPR.
[22] Shuohang Wang, et al. Machine Comprehension Using Match-LSTM and Answer Pointer, 2016, ICLR.
[23] Jian Zhang, et al. SQuAD: 100,000+ Questions for Machine Comprehension of Text, 2016, EMNLP.
[24] Shuohang Wang, et al. Learning Natural Language Inference with LSTM, 2015, NAACL.
[25] Deng Cai, et al. MEMEN: Multi-layer Embedding with Memory Networks for Machine Comprehension, 2017, ArXiv.
[26] Rudolf Kadlec, et al. Text Understanding with the Attention Sum Reader Network, 2016, ACL.
[27] Caiquan Xiong, et al. Multiple Attention Networks with Temporal Convolution for Machine Reading Comprehension, 2019, ICEIEC.
[28] Philip Bachman, et al. Natural Language Comprehension with the EpiReader, 2016, EMNLP.
[29] Ali Farhadi, et al. Bidirectional Attention Flow for Machine Comprehension, 2016, ICLR.
[30] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[31] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[32] Phil Blunsom, et al. Teaching Machines to Read and Comprehend, 2015, NIPS.
[33] Ming Zhou, et al. Reinforced Mnemonic Reader for Machine Reading Comprehension, 2017, IJCAI.
[35] Ting Liu, et al. Attention-over-Attention Neural Networks for Reading Comprehension, 2016, ACL.
[36] Eunsol Choi, et al. TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension, 2017, ACL.
[37] Matthew Richardson, et al. MCTest: A Challenge Dataset for the Open-Domain Machine Comprehension of Text, 2013, EMNLP.
[38] David A. McAllester, et al. Who did What: A Large-Scale Person-Centered Cloze Dataset, 2016, EMNLP.
[39] Jeffrey Dean, et al. Distributed Representations of Words and Phrases and their Compositionality, 2013, NIPS.
[40] Dirk Weissenborn, et al. FastQA: A Simple and Efficient Neural Architecture for Question Answering, 2017, ArXiv.
[41] Weijie Liu, et al. Enhancing Machine Reading Comprehension With Position Information, 2019, IEEE Access.
[44] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[45] Quoc V. Le, et al. Sequence to Sequence Learning with Neural Networks, 2014, NIPS.
[46] Gary Marchionini, et al. A study on video browsing strategies, 1997.
[47] Jason Weston, et al. End-To-End Memory Networks, 2015, NIPS.
[48] Wenpeng Yin, et al. Attention-Based Convolutional Neural Network for Machine Comprehension, 2016, ArXiv.
[49] Hui Wang, et al. R-Trans: RNN Transformer Network for Chinese Machine Reading Comprehension, 2019, IEEE Access.
[50] Philip Bachman, et al. Iterative Alternating Neural Attention for Machine Reading, 2016, ArXiv.