Attention-Fused Deep Matching Network for Natural Language Inference
Tiejun Zhao | Conghui Zhu | Furu Wei | Lei Cui | Xinchi Chen | Chaoqun Duan
[1] Rui Yan, et al. Natural Language Inference by Tree-Based Convolution and Heuristic Matching, 2015, ACL.
[2] Xuanjing Huang, et al. Deep Fusion LSTMs for Text Semantic Matching, 2016, ACL.
[3] Ting Liu, et al. Attention-over-Attention Neural Networks for Reading Comprehension, 2016, ACL.
[4] Zhen-Hua Ling, et al. Enhanced LSTM for Natural Language Inference, 2016, ACL.
[5] Christopher D. Manning, et al. Natural language inference, 2009.
[6] Andrew Zisserman, et al. Very Deep Convolutional Networks for Large-Scale Image Recognition, 2014, ICLR.
[7] Zhifang Sui, et al. Reading and Thinking: Re-read LSTM Unit for Textual Entailment Recognition, 2016, COLING.
[8] Bowen Zhou, et al. A Structured Self-attentive Sentence Embedding, 2017, ICLR.
[9] George Kurian, et al. Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation, 2016, ArXiv.
[10] Shuohang Wang, et al. Learning Natural Language Inference with LSTM, 2015, NAACL.
[11] Christopher Potts, et al. A Fast Unified Model for Parsing and Sentence Understanding, 2016, ACL.
[12] Jakob Uszkoreit, et al. A Decomposable Attention Model for Natural Language Inference, 2016, EMNLP.
[13] Phil Blunsom, et al. Reasoning about Entailment with Neural Attention, 2015, ICLR.
[14] M. V. Rossum, et al. In Neural Computation, 2022.
[15] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[16] Nitish Srivastava, et al. Dropout: a simple way to prevent neural networks from overfitting, 2014, J. Mach. Learn. Res.
[17] Zhen-Hua Ling, et al. Recurrent Neural Network-Based Sentence Encoder with Gated Attention for Natural Language Inference, 2017, RepEval@EMNLP.
[18] Samuel R. Bowman, et al. A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference, 2017, NAACL.
[19] Dumitru Erhan, et al. Going deeper with convolutions, 2015, CVPR.
[20] Ming Zhou, et al. Gated Self-Matching Networks for Reading Comprehension and Question Answering, 2017, ACL.
[21] Yoshua Bengio, et al. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling, 2014, ArXiv.
[22] Noah A. Smith, et al. Tree Edit Models for Recognizing Textual Entailments, Paraphrases, and Answers to Questions, 2010, NAACL.
[23] Hong Yu, et al. Neural Tree Indexers for Text Understanding, 2016, EACL.
[24] Yang Liu, et al. Learning Natural Language Inference using Bidirectional LSTM model and Inner-Attention, 2016, ArXiv.
[25] Christopher Potts, et al. A large annotated corpus for learning natural language inference, 2015, EMNLP.
[26] Bowen Zhou, et al. LSTM-based Deep Learning Models for non-factoid answer selection, 2015, ArXiv.
[27] Zhiguo Wang, et al. Bilateral Multi-Perspective Matching for Natural Language Sentences, 2017, IJCAI.