Refining Raw Sentence Representations for Textual Entailment Recognition via Attention
[1] Christopher D. Manning, et al. Effective Approaches to Attention-based Neural Machine Translation, 2015, EMNLP.
[2] Yang Liu, et al. Learning Natural Language Inference using Bidirectional LSTM model and Inner-Attention, 2016, ArXiv.
[3] J. van Benthem. A brief history of natural logic, 2008.
[4] Bowen Zhou, et al. A Structured Self-attentive Sentence Embedding, 2017, ICLR.
[5] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[6] Christopher D. Manning, et al. Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks, 2015, ACL.
[7] Zhiguo Wang, et al. Bilateral Multi-Perspective Matching for Natural Language Sentences, 2017, IJCAI.
[8] Christopher Potts, et al. A large annotated corpus for learning natural language inference, 2015, EMNLP.
[9] Samuel R. Bowman, et al. A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference, 2017, NAACL.
[10] Geoffrey E. Hinton, et al. Grammar as a Foreign Language, 2014, NIPS.
[11] Zhen-Hua Ling, et al. Enhanced LSTM for Natural Language Inference, 2016, ACL.
[12] Richard Socher, et al. Dynamic Coattention Networks For Question Answering, 2016, ICLR.
[13] Zhi Jin, et al. Discriminative Neural Sentence Modeling by Tree-Based Convolution, 2015, EMNLP.
[14] Jürgen Schmidhuber, et al. Long Short-Term Memory, 1997, Neural Computation.