ITNLP-ARC at SemEval-2018 Task 12: Argument Reasoning Comprehension with Attention