A Multi-View Abstractive Summarization Model Jointly Considering Semantics and Sentiment
[1] Hang Li,et al. A Deep Memory-based Architecture for Sequence-to-Sequence Learning , 2015 .
[2] Julien Perez,et al. Gated End-to-End Memory Networks , 2016, EACL.
[3] Quoc V. Le,et al. Addressing the Rare Word Problem in Neural Machine Translation , 2014, ACL.
[4] Zhiyuan Liu,et al. A C-LSTM Neural Network for Text Classification , 2015, ArXiv.
[5] Jeffrey Dean,et al. Distributed Representations of Words and Phrases and their Compositionality , 2013, NIPS.
[6] Jason Weston,et al. A Neural Attention Model for Abstractive Sentence Summarization , 2015, EMNLP.
[7] Jason Weston,et al. Key-Value Memory Networks for Directly Reading Documents , 2016, EMNLP.
[8] Bowen Zhou,et al. Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond , 2016, CoNLL.
[9] Yoshua Bengio,et al. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation , 2014, EMNLP.
[10] Udo Kruschwitz,et al. MultiLing 2015: Multilingual Summarization of Single and Multi-Documents, On-line Fora, and Call-center Conversations , 2015, SIGDIAL Conference.
[11] Chin-Yew Lin,et al. ROUGE: A Package for Automatic Evaluation of Summaries , 2004, ACL.
[12] Yoshua Bengio,et al. Neural Machine Translation by Jointly Learning to Align and Translate , 2014, ICLR.
[13] Wei Heng,et al. CIST System Report for ACL MultiLing 2013 – Track 1: Multilingual Multi-document Summarization , 2013 .
[14] Kuldip K. Paliwal,et al. Bidirectional recurrent neural networks , 1997, IEEE Trans. Signal Process..
[15] Matthew D. Zeiler. ADADELTA: An Adaptive Learning Rate Method , 2012, ArXiv.