Modeling Local Dependence in Natural Language with Multi-channel Recurrent Neural Networks
Gang Wang | Tie-Yan Liu | Hongwei Wang | Chang Xu | Weiran Huang
[1] Quoc V. Le, et al. Sequence to Sequence Learning with Neural Networks, 2014, NIPS.
[2] Maosong Sun, et al. Neural Headline Generation with Sentence-wise Optimization, 2016.
[3] Lukás Burget, et al. Recurrent neural network based language model, 2010, INTERSPEECH.
[4] Joelle Pineau, et al. An Actor-Critic Algorithm for Sequence Prediction, 2016, ICLR.
[5] Yann Dauphin, et al. Convolutional Sequence to Sequence Learning, 2017, ICML.
[6] Chris Dyer, et al. On the State of the Art of Evaluation in Neural Language Models, 2017, ICLR.
[7] Chong Wang, et al. Neural Phrase-based Machine Translation, 2017, ArXiv.
[8] Masaaki Nagata, et al. Cutting-off Redundant Repeating Generations for Neural Abstractive Summarization, 2016, EACL.
[10] Wei Chen, et al. Unsupervised Neural Machine Translation with Weight Sharing, 2018.
[11] Stephen Clark, et al. Jointly learning sentence embeddings and syntax with unsupervised Tree-LSTMs, 2017, Natural Language Engineering.
[12] Richard Socher, et al. Regularizing and Optimizing LSTM Language Models, 2017, ICLR.
[13] Tim Rocktäschel, et al. Frustratingly Short Attention Spans in Neural Language Modeling, 2017, ICLR.
[14] Marcello Federico, et al. Report on the 11th IWSLT evaluation campaign, 2014, IWSLT.
[15] Jing He, et al. A Sequence-to-Sequence Model for User Simulation in Spoken Dialogue Systems, 2016, INTERSPEECH.
[16] Feng Yu, et al. Recurrent Highway Networks With Grouped Auxiliary Memory, 2019, IEEE Access.
[17] Jason Weston, et al. A Neural Attention Model for Abstractive Sentence Summarization, 2015, EMNLP.
[18] Marc'Aurelio Ranzato, et al. Sequence Level Training with Recurrent Neural Networks, 2015, ICLR.
[19] Yoshua Bengio, et al. Neural Machine Translation by Jointly Learning to Align and Translate, 2014, ICLR.
[20] Hakan Inan, et al. Tying Word Vectors and Word Classifiers: A Loss Framework for Language Modeling, 2016, ICLR.
[21] Hongyu Guo, et al. Long Short-Term Memory Over Recursive Structures, 2015, ICML.
[22] Hai Zhao, et al. Attention Is All You Need for Chinese Word Segmentation, 2019, EMNLP.
[23] Alexander M. Rush, et al. Structured Attention Networks, 2017, ICLR.
[24] Christopher D. Manning, et al. Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks, 2015, ACL.
[25] Fei Tian, et al. Recurrent Residual Learning for Sequence Classification, 2016, EMNLP.
[27] Yoshua Bengio, et al. Hierarchical Multiscale Recurrent Neural Networks, 2016, ICLR.
[28] Rico Sennrich, et al. Neural Machine Translation of Rare Words with Subword Units, 2015, ACL.
[29] Rohollah Soltani Bidgoli. Higher Order Recurrent Neural Network for Language Modeling, 2016.
[30] Jürgen Schmidhuber, et al. A Clockwork RNN, 2014, ICML.
[31] Thomas S. Huang, et al. Dilated Recurrent Neural Networks, 2017, NIPS.
[32] Chin-Yew Lin, et al. ROUGE: A Package for Automatic Evaluation of Summaries, 2004, ACL.
[33] Rongrong Ji, et al. Lattice-Based Recurrent Neural Network Encoders for Neural Machine Translation, 2016, AAAI.
[34] Salim Roukos, et al. Bleu: a Method for Automatic Evaluation of Machine Translation, 2002, ACL.
[35] Kilian Q. Weinberger, et al. Densely Connected Convolutional Networks, 2017, CVPR.
[36] Quoc V. Le, et al. Neural Architecture Search with Reinforcement Learning, 2016, ICLR.