Shiliang Zhang | Li-Rong Dai | Hui Jiang | Si Wei
[1] Shiliang Zhang, et al. The Fixed-Size Ordinally-Forgetting Encoding Method for Neural Network Language Models, 2015, ACL.
[2] Jason Weston, et al. Large-scale Simple Question Answering with Memory Networks, 2015, arXiv.
[3] Yoshua Bengio, et al. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling, 2014, arXiv.
[4] Jürgen Schmidhuber, et al. Long Short-Term Memory, 1997, Neural Computation.
[5] Jason Weston, et al. Memory Networks, 2014, ICLR.
[6] Yoshua Bengio, et al. Learning long-term dependencies with gradient descent is difficult, 1994, IEEE Trans. Neural Networks.
[7] Lukáš Burget, et al. Recurrent neural network based language model, 2010, INTERSPEECH.
[8] Tomas Mikolov, et al. Inferring Algorithmic Patterns with Stack-Augmented Recurrent Nets, 2015, NIPS.
[9] Alex Graves, et al. Neural Turing Machines, 2014, arXiv.
[10] Jason Weston, et al. End-To-End Memory Networks, 2015, NIPS.
[11] Lukáš Burget, et al. Extensions of recurrent neural network language model, 2011, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[12] Marc'Aurelio Ranzato, et al. Learning Longer Memory in Recurrent Neural Networks, 2014, ICLR.
[13] Yoshua Bengio, et al. Understanding the difficulty of training deep feedforward neural networks, 2010, AISTATS.
[14] Alex Graves, et al. Generating Sequences With Recurrent Neural Networks, 2013, arXiv.