[1] Yoshua Bengio,et al. On the Properties of Neural Machine Translation: Encoder–Decoder Approaches , 2014, SSST@EMNLP.
[2] Geoffrey E. Hinton,et al. Transforming Auto-Encoders , 2011, ICANN.
[3] Lukáš Burget,et al. Recurrent neural network based language model , 2010, INTERSPEECH.
[4] Hideki Nakayama,et al. Compressing Word Embeddings via Deep Compositional Code Learning , 2017, ICLR.
[5] Geoffrey E. Hinton,et al. Dynamic Routing Between Capsules , 2017, NIPS.
[6] Jimmy Ba,et al. Adam: A Method for Stochastic Optimization , 2014, ICLR.
[7] Jeffrey L. Elman,et al. Finding Structure in Time , 1990, Cogn. Sci.
[8] Geoffrey E. Hinton,et al. Learning representations by back-propagating errors , 1986, Nature.
[9] Kaiming He,et al. Focal Loss for Dense Object Detection , 2017, 2017 IEEE International Conference on Computer Vision (ICCV).
[10] Nitish Srivastava,et al. Improving neural networks by preventing co-adaptation of feature detectors , 2012, ArXiv.
[11] Kuldip K. Paliwal,et al. Bidirectional recurrent neural networks , 1997, IEEE Trans. Signal Process.
[12] Yoshua Bengio,et al. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling , 2014, ArXiv.
[13] Yann LeCun,et al. Very Deep Convolutional Networks for Text Classification , 2016, EACL.
[14] Jianxin Wu,et al. Minimal gated unit for recurrent neural networks , 2016, International Journal of Automation and Computing.
[15] Bo Huang,et al. A New Method of Region Embedding for Text Classification , 2018, ICLR.
[16] Xiang Zhang,et al. Character-level Convolutional Networks for Text Classification , 2015, NIPS.
[17] Clément Farabet,et al. Torch7: A Matlab-like Environment for Machine Learning , 2011, NIPS.
[19] Jürgen Schmidhuber,et al. Learning to Forget: Continual Prediction with LSTM , 2000, Neural Computation.