Heterogeneous recurrent neural networks for natural language model
A heterogeneous neural network for language modeling is proposed and its performance is explored. The proposed network consists of two recurrent networks whose structures differ from each other. Both networks accept words as input, map them to distributed representations, and produce the probability of the next word given the sequence of input words. The performance of the proposed network is investigated by constructing language models and comparing it with a single recurrent neural network and a long short-term memory network.
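A minimal sketch of such a heterogeneous recurrent language model is given below. The abstract does not specify which two recurrent structures are combined or how their outputs are merged, so this sketch assumes, purely for illustration, a vanilla RNN branch and a GRU branch whose hidden states are concatenated before a softmax over the vocabulary; the class name, layer sizes, and combination scheme are assumptions, not the authors' method.

```python
# Illustrative sketch only: two recurrent branches with different structures
# (here an RNN and a GRU, an assumed choice) over a shared word embedding,
# combined to predict next-word probabilities.
import torch
import torch.nn as nn

class HeterogeneousRNNLM(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        # Shared embedding: maps word indices to distributed representations.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Two recurrent branches with different structures (illustrative choice).
        self.rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # Combine both branches and project to the vocabulary.
        self.out = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, word_ids):
        # word_ids: (batch, seq_len) integer word indices.
        x = self.embed(word_ids)
        h_rnn, _ = self.rnn(x)           # (batch, seq_len, hidden_dim)
        h_gru, _ = self.gru(x)           # (batch, seq_len, hidden_dim)
        h = torch.cat([h_rnn, h_gru], dim=-1)
        logits = self.out(h)             # (batch, seq_len, vocab_size)
        return torch.log_softmax(logits, dim=-1)

# Usage: log-probabilities of the next word at every position.
model = HeterogeneousRNNLM(vocab_size=10000)
batch = torch.randint(0, 10000, (8, 20))   # 8 sequences of 20 word indices
log_probs = model(batch)                    # shape (8, 20, 10000)
```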