Heterogeneous recurrent neural networks for natural language modeling

A neural network for language modeling is proposed and its performance is explored. The proposed network consists of two recurrent networks whose structures differ from each other. Both networks accept words as input, map them to distributed representations, and produce the probability of each word occurring next given the sequence of input words. The performance of the proposed network is investigated by constructing language models and comparing it with a single recurrent neural network and a long short-term memory network.
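
As a concrete illustration of the architecture described above, the sketch below combines two structurally different recurrent branches over a shared word embedding and merges their outputs into a next-word distribution. The abstract does not specify the branch types, the combination rule, or any dimensions; the vanilla-RNN/LSTM pairing, the concatenation, and all sizes here are assumptions for illustration, written in PyTorch.

```python
import torch
import torch.nn as nn

class HeterogeneousRNNLM(nn.Module):
    """Language model with two structurally different recurrent branches
    over a shared embedding; their hidden states are combined to predict
    the next-word distribution. Branch choices are assumptions."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)  # distributed representation
        # Hypothetical branch choices: a vanilla (Elman) RNN and an LSTM.
        self.branch_a = nn.RNN(embed_dim, hidden_dim, batch_first=True)
        self.branch_b = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Combine both branches' hidden states into next-word logits.
        self.out = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, word_ids):
        x = self.embed(word_ids)            # (batch, seq, embed_dim)
        h_a, _ = self.branch_a(x)           # (batch, seq, hidden_dim)
        h_b, _ = self.branch_b(x)           # (batch, seq, hidden_dim)
        h = torch.cat([h_a, h_b], dim=-1)   # concatenate the two branches
        logits = self.out(h)                # (batch, seq, vocab_size)
        return torch.log_softmax(logits, dim=-1)  # next-word log-probabilities

# Usage: next-word log-probabilities for a toy batch of token ids.
model = HeterogeneousRNNLM(vocab_size=10000)
tokens = torch.randint(0, 10000, (4, 20))  # 4 sequences of length 20
log_probs = model(tokens)                  # shape (4, 20, 10000)
```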