Connectionist language modeling for large vocabulary continuous speech recognition

This paper describes ongoing work on a new approach to language modeling for large vocabulary continuous speech recognition. Almost all state-of-the-art systems use statistical n-gram language models estimated on text corpora. One principal problem with such language models is that many n-grams are never observed, even in very large training corpora, so it is common to back off to a lower-order model. In this paper we propose to address this problem by carrying out the estimation task in a continuous space, enabling a smooth interpolation of the probabilities. A neural network is used to learn the projection of the words onto a continuous space and to estimate the n-gram probabilities. The connectionist language model is being evaluated on the DARPA HUB5 conversational telephone speech recognition task, and preliminary results show consistent improvements in both perplexity and word error rate.
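
To make the architecture concrete, the following is a minimal sketch of a feedforward neural n-gram language model in this spirit: a shared projection layer maps each history word into a continuous space, and a softmax output layer estimates the probability of the next word over the full vocabulary. This is an illustrative PyTorch sketch, not the paper's implementation; all layer names and sizes (embedding dimension, hidden dimension, context length) are assumptions chosen for clarity.

import torch
import torch.nn as nn

class NeuralNgramLM(nn.Module):
    # Sketch of a continuous-space n-gram LM: the n-1 history words are
    # projected into a continuous space by a shared embedding, and a
    # softmax layer estimates P(w_t | w_{t-n+1}, ..., w_{t-1}).
    def __init__(self, vocab_size, context_size=3, embed_dim=50, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)           # continuous word space
        self.hidden = nn.Linear(context_size * embed_dim, hidden_dim)
        self.output = nn.Linear(hidden_dim, vocab_size)            # one score per vocabulary word

    def forward(self, context):
        # context: (batch, context_size) tensor of word indices
        e = self.embed(context).flatten(1)                         # concatenate the projected history words
        h = torch.tanh(self.hidden(e))
        return torch.log_softmax(self.output(h), dim=-1)           # log P(w | history) for every word

# Usage: a 4-gram model (3-word history) over a toy 10,000-word vocabulary.
model = NeuralNgramLM(vocab_size=10_000)
history = torch.randint(0, 10_000, (8, 3))                         # a batch of 8 histories
log_probs = model(history)                                         # shape (8, 10000)

Because every history is first mapped into a continuous space, histories that were never seen in training still receive smooth probability estimates from nearby points in that space, rather than triggering a back-off to a lower-order model.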
