An Application of the Variational Bayesian Approach to Probabilistic Context-Free Grammars
We present an efficient learning algorithm for probabilistic context-free grammars based on the variational Bayesian approach. Although the maximum likelihood method has traditionally been used for learning probabilistic language models, Bayesian learning is, in principle, less prone to overfitting than the maximum likelihood method. We show that the computational complexity of our algorithm is equal to that of the Inside-Outside algorithm. We also report experimental results comparing the accuracy of the Inside-Outside algorithm and our algorithm.
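The abstract does not spell out the update equations, but in variational Bayesian learning with a Dirichlet prior over each nonterminal's rule probabilities, the maximum-likelihood M-step (normalized expected rule counts) is typically replaced by a digamma-based update that yields subnormalized rule weights. The sketch below is an illustrative reconstruction of that single step, not the paper's implementation: the symmetric prior `alpha`, the `digamma` helper, and the example counts are all assumptions for the sake of the example.

```python
import math


def digamma(x):
    """Digamma function psi(x) via the recurrence psi(x) = psi(x+1) - 1/x
    and an asymptotic series for large arguments (standard numerical trick;
    used here to keep the sketch dependency-free)."""
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - f * (1.0 / 12 - f * (1.0 / 120 - f / 252))


def vb_rule_weights(expected_counts, alpha=1.0):
    """One variational M-step for the rules of a single nonterminal.

    expected_counts: rule -> expected count from the E-step (inside-outside
    pass); alpha: symmetric Dirichlet hyperparameter (an assumed default).
    Returns exp(psi(alpha + c_r) - psi(sum_r (alpha + c_r))), the geometric
    mean of the rule probability under the variational posterior. Unlike the
    ML estimate c_r / sum_r c_r, these weights sum to less than 1.
    """
    total = sum(alpha + c for c in expected_counts.values())
    return {
        rule: math.exp(digamma(alpha + c) - digamma(total))
        for rule, c in expected_counts.items()
    }


# Hypothetical expected counts for the two rules of one nonterminal A.
weights = vb_rule_weights({"A -> B C": 10.0, "A -> a": 2.0}, alpha=1.0)
```

Because the update only replaces the normalization in the M-step, each iteration still costs one inside-outside pass, consistent with the abstract's claim that the overall complexity matches the Inside-Outside algorithm.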