A novel algorithm for estimating the parameters of a hidden stochastic context-free grammar is presented. In contrast to the inside/outside (I/O) algorithm, it does not require the grammar to be expressed in Chomsky normal form, and can therefore operate directly on more natural representations of a grammar. The algorithm uses a trellis-based structure rather than the binary branching tree structure used by the I/O algorithm. The form of the trellis is an extension of that used by the forward/backward (F/B) algorithm, so the algorithm reduces to the latter for components that can be modeled as finite-state networks. In the same way that a hidden Markov model (HMM) is a stochastic analog of a finite-state network, the representation used by the algorithm is a stochastic analog of a recursive transition network, in which a state may be simple or may itself contain an underlying structure.
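The recursive-transition-network representation described above can be illustrated with a minimal sketch. All names here (`State`, `Network`, `is_finite_state`, the "NP"/"S" networks) are hypothetical and chosen for illustration; they are not from the paper. The key idea shown is that a state is either simple (emitting a terminal) or a reference to a subnetwork, and that a network containing no subnetwork references is an ordinary finite-state component, for which training reduces to the plain forward/backward algorithm.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class State:
    """A state is 'simple' (subnetwork is None) or refers to another network."""
    name: str
    subnetwork: Optional[str] = None

@dataclass
class Network:
    """One network of the recursive transition network (RTN)."""
    name: str
    states: List[State] = field(default_factory=list)
    # transitions[i][j] = probability of moving from state i to state j
    transitions: Dict[int, Dict[int, float]] = field(default_factory=dict)

# A grammar is a collection of named networks; a state that names a
# subnetwork invokes it recursively, so the grammar needs no conversion
# to Chomsky normal form.  (Hypothetical toy grammar for illustration.)
grammar: Dict[str, Network] = {
    "NP": Network("NP", [State("det"), State("noun")], {0: {1: 1.0}}),
    "S":  Network("S",  [State("np", subnetwork="NP"), State("verb")],
                  {0: {1: 1.0}}),
}

def is_finite_state(net: Network) -> bool:
    """A network with no subnetwork references is a finite-state component;
    its parameters can be estimated with forward/backward alone."""
    return all(s.subnetwork is None for s in net.states)
```

Under this representation, the trellis-based estimation procedure would treat `grammar["NP"]` exactly as the F/B algorithm treats an HMM, while `grammar["S"]` requires the extended trellis to account for the recursive invocation of `NP`.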