Using TAGs, a Tree Model, and a Language Model for Generation

Srinivas Bangalore and Owen Rambow
AT&T Labs-Research, B233, 180 Park Ave, PO Box 971, Florham Park, NJ 07932-0971, USA
srini, rambow@research.att.com

Previous stochastic approaches to sentence realization do not include a tree-based representation of syntax. While this may be adequate or even advantageous for some applications, other applications profit from using as much syntactic knowledge as is available, leaving to a stochastic model only those issues that are not determined by the grammar. In this paper, we present three results in the context of surface realization: a stochastic tree model derived from a parsed corpus outperforms a tree model derived from an unannotated corpus; exploiting a hand-crafted grammar in conjunction with a tree model outperforms a tree model without a grammar; and exploiting a tree model in conjunction with a linear language model outperforms the tree model alone.