Weighted and probabilistic context-free grammars are equally expressive

Noah A. Smith*, Mark Johnson

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

This article studies the relationship between weighted context-free grammars (WCFGs), where each production is associated with a positive real-valued weight, and probabilistic context-free grammars (PCFGs), where the weights of the productions associated with a nonterminal are constrained to sum to one. Because the class of WCFGs properly includes the PCFGs, one might expect that WCFGs can describe distributions that PCFGs cannot. However, Z. Chi (1999, Computational Linguistics, 25(1):131-160) and S. P. Abney, D. A. McAllester, and F. Pereira (1999, In Proceedings of the 37th Annual Meeting of the Association for Computational Linguistics, pages 542-549, College Park, MD) proved that every WCFG distribution is equivalent to some PCFG distribution. We extend their results to conditional distributions, and show that every WCFG conditional distribution of parses given strings is also the conditional distribution defined by some PCFG, even when the WCFG's partition function diverges. This shows that any parsing or labeling accuracy improvement from conditional estimation of WCFGs or conditional random fields (CRFs) over joint estimation of PCFGs or hidden Markov models (HMMs) is due to the estimation procedure rather than the change in model class, because PCFGs and HMMs are exactly as expressive as WCFGs and chain-structured CRFs, respectively.
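The abstract refers to Chi's renormalization result, under which a WCFG's production weights can be rescaled into PCFG probabilities without changing the distribution over parses. Below is a minimal sketch of that construction for a toy, non-recursive grammar; the grammar, its weights, and the hard-coded parse enumeration are illustrative assumptions, not taken from the article.

```python
# Sketch: renormalizing a toy WCFG into a PCFG that induces the same
# conditional distribution of parses given strings (Chi-style rescaling).

# Toy WCFG: production -> positive weight. The string "a b" is ambiguous
# (two parses), which makes the conditional distribution non-trivial.
wcfg = {
    ("S", ("A", "B")): 2.0,
    ("S", ("C", "D")): 0.5,
    ("A", ("a",)):     3.0,
    ("B", ("b",)):     5.0,
    ("C", ("a",)):     4.0,
    ("D", ("b",)):     7.0,
}

def partition(nt, weights):
    """Total weight Z(nt) of all trees rooted at nt (finite here: no recursion)."""
    total = 0.0
    for (lhs, rhs), w in weights.items():
        if lhs != nt:
            continue
        z = w
        for sym in rhs:
            if sym.isupper():          # nonterminal symbol
                z *= partition(sym, weights)
        total += z
    return total

def renormalize(weights):
    """Rescale weights: p(A -> alpha) = w(A -> alpha) * prod_i Z(B_i) / Z(A)."""
    pcfg = {}
    for (lhs, rhs), w in weights.items():
        z = w
        for sym in rhs:
            if sym.isupper():
                z *= partition(sym, weights)
        pcfg[(lhs, rhs)] = z / partition(lhs, weights)
    return pcfg

# The two parses of "a b", written as the bags of productions they use.
parses = [
    [("S", ("A", "B")), ("A", ("a",)), ("B", ("b",))],
    [("S", ("C", "D")), ("C", ("a",)), ("D", ("b",))],
]

def tree_score(tree, table):
    score = 1.0
    for prod in tree:
        score *= table[prod]
    return score

pcfg = renormalize(wcfg)
w_scores = [tree_score(t, wcfg) for t in parses]
p_scores = [tree_score(t, pcfg) for t in parses]

# Conditional distributions over the two parses of "a b" coincide.
print([s / sum(w_scores) for s in w_scores])
print([s / sum(p_scores) for s in p_scores])
```

For recursive grammars the partition function must be obtained by solving a fixed point and may diverge; the divergent case is exactly the one the article's conditional-distribution result covers.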

Original language: English
Pages (from-to): 477-491
Number of pages: 15
Journal: Computational Linguistics
Volume: 33
Issue number: 4
DOIs
Publication status: Published - Dec 2007
Externally published: Yes
