Learning OT constraint rankings using a maximum entropy model

Sharon Goldwater, Mark Johnson

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review

Abstract

A weakness of standard Optimality Theory is its inability to account for grammars with free variation. We describe here the Maximum Entropy model, a general statistical model, and show how it can be applied in a constraint-based linguistic framework to model and learn grammars with free variation, as well as categorical grammars. We report the results of using the MaxEnt model for learning two different grammars: one with variation, and one without. Our results are as good as those of a previous probabilistic version of OT, the Gradual Learning Algorithm (Boersma, 1997), and we argue that our model is more general and mathematically well-motivated.
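The core idea of a MaxEnt (log-linear) grammar can be sketched briefly: each candidate output receives a score based on its weighted constraint violations, and scores are normalised into a probability distribution, so sub-optimal candidates retain nonzero probability and free variation falls out naturally. The sketch below is illustrative only; the constraint names, violation counts, and weights are hypothetical and not taken from the paper.

```python
import math

def maxent_probs(candidates, weights):
    """Assign each candidate a probability under a log-linear model.

    candidates: dict mapping candidate -> {constraint: violation count}
    weights: dict mapping constraint -> non-negative weight
    """
    # Score = exp(-(weighted sum of violations)); more violations,
    # or more heavily weighted ones, mean a lower score.
    scores = {
        cand: math.exp(-sum(weights[c] * v for c, v in viols.items()))
        for cand, viols in candidates.items()
    }
    z = sum(scores.values())  # normalising constant
    return {cand: s / z for cand, s in scores.items()}

# Two hypothetical output candidates for a single input, each with
# violation counts for two hypothetical constraints.
candidates = {
    "cand_a": {"Onset": 0, "NoCoda": 1},
    "cand_b": {"Onset": 1, "NoCoda": 0},
}
weights = {"Onset": 3.0, "NoCoda": 1.0}
probs = maxent_probs(candidates, weights)
```

Here `cand_a` violates only the lightly weighted constraint, so it takes most of the probability mass, but `cand_b` keeps a nonzero share; with near-equal weights the two candidates would vary freely, while very large weight differences approximate a categorical (strict-ranking) grammar.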
Original language: English
Title of host publication: Proceedings of the Stockholm Workshop on Variation within Optimality Theory
Editors: Jennifer Spenader, Anders Eriksson, Östen Dahl
Place of Publication: Stockholm
Publisher: Stockholm University
Pages: 111-120
Number of pages: 10
Publication status: Published - 2003
Externally published: Yes
Event: Stockholm Workshop on Variation within Optimality Theory - Stockholm University, Stockholm, Sweden
Duration: 26 Apr 2003 – 27 Apr 2003

Workshop

Workshop: Stockholm Workshop on Variation within Optimality Theory
Country/Territory: Sweden
City: Stockholm
Period: 26/04/03 – 27/04/03
