Abstract
A weakness of standard Optimality Theory is its inability to account for grammars with free variation. We describe here the Maximum Entropy model, a general statistical model, and show how it can be applied in a constraint-based linguistic framework to model and learn grammars with free variation, as well as categorical grammars. We report the results of using the MaxEnt model for learning two different grammars: one with variation, and one without. Our results are as good as those of a previous probabilistic version of OT, the Gradual Learning Algorithm (Boersma, 1997), and we argue that our model is more general and mathematically well-motivated.
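The core of the Maximum Entropy approach described above is that each constraint receives a real-valued weight rather than a strict rank, and a candidate's probability is proportional to the exponential of its negative weighted violation count. The sketch below illustrates this with a hypothetical two-candidate, two-constraint tableau; the violation counts and weights are invented for illustration and do not come from the paper.

```python
import math

def candidate_probs(violations, weights):
    """MaxEnt grammar sketch: P(candidate) is proportional to
    exp(-sum_i w_i * v_i), where v_i is the candidate's violation
    count for constraint i and w_i is that constraint's weight."""
    scores = [math.exp(-sum(w * v for w, v in zip(weights, vs)))
              for vs in violations]
    z = sum(scores)  # normalizing constant over the candidate set
    return [s / z for s in scores]

# Hypothetical tableau: candidate 0 violates constraint A once;
# candidate 1 violates constraint B twice.
violations = [[1, 0], [0, 2]]
weights = [3.0, 1.0]  # assumed weights, not learned values
probs = candidate_probs(violations, weights)
```

Because the weights are continuous, unequal penalties yield a non-degenerate distribution over candidates, which is how the model captures free variation; making one weight much larger than the others recovers near-categorical behavior.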
Original language | English |
---|---|
Title of host publication | Proceedings of the Stockholm Workshop on Variation within Optimality Theory |
Editors | Jennifer Spenader, Anders Eriksson, Östen Dahl |
Place of Publication | Stockholm |
Publisher | Stockholm University |
Pages | 111-120 |
Number of pages | 10 |
Publication status | Published - 2003 |
Externally published | Yes |
Event | Stockholm Workshop on Variation within Optimality Theory, Stockholm University, Stockholm, Sweden |
Duration | 26 Apr 2003 → 27 Apr 2003 |
Workshop
Workshop | Stockholm Workshop on Variation within Optimality Theory |
---|---|
Country/Territory | Sweden |
City | Stockholm |
Period | 26/04/03 → 27/04/03 |