Using left-corner parsing to encode universal structural constraints in grammar induction

Hiroshi Noji, Yusuke Miyao, Mark Johnson

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review

13 Citations (Scopus)

Abstract

Center-embedding is difficult to process and is known to be a rare syntactic construction across languages. In this paper we describe a method for incorporating this assumption into grammar induction by restricting the search space of a model to trees with limited center-embedding. The key idea is the tabulation of left-corner parsing, which captures the degree of center-embedding of a parse via its stack depth. We apply the technique to learning of a well-known generative model, the dependency model with valence (Klein and Manning, 2004). Cross-linguistic experiments on Universal Dependencies show that our method often boosts performance over the baseline and competes with the current state-of-the-art model in a number of languages.
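The abstract's key idea — that the stack depth of a left-corner parser measures the degree of center-embedding — can be illustrated with a small sketch. The paper works over dependency trees and the DMV; the code below is instead a simplified oracle simulation of an arc-eager left-corner parser (shift / predict / compose / scan) over binary constituency trees, with all names our own. Left- and right-branching trees need only constant stack depth, while each degree of center-embedding adds one stack item.

```python
class Node:
    """Binary tree node; leaves carry a word label."""
    def __init__(self, label=None, left=None, right=None):
        self.label, self.left, self.right = label, left, right
        self.parent, self.is_left = None, False

def build(t):
    """Build a Node tree from nested pairs, e.g. ('a', ('b', 'c'))."""
    if isinstance(t, str):
        return Node(label=t)
    n = Node(left=build(t[0]), right=build(t[1]))
    n.left.parent, n.left.is_left = n, True
    n.right.parent = n
    return n

def leaves(n):
    if n.label is not None:
        yield n
    else:
        yield from leaves(n.left)
        yield from leaves(n.right)

def lc_max_depth(tree):
    """Maximum stack depth when parsing `tree` with an arc-eager
    left-corner parser, using the gold tree as an oracle.
    Stack items: ('comp', node) for a completed constituent, or
    ('inc', top, awaited) for constituent `top` awaiting `awaited`."""
    root, stack, depth = build(tree), [], 0
    for w in leaves(root):
        stack.append(('comp', w))                       # shift
        depth = max(depth, len(stack))
        while stack[-1][0] == 'comp':
            x = stack[-1][1]
            below = stack[-2] if len(stack) > 1 else None
            if below and below[0] == 'inc' and below[2] is x:
                stack.pop(); stack.pop()                # scan: x finishes
                stack.append(('comp', below[1]))        # the awaited item
            elif x.parent is not None and x.is_left:
                p = x.parent
                if below and below[0] == 'inc' and below[2] is p:
                    stack.pop()                         # compose: x is the
                    stack[-1] = ('inc', below[1], p.right)  # left corner of
                    break                               # the awaited item
                stack[-1] = ('inc', p, p.right)         # predict the parent
                break
            else:
                break                                   # root is complete
    return depth
```

For example, the fully right-branching tree `('a', ('b', ('c', 'd')))` and the fully left-branching tree `((('a', 'b'), 'c'), 'd')` both parse with depth 2, whereas the singly center-embedded `('a', (('b', 'c'), 'd'))` needs depth 3 — bounding the stack depth is exactly what prunes heavily center-embedded analyses from the search space.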

Original language: English
Title of host publication: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing
Publisher: Association for Computational Linguistics (ACL)
Pages: 33-43
Number of pages: 11
ISBN (Electronic): 9781945626258
DOIs
Publication status: Published - 1 Jan 2016
Event: 2016 Conference on Empirical Methods in Natural Language Processing, EMNLP 2016 - Austin, United States
Duration: 1 Nov 2016 – 5 Nov 2016

Publication series

Name: EMNLP 2016 - Conference on Empirical Methods in Natural Language Processing, Proceedings

Conference

Conference: 2016 Conference on Empirical Methods in Natural Language Processing, EMNLP 2016
Country: United States
City: Austin
Period: 1/11/16 – 5/11/16
