A novel neural network model for joint POS tagging and graph-based dependency parsing

Dat Quoc Nguyen, Mark Dras, Mark Johnson

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review

26 Citations (Scopus)

Abstract

We present a novel neural network model that learns POS tagging and graph-based dependency parsing jointly. Our model uses bidirectional LSTMs to learn feature representations shared between the POS tagging and dependency parsing tasks, thus avoiding the need for manual feature engineering. Extensive experiments on 19 languages from the Universal Dependencies project show that our model outperforms the state-of-the-art neural network-based Stack-propagation model for joint POS tagging and transition-based dependency parsing, establishing a new state of the art. Our code is open-source and available together with pre-trained models at: https://github.com/datquocnguyen/jPTDP
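To make the architecture concrete, the following is a minimal sketch of the idea the abstract describes: a single shared BiLSTM encoder whose hidden states feed both a POS-tagging head and an arc-factored (graph-based) dependency scorer. This is an illustration in PyTorch, not the released jPTDP implementation, and all names (JointTaggerParser, arc_dim, the additive arc scorer) are hypothetical choices for the sketch.

import torch
import torch.nn as nn

class JointTaggerParser(nn.Module):
    """Illustrative sketch only (not the authors' jPTDP code): a shared
    BiLSTM encoder with two task heads, a POS tagger and an arc-factored
    dependency scorer."""

    def __init__(self, vocab_size, n_tags, emb_dim=100, hidden_dim=128, arc_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Shared bidirectional LSTM: one learned feature extractor serving
        # both tasks, in place of hand-engineered features.
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Task 1: POS-tagging head over each token's BiLSTM state.
        self.tag_out = nn.Linear(2 * hidden_dim, n_tags)
        # Task 2: graph-based parsing head; score every head-dependent pair.
        self.head_mlp = nn.Linear(2 * hidden_dim, arc_dim)
        self.dep_mlp = nn.Linear(2 * hidden_dim, arc_dim)
        self.arc_out = nn.Linear(arc_dim, 1)

    def forward(self, word_ids):
        h, _ = self.bilstm(self.embed(word_ids))        # (B, T, 2H)
        tag_scores = self.tag_out(h)                    # (B, T, n_tags)
        # Arc score for (head i, dependent j) via a simple additive MLP.
        heads = self.head_mlp(h).unsqueeze(2)           # (B, T, 1, A)
        deps = self.dep_mlp(h).unsqueeze(1)             # (B, 1, T, A)
        arc_scores = self.arc_out(torch.tanh(heads + deps)).squeeze(-1)  # (B, T, T)
        return tag_scores, arc_scores

# Toy usage: 2 sentences of 7 tokens; 17 tags as in the UD universal POS set.
words = torch.randint(0, 1000, (2, 7))
tags, arcs = JointTaggerParser(vocab_size=1000, n_tags=17)(words)
print(tags.shape, arcs.shape)   # torch.Size([2, 7, 17]) torch.Size([2, 7, 7])

Joint training would sum the tagging and parsing losses over the shared encoder; decoding a well-formed tree from the arc scores requires a maximum-spanning-tree algorithm such as Chu-Liu/Edmonds.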
Original language: English
Title of host publication: Proceedings of the CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies
Place of publication: Stroudsburg, PA
Publisher: Association for Computational Linguistics (ACL)
Pages: 134-142
Number of pages: 9
ISBN (Print): 9781945626708
DOIs
Publication status: Published - 2017
Event: The SIGNLL Conference on Computational Natural Language Learning - Vancouver
Duration: 3 Aug 2017 - 4 Aug 2017

Conference

Conference: The SIGNLL Conference on Computational Natural Language Learning
City: Vancouver
Period: 3/08/17 - 4/08/17

Bibliographical note

Copyright the Publisher. Version archived for private and non-commercial use with the permission of the author/s and according to publisher conditions. For further rights please contact the publisher.

Keywords

  • neural network
  • POS tagging
  • dependency parsing
  • bidirectional LSTM
  • Universal Dependencies
  • multilingual parsing
