Neighborhood mixture model for knowledge base completion

Dat Quoc Nguyen, Kairit Sirts, Lizhen Qu, Mark Johnson

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review



Knowledge bases are useful resources for many natural language processing tasks; however, they are far from complete. In this paper, we define a novel entity representation as a mixture of its neighborhood in the knowledge base and apply this technique to TransE, a well-known embedding model for knowledge base completion. Experimental results show that the neighborhood information significantly improves the results of TransE, leading to better performance than other state-of-the-art embedding models on three benchmark datasets for triple classification, entity prediction and relation prediction tasks.
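The abstract refers to TransE scoring and a neighborhood-mixture entity representation. A minimal illustrative sketch (toy random vectors and hypothetical mixture weights, not the paper's trained model or exact formulation) might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4

# Toy embeddings for a few entities and relations (values are illustrative only).
entity_vecs = {e: rng.normal(size=dim) for e in ["paris", "france", "berlin"]}
relation_vecs = {r: rng.normal(size=dim) for r in ["capital_of"]}

def transe_score(h, r, t):
    """TransE plausibility score: the L1 distance ||v_h + v_r - v_t||.
    Lower scores indicate a more plausible triple (h, r, t)."""
    return np.abs(entity_vecs[h] + relation_vecs[r] - entity_vecs[t]).sum()

def mixture_embedding(e, neighbors, weights):
    """Hypothetical neighborhood-mixture representation: a weighted
    combination of the entity's own vector and its neighbors, where each
    neighboring (entity, relation) pair contributes v_{e'} + v_r."""
    vec = weights[0] * entity_vecs[e]
    for w, (e2, r) in zip(weights[1:], neighbors):
        vec += w * (entity_vecs[e2] + relation_vecs[r])
    return vec
```

The mixture vector could then replace the raw entity vector inside the TransE score, which is the general idea the abstract describes.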
Original language: English
Title of host publication: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning (CoNLL)
Place of publication: Stroudsburg, PA
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 11
ISBN (electronic): 9781945626197
Publication status: Published - 2016
Event: The SIGNLL Conference on Computational Natural Language Learning (20th : 2016) - Berlin, Germany
Duration: 11 Aug 2016 - 12 Aug 2016
Conference number: 20th


Conference: The SIGNLL Conference on Computational Natural Language Learning (20th : 2016)
Abbreviated title: CoNLL 2016

Bibliographical note

Copyright the Publisher 2016. Version archived for private and non-commercial use with the permission of the author/s and according to publisher conditions. For further rights please contact the publisher.


Keywords:

  • Knowledge base completion
  • embedding model
  • mixture model
  • link prediction
  • triple classification
  • entity prediction
  • relation prediction

