Knowledge bases are useful resources for many natural language processing tasks; however, they are far from complete. In this paper, we define a novel entity representation as a mixture of its neighborhood in the knowledge base and apply this technique to TransE—a well-known embedding model for knowledge base completion. Experimental results show that the neighborhood information significantly improves the results of TransE, leading to better performance than other state-of-the-art embedding models on three benchmark datasets for the triple classification, entity prediction and relation prediction tasks.
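The idea in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: TransE scores a triple (h, r, t) by the distance ||h + r - t||, and the neighborhood-mixture idea replaces an entity's embedding with a blend of its own vector and translated neighbor vectors. The toy entities, relations, and the uniform mixture weight `alpha` below are all hypothetical; the paper learns the mixture weights.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4

# Toy random embeddings for illustration only (hypothetical entities/relations).
entities = {e: rng.normal(size=dim) for e in ["paris", "france", "berlin", "germany"]}
relations = {r: rng.normal(size=dim) for r in ["capital_of"]}

def transe_score(h, r, t):
    """TransE plausibility score: lower ||h + r - t|| means a more plausible triple."""
    return float(np.linalg.norm(entities[h] + relations[r] - entities[t]))

def mixture_embedding(e, neighbors, alpha=0.5):
    """Neighborhood-mixture entity vector: blend the entity's own embedding with
    translated neighbor embeddings. Weights here are uniform for illustration;
    the paper learns them from data."""
    if not neighbors:
        return entities[e]
    # Each neighbor (t, r) contributes the translated vector t - r.
    translated = [entities[t] - relations[r] for (t, r) in neighbors]
    return alpha * entities[e] + (1 - alpha) * np.mean(translated, axis=0)
```

For example, `mixture_embedding("paris", [("france", "capital_of")])` returns a vector that pulls the representation of `paris` toward what its neighborhood implies it should be.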
Title of host publication: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning (CoNLL)
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 11
Publication status: Published - 2016
Event: The SIGNLL Conference on Computational Natural Language Learning (20th : 2016) - Berlin, Germany
Duration: 11 Aug 2016 → 12 Aug 2016
Conference number: 20th
Conference: The SIGNLL Conference on Computational Natural Language Learning (20th : 2016)
Abbreviated title: CoNLL 2016
Period: 11/08/16 → 12/08/16
Bibliographical note: Copyright the Publisher 2016. Version archived for private and non-commercial use with the permission of the author/s and according to publisher conditions. For further rights please contact the publisher.
- knowledge base completion
- embedding model
- mixture model
- link prediction
- triple classification
- entity prediction
- relation prediction
Nguyen, D. Q., Sirts, K., Qu, L., & Johnson, M. (2016). Neighborhood mixture model for knowledge base completion. In Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning (CoNLL) (pp. 40-50). Association for Computational Linguistics (ACL).