Abstract
Link prediction based on knowledge graph embeddings (KGE) aims to predict new triples so that knowledge graphs (KGs) can be constructed automatically. However, recent KGE models achieve performance improvements by excessively increasing the embedding dimensions, which incurs enormous training costs and extra storage space. In this paper, instead of training high-dimensional models, we propose MulDE, a novel knowledge distillation framework that includes multiple low-dimensional hyperbolic KGE models as teachers and two student components, namely Junior and Senior. Under a novel iterative distillation strategy, the Junior component, a low-dimensional KGE model, actively queries the teachers based on its preliminary prediction results, and the Senior component adaptively integrates the teachers' knowledge to train the Junior component via two mechanisms: relation-specific scaling and contrast attention. Experimental results show that MulDE effectively improves both the performance and the training speed of low-dimensional KGE models. The distilled 32-dimensional model is competitive with state-of-the-art high-dimensional methods on several widely used datasets.
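To make the distillation pipeline concrete, below is a minimal sketch (in PyTorch) of how a Senior-style component might pool several teachers' candidate scores into soft labels for the Junior model. This is an illustration under stated assumptions, not the authors' released code: the function names, the similarity-based teacher weighting, and the KL-based loss are simplified stand-ins for the paper's relation-specific scaling and contrast attention mechanisms.

```python
import torch
import torch.nn.functional as F

def senior_soft_labels(teacher_scores, junior_scores, temperature=1.0):
    """Blend multiple teachers' triple scores into one soft target.

    teacher_scores: (num_teachers, batch, num_candidates) scores from the
                    pre-trained low-dimensional teacher KGE models.
    junior_scores:  (batch, num_candidates) the Junior model's scores, used
                    as the query for a simplified, contrast-style attention.
    """
    # Weight each teacher by how well its scores align with the Junior's
    # preliminary predictions (a stand-in for the paper's contrast attention).
    sim = torch.einsum('tbc,bc->tb', teacher_scores, junior_scores)
    attn = F.softmax(sim / temperature, dim=0)  # (num_teachers, batch)
    # Weighted ensemble of teacher scores -> soft labels for distillation.
    return torch.einsum('tb,tbc->bc', attn, teacher_scores)

def distillation_loss(junior_scores, soft_labels, temperature=2.0):
    """Standard KL-divergence soft-label distillation loss."""
    p_teacher = F.softmax(soft_labels / temperature, dim=-1)
    log_p_junior = F.log_softmax(junior_scores / temperature, dim=-1)
    return F.kl_div(log_p_junior, p_teacher, reduction='batchmean')

# Example shapes: 3 teachers, a batch of 2 queries, 5 candidate entities.
teachers = torch.randn(3, 2, 5)
junior = torch.randn(2, 5, requires_grad=True)
loss = distillation_loss(junior, senior_soft_labels(teachers, junior.detach()))
loss.backward()
```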
Original language | English |
---|---|
Title of host publication | The Web Conference 2021 |
Subtitle of host publication | Proceedings of the World Wide Web Conference, WWW 2021 |
Place of Publication | New York, NY |
Publisher | Association for Computing Machinery, Inc |
Pages | 1716-1726 |
Number of pages | 11 |
ISBN (Electronic) | 9781450383127 |
DOIs | |
Publication status | Published - 2021 |
Event | 2021 World Wide Web Conference, WWW 2021, Ljubljana, Slovenia, 19 Apr 2021 → 23 Apr 2021 |
Conference
Conference | 2021 World Wide Web Conference, WWW 2021 |
---|---|
Country/Territory | Slovenia |
City | Ljubljana |
Period | 19/04/21 → 23/04/21 |
Bibliographical note
Copyright the Publisher 2021. Version archived for private and non-commercial use with the permission of the author/s and according to publisher conditions. For further rights please contact the publisher.
Keywords
- Knowledge distillation
- Knowledge graph
- Knowledge graph embeddings
- Link prediction
Fingerprint
Dive into the research topics of 'MulDE: multi-teacher knowledge distillation for low-dimensional knowledge graph embeddings'. Together they form a unique fingerprint.
Projects
- What Can You Trust in the Large and Noisy Web?
  Sheng, M., Yang, J., Zhang, W. & Dustdar, S.
  1/05/20 → 30/04/23
  Project: Research