TY - JOUR
T1 - Complex query answering over knowledge graphs foundation model using region embeddings on a Lie group
AU - Zhou, Zhengyun
AU - Wan, Guojia
AU - Pan, Shirui
AU - Wu, Jia
AU - Hu, Wenbin
AU - Du, Bo
PY - 2024/5
Y1 - 2024/5
N2 - Answering complex queries with first-order logical operators, such as conjunction (∧), disjunction (∨), and negation (¬), over knowledge graphs is immensely useful for identifying missing knowledge. Recently, neural symbolic reasoning methods have been proposed to map entities and relations into a continuous real vector space and to model logical operators as differentiable neural networks. However, traditional methods employ negative sampling, which corrupts complex queries to train embeddings. Consequently, these embeddings are susceptible to divergence in the open manifold of ℝⁿ, and appropriate regularization is crucial for addressing this divergence. In this paper, we introduce a Lie group as a compact embedding space for complex query embedding, enhancing the foundation model's ability to handle the intricacies of knowledge graphs. Our method aims to answer disjunctive and conjunctive queries. Entities and queries are represented as regions of a high-dimensional torus, where projection, intersection, union, and negation on the torus naturally simulate the corresponding logical operations. After applying these operations to the torus regions we define, the resulting geometry remains unchanged. Through extensive experiments on the FB15K, FB15K-237, and NELL995 datasets, our approach demonstrates significant improvements, leveraging the strengths of the knowledge graph foundation model and complex query processing.
AB - Answering complex queries with first-order logical operators, such as conjunction (∧), disjunction (∨), and negation (¬), over knowledge graphs is immensely useful for identifying missing knowledge. Recently, neural symbolic reasoning methods have been proposed to map entities and relations into a continuous real vector space and to model logical operators as differentiable neural networks. However, traditional methods employ negative sampling, which corrupts complex queries to train embeddings. Consequently, these embeddings are susceptible to divergence in the open manifold of ℝⁿ, and appropriate regularization is crucial for addressing this divergence. In this paper, we introduce a Lie group as a compact embedding space for complex query embedding, enhancing the foundation model's ability to handle the intricacies of knowledge graphs. Our method aims to answer disjunctive and conjunctive queries. Entities and queries are represented as regions of a high-dimensional torus, where projection, intersection, union, and negation on the torus naturally simulate the corresponding logical operations. After applying these operations to the torus regions we define, the resulting geometry remains unchanged. Through extensive experiments on the FB15K, FB15K-237, and NELL995 datasets, our approach demonstrates significant improvements, leveraging the strengths of the knowledge graph foundation model and complex query processing.
KW - Knowledge graph
KW - Complex logical reasoning
KW - Multi-hop reasoning
KW - Knowledge reasoning
UR - http://www.scopus.com/inward/record.url?scp=85189934465&partnerID=8YFLogxK
U2 - 10.1007/s11280-024-01254-7
DO - 10.1007/s11280-024-01254-7
M3 - Article
AN - SCOPUS:85189934465
SN - 1386-145X
VL - 27
JO - World Wide Web
JF - World Wide Web
IS - 3
M1 - 23
ER -