TY - GEN
T1 - Context-aware temporal knowledge graph embedding
AU - Liu, Yu
AU - Hua, Wen
AU - Xin, Kexuan
AU - Zhou, Xiaofang
PY - 2019/11/1
Y1 - 2019/11/1
N2 - Knowledge graph embedding (KGE) is an important technique for knowledge graph completion (KGC). However, knowledge in practice is time-variant, and many relations are valid only for a certain period of time. This phenomenon highlights the importance of temporal knowledge graph embeddings. Existing temporal KGE methods focus on only one aspect of facts, i.e., factual plausibility, while ignoring the other aspect, i.e., temporal consistency. Temporal consistency models the interactions between a fact and its contexts, and is thus able to capture fine-grained temporal relationships such as temporal order, temporal distance, and overlap. To determine the useful contexts for the fact to be predicted, we propose a two-way strategy for context selection. In particular, we decompose the target fact into two parts, relation and entities, and measure the usefulness of a context for each part respectively. Furthermore, we adopt deep neural networks to encode contexts and score temporal consistency, which is combined with factual plausibility to model a fact. Due to the incorporation of temporal information and the interactions between facts and contexts, our model learns more representative embeddings for temporal KGs. We conduct extensive experiments on real-world datasets, and the experimental results verify the effectiveness of our proposals.
AB - Knowledge graph embedding (KGE) is an important technique for knowledge graph completion (KGC). However, knowledge in practice is time-variant, and many relations are valid only for a certain period of time. This phenomenon highlights the importance of temporal knowledge graph embeddings. Existing temporal KGE methods focus on only one aspect of facts, i.e., factual plausibility, while ignoring the other aspect, i.e., temporal consistency. Temporal consistency models the interactions between a fact and its contexts, and is thus able to capture fine-grained temporal relationships such as temporal order, temporal distance, and overlap. To determine the useful contexts for the fact to be predicted, we propose a two-way strategy for context selection. In particular, we decompose the target fact into two parts, relation and entities, and measure the usefulness of a context for each part respectively. Furthermore, we adopt deep neural networks to encode contexts and score temporal consistency, which is combined with factual plausibility to model a fact. Due to the incorporation of temporal information and the interactions between facts and contexts, our model learns more representative embeddings for temporal KGs. We conduct extensive experiments on real-world datasets, and the experimental results verify the effectiveness of our proposals.
KW - Knowledge graph embedding
KW - Temporal consistency
KW - Factual plausibility
KW - Context-aware embedding
UR - https://www.scopus.com/pages/publications/85077007217
U2 - 10.1007/978-3-030-34223-4_37
DO - 10.1007/978-3-030-34223-4_37
M3 - Conference proceeding contribution
SN - 9783030342227
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 583
EP - 598
BT - Web Information Systems Engineering – WISE 2019
A2 - Cheng, Reynold
A2 - Mamoulis, Nikos
A2 - Sun, Yizhou
A2 - Huang, Xin
PB - Springer, Springer Nature
CY - Switzerland
T2 - 20th International Conference on Web Information Systems Engineering, WISE 2019
Y2 - 19 January 2020 through 22 January 2020
ER -