THCN: A Hawkes Process Based Temporal Causal Convolutional Network for Extrapolation Reasoning in Temporal Knowledge Graphs

Cited by: 2
Authors
Chen, Tingxuan [1 ]
Long, Jun [2 ]
Wang, Zidong [1 ]
Luo, Shuai
Huang, Jincai [2 ]
Yang, Liu [1 ]
Affiliations
[1] Cent South Univ, Sch Comp Sci & Engn, Changsha 410083, Hunan, Peoples R China
[2] Cent South Univ, Big Data Inst, Changsha 410083, Hunan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Cognition; Extrapolation; Knowledge graphs; History; Reinforcement learning; Recurrent neural networks; Measurement; Mathematical models; Kernel; Interpolation; Deep learning; extrapolation reasoning; Hawkes process; link prediction; temporal knowledge graph;
DOI
10.1109/TKDE.2024.3474051
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Code
081104; 0812; 0835; 1405;
Abstract
Temporal Knowledge Graphs (TKGs) serve as indispensable tools for storing and reasoning over dynamic facts. However, predicting future facts in TKGs is a formidable challenge because future facts are inherently unknowable. Existing temporal reasoning models rely on fact recurrence and periodicity, leading to information degradation over prolonged temporal evolution. In particular, the occurrence of one fact may influence the likelihood of another. To this end, we propose THCN, a novel Temporal Causal Convolutional Network based on Hawkes processes, designed for temporal reasoning under the extrapolation setting. Specifically, THCN harnesses a temporal causal convolutional network with dilation factors to capture historical dependencies among facts spanning diverse time intervals. We then construct a conditional intensity function based on Hawkes processes to fit the likelihood of fact occurrence. Importantly, THCN pioneers a dual-level dynamic modeling mechanism, enabling the simultaneous capture of the collective features of nodes and the individual characteristics of facts. Extensive experiments on six real-world TKG datasets demonstrate that our method significantly outperforms the state of the art across all four evaluation metrics, indicating that THCN is well suited to extrapolation reasoning in TKGs.
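The abstract's conditional intensity function builds on the standard Hawkes-process formulation, in which each past event raises the instantaneous likelihood of future events. The following is a minimal illustrative sketch of that general idea only; the exponential kernel and the parameter names (`base_rate`, `alpha`, `beta`) are textbook Hawkes conventions, not details taken from THCN itself:

```python
import math

def hawkes_intensity(t, base_rate, history, alpha, beta):
    """Conditional intensity of a univariate Hawkes process with an
    exponential decay kernel:

        lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))

    Events further in the past contribute exponentially less excitation,
    mirroring how older facts carry fading influence on new ones.
    """
    excitation = sum(alpha * math.exp(-beta * (t - t_i))
                     for t_i in history if t_i < t)
    return base_rate + excitation

# Hypothetical timestamps at which a fact was previously observed.
events = [1.0, 2.5, 4.0]
rate = hawkes_intensity(5.0, base_rate=0.2, history=events, alpha=0.8, beta=1.0)
```

With no history the intensity reduces to the base rate, and it decays back toward that base rate as time since the last event grows; THCN replaces this hand-set kernel with representations learned by its dilated causal convolutions.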
Pages: 9374-9387
Page count: 14
References
47 in total