Event Sparse Net: Sparse Dynamic Graph Multi-representation Learning with Temporal Attention for Event-Based Data

Cited by: 0
Authors
Li, Dan [1 ]
Huang, Teng [1 ]
Hong, Jie [1 ]
Hong, Yile [1 ]
Wang, Jiaqi [1 ]
Wang, Zhen [2 ]
Zhang, Xi [3 ,4 ]
Affiliations
[1] Guangzhou Univ, Inst Artificial Intelligence & Blockchain, Guangzhou, Peoples R China
[2] Zhejiang Lab, Kechuang Ave, Hangzhou, Zhejiang, Peoples R China
[3] Sun Yat Sen Univ, Sch Arts, Guangzhou, Peoples R China
[4] Univ Colorado Boulder, Coll Mus, Boulder, CO 80309 USA
Source
PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT IX | 2024 / Vol. 14433
Funding
National Natural Science Foundation of China;
Keywords
dynamic graph representations; self-attention mechanism; light sparse temporal model; link prediction;
DOI
10.1007/978-981-99-8546-3_17
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph-structured data is widely used for modeling and representation learning, with dynamic graph neural networks being a popular choice. However, existing approaches to dynamic representation learning suffer either from discrete learning, which loses temporal information, or from continuous learning, which entails significant computational burdens. To address these issues, we propose an innovative dynamic graph neural network called Event Sparse Net (ESN). By adaptively encoding temporal information into snapshots so that each snapshot contains an identical amount of temporal structure, our approach achieves continuous and precise time encoding while avoiding the potential information loss of snapshot-based methods. Additionally, we introduce a lightweight module, Global Temporal Attention, for computing node representations based on temporal dynamics and structural neighborhoods. By simplifying fully-connected attention fusion, our approach significantly reduces computational cost compared to the currently best-performing methods. We evaluate our methodology on four continuous/discrete graph datasets for link prediction to assess its effectiveness. In comparison experiments with state-of-the-art baseline models, ESN achieves competitive performance with faster inference speed.
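The abstract describes two ideas: snapshots that each carry an identical amount of temporal structure, and a lightweight temporal attention over snapshot-level node embeddings. The sketch below is a hypothetical illustration of one possible reading, not the authors' implementation: it balances snapshots by event count and applies plain scaled dot-product self-attention across the snapshot axis; the function names, shapes, and parameters are illustrative assumptions.

```python
# Hypothetical sketch: event-count-balanced snapshots plus a simple temporal
# self-attention over per-snapshot node embeddings. Names such as
# `partition_events` and `temporal_attention` are illustrative, not from the paper.
import numpy as np

def partition_events(timestamps, num_snapshots):
    """Split a timestamped event stream so every snapshot holds roughly the
    same number of events (one reading of 'identical amount of temporal
    structure in each snapshot')."""
    order = np.argsort(timestamps)                 # events in temporal order
    return np.array_split(order, num_snapshots)    # list of event-index arrays

def temporal_attention(H, W_q, W_k, W_v):
    """Scaled dot-product self-attention across snapshots.
    H: (T, d) per-snapshot embeddings of one node."""
    Q, K, V = H @ W_q, H @ W_k, H @ W_v
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                              # (T, d) temporally fused embeddings

# Toy usage with random data.
rng = np.random.default_rng(0)
ts = rng.uniform(0, 100, size=1000)                 # event timestamps
snapshots = partition_events(ts, num_snapshots=8)   # ~125 events per snapshot
H = rng.normal(size=(8, 16))                        # per-snapshot node embeddings
W_q, W_k, W_v = (rng.normal(size=(16, 16)) * 0.1 for _ in range(3))
Z = temporal_attention(H, W_q, W_k, W_v)
print(len(snapshots[0]), Z.shape)
```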
Pages: 208-219
Page count: 12