Contrastive representation learning on dynamic networks

Cited: 2
Authors
Jiao, Pengfei [1 ,3 ]
Chen, Hongjiang [1 ]
Tang, Huijun [1 ]
Bao, Qing [1 ]
Zhang, Long [2 ]
Zhao, Zhidong [1 ,3 ]
Wu, Huaming [4 ]
Affiliations
[1] Hangzhou Dianzi Univ, Sch Cyberspace, Hangzhou 310018, Peoples R China
[2] Tianjin Univ, Coll Intelligence & Comp, Tianjin 300350, Peoples R China
[3] Data Secur Governance Zhejiang Engn Res Ctr, Hangzhou 310018, Peoples R China
[4] Tianjin Univ, Ctr Appl Math, Tianjin 300072, Peoples R China
Keywords
Dynamic network; Contrastive learning; Mutual information; Representation learning
DOI
10.1016/j.neunet.2024.106240
Chinese Library Classification
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Representation learning for dynamic networks aims to learn low-dimensional node embeddings that preserve the snapshot structure, properties, and temporal evolution of dynamic networks. However, current dynamic network representation learning methods tend to focus on estimating or generating observed snapshot structures, paying excessive attention to network details and disregarding distinctions between snapshots separated by larger time intervals, which makes them less robust on sparse or noisy networks. To alleviate these challenges, and inspired by the success of contrastive learning in visual and static network representation learning, this paper proposes a contrastive mechanism for temporal representation learning on dynamic networks: a novel Dynamic Network Contrastive representation Learning (DNCL) model. Specifically, contrastive objective functions are constructed from intra-snapshot and inter-snapshot contrasts, which capture the network topology and node feature information on the one hand and the network evolution information on the other. Rather than estimating or generating ground-truth network features, the proposed approach maximizes mutual information between nodes from different time steps and generated views. Experimental results on link prediction, node classification, and clustering over several real-world and synthetic networks demonstrate the superiority of DNCL over state-of-the-art methods, indicating the effectiveness of the proposed approach for dynamic network representation learning.
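The intra- and inter-snapshot contrasts described in the abstract are commonly instantiated as InfoNCE-style lower bounds on mutual information, where a node's embedding in one view (or snapshot) must identify its counterpart in another view among all other nodes. The sketch below is a generic, minimal NumPy illustration of such an objective, not the paper's exact loss; the function name, temperature value, and toy data are illustrative assumptions.

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """InfoNCE loss between two embedding matrices (one row per node).

    Row i of z1 and row i of z2 form the positive pair (same node, two
    views or two snapshots); all other rows serve as negatives.
    """
    # Cosine similarity: normalize rows, then take scaled dot products.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = (z1 @ z2.T) / tau
    # Log-softmax over each row, with max-shift for numerical stability.
    sim = sim - sim.max(axis=1, keepdims=True)
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # Positives sit on the diagonal; minimize their negative log-likelihood.
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
z_t = rng.normal(size=(8, 16))                    # embeddings at snapshot t
z_t1 = z_t + 0.05 * rng.normal(size=(8, 16))      # slightly evolved snapshot t+1
z_unrelated = rng.normal(size=(8, 16))            # embeddings with no correspondence

loss_aligned = info_nce(z_t, z_t1)        # small: positives dominate
loss_random = info_nce(z_t, z_unrelated)  # near log(N): no pair stands out
```

Because the toy snapshot t+1 is a small perturbation of snapshot t, the aligned loss is well below the loss against unrelated embeddings, which hovers around log(N) for N nodes; this gap is exactly what the contrastive objective exploits to keep temporally adjacent embeddings of the same node close.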
Pages: 12