Dynamic Representation Learning via Recurrent Graph Neural Networks

Cited by: 11
Authors
Zhang, Chun-Yang [1 ]
Yao, Zhi-Liang [1 ]
Yao, Hong-Yu [1 ]
Huang, Feng [2 ]
Chen, C. L. Philip [3 ]
Affiliations
[1] Fuzhou Univ, Sch Comp & Data Sci, Fuzhou 350025, Peoples R China
[2] Fuzhou Univ, Sch Mech Engn & Automat, Fuzhou 350025, Peoples R China
[3] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510006, Guangdong, Peoples R China
Source
IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS | 2023, Vol. 53, Issue 02
Funding
National Natural Science Foundation of China;
Keywords
Representation learning; Recurrent neural networks; Matrix decomposition; Feature extraction; Computational modeling; Biological system modeling; Task analysis; Dynamic graphs; graph neural networks (GNNs); graph representation learning (GRL); node embeddings; recurrent neural network (RNN); COMMUNITY;
DOI
10.1109/TSMC.2022.3196506
Chinese Library Classification (CLC) number
TP [Automation technology; computer technology];
Discipline classification code
0812;
Abstract
A large number of real-world systems generate graphs, structured data organized as nodes and edges. In many scenarios these graphs are dynamic, with nodes or edges evolving over time. Recently, graph representation learning (GRL) has achieved great success in network analysis; it aims to produce informative and representative features or low-dimensional embeddings by exploring node attributes and network topology. Most state-of-the-art models for dynamic GRL are composed of a static representation learning model and a recurrent neural network (RNN): the former generates the representations of a graph or its nodes from one static graph at a discrete time step, while the latter captures the temporal correlation between adjacent graphs. However, this two-stage design ignores the temporal dynamics between contiguous graphs during the learning process of graph representations. To alleviate this problem, this article proposes a representation learning model for dynamic graphs, called DynGNN. Unlike two-stage approaches, it is a single-stage model that embeds an RNN into a graph neural network to produce better representations in a compact form. It accounts for the fusion of temporal and topological correlations from low-level to high-level feature learning, enabling the model to capture more fine-grained evolving patterns. Experimental results on both synthetic and real-world networks show that the proposed DynGNN yields significant improvements in multiple tasks compared to state-of-the-art counterparts.
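To make the single-stage idea concrete, below is a minimal sketch, assuming a PyTorch setting with dense normalized adjacency matrices per snapshot and a GRU cell as the recurrent unit: each layer performs one round of message passing on the current snapshot and immediately fuses the result with that layer's node-wise hidden state from the previous time step, so temporal and topological information mix at every feature level. The class names (RecurrentGraphLayer, DynGNNSketch) and the specific GCN-plus-GRU combination are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn


class RecurrentGraphLayer(nn.Module):
    """Fuse one graph-convolution step with a GRU cell (illustrative sketch,
    not the authors' code): topology and time are mixed inside the layer."""

    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid_dim)
        self.gru = nn.GRUCell(hid_dim, hid_dim)

    def forward(self, x, adj_norm, h_prev):
        # Message passing on the current snapshot: A_hat @ X @ W.
        msg = torch.relu(adj_norm @ self.lin(x))
        # Recurrent fusion with each node's hidden state from the last time step.
        return self.gru(msg, h_prev)


class DynGNNSketch(nn.Module):
    """Stack recurrent graph layers and unroll them over graph snapshots."""

    def __init__(self, in_dim: int, hid_dim: int, num_layers: int = 2):
        super().__init__()
        dims = [in_dim] + [hid_dim] * num_layers
        self.layers = nn.ModuleList(
            [RecurrentGraphLayer(d, hid_dim) for d in dims[:-1]]
        )
        self.hid_dim = hid_dim

    def forward(self, snapshots):
        # snapshots: list of (node_features [N, F], normalized adjacency [N, N]),
        # one pair per discrete time step, in chronological order.
        n = snapshots[0][0].size(0)
        hidden = [snapshots[0][0].new_zeros(n, self.hid_dim) for _ in self.layers]
        for x, adj in snapshots:                  # iterate over time steps
            out = x
            for i, layer in enumerate(self.layers):
                hidden[i] = layer(out, adj, hidden[i])  # per-layer node states
                out = hidden[i]
        return out                                # final node embeddings [N, hid_dim]


if __name__ == "__main__":
    # Smoke test: three random snapshots of a 5-node graph with 8-dim features
    # (identity matrices stand in for normalized adjacencies).
    snaps = [(torch.randn(5, 8), torch.eye(5)) for _ in range(3)]
    print(DynGNNSketch(in_dim=8, hid_dim=16)(snaps).shape)  # torch.Size([5, 16])
```

In a two-stage baseline, the recurrence would instead be applied only to the final embeddings produced by a separately run static GNN; placing the GRU inside each layer, as above, is what lets low-level features also carry temporal context.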
Pages: 1284 - 1297
Number of pages: 14
Related papers
50 records in total
  • [1] Dynamic Graph Representation Learning With Neural Networks: A Survey
    Yang, Leshanshui
    Chatelain, Clement
    Adam, Sebastien
    IEEE ACCESS, 2024, 12 : 43460 - 43484
  • [2] DRRNets: Dynamic Recurrent Routing via Low-Rank Regularization in Recurrent Neural Networks
    Shan, Dongjing
    Luo, Yong
    Zhang, Xiongwei
    Zhang, Chao
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (04) : 2057 - 2067
  • [3] A Novel Representation Learning for Dynamic Graphs Based on Graph Convolutional Networks
    Gao, Chao
    Zhu, Junyou
    Zhang, Fan
    Wang, Zhen
    Li, Xuelong
    IEEE TRANSACTIONS ON CYBERNETICS, 2023, 53 (06) : 3599 - 3612
  • [4] Dynamic Graph Representation Learning via Coupling-Process Model
    Duan, Pingtao
    Zhou, Chuan
    Liu, Yuting
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (09) : 12383 - 12395
  • [5] Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation
    Zheng, Tongya
    Wang, Xinchao
    Feng, Zunlei
    Song, Jie
    Hao, Yunzhi
    Song, Mingli
    Wang, Xingen
    Wang, Xinyu
    Chen, Chun
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (10) : 10151 - 10165
  • [6] Multiple Kernel Representation Learning on Networks
    Celikkanat, Abdulkadir
    Shen, Yanning
    Malliaros, Fragkiskos D.
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (06) : 6113 - 6125
  • [7] FL-GNNs: Robust Network Representation via Feature Learning Guided Graph Neural Networks
    Wang, Beibei
    Jiang, Bo
    Ding, Chris
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (01) : 750 - 760
  • [8] DySAT: Deep Neural Representation Learning on Dynamic Graphs via Self-Attention Networks
    Sankar, Aravind
    Wu, Yanhong
    Gou, Liang
    Zhang, Wei
    Yang, Hao
    PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING (WSDM '20), 2020, : 519 - 527
  • [9] Generative Representation Learning in Recurrent Neural Networks for Causal Timeseries Forecasting
    Chatziparaskevas, Georgios
    Mademlis, Ioannis
    Pitas, Ioannis
    IEEE TRANSACTIONS ON ARTIFICIAL INTELLIGENCE, 2024, 5 (12) : 6412 - 6425
  • [10] Learning Deep Graph Representations via Convolutional Neural Networks
    Ye, Wei
    Askarisichani, Omid
    Jones, Alex
    Singh, Ambuj
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2022, 34 (05) : 2268 - 2279