Dynamic Representation Learning via Recurrent Graph Neural Networks

Cited by: 11
Authors
Zhang, Chun-Yang [1 ]
Yao, Zhi-Liang [1 ]
Yao, Hong-Yu [1 ]
Huang, Feng [2 ]
Chen, C. L. Philip [3 ]
Affiliations
[1] Fuzhou Univ, Sch Comp & Data Sci, Fuzhou 350025, Peoples R China
[2] Fuzhou Univ, Sch Mech Engn & Automat, Fuzhou 350025, Peoples R China
[3] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510006, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Representation learning; Recurrent neural networks; Matrix decomposition; Feature extraction; Computational modeling; Biological system modeling; Task analysis; Dynamic graphs; graph neural networks (GNNs); graph representation learning (GRL); node embeddings; recurrent neural network (RNN); COMMUNITY;
DOI
10.1109/TSMC.2022.3196506
CLC number
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
A large number of real-world systems generate graphs: structured data organized as nodes and edges. In many scenarios these graphs are dynamic, with nodes or edges evolving over time. Recently, graph representation learning (GRL) has achieved great success in network analysis; it aims to produce informative, representative features or low-dimensional embeddings by exploiting node attributes and network topology. Most state-of-the-art models for dynamic GRL combine a static representation learning model with a recurrent neural network (RNN): the former generates the representations of a graph or its nodes from one static snapshot at each discrete time step, while the latter captures the temporal correlations between adjacent snapshots. However, this two-stage design ignores the temporal dynamics between contiguous graphs during the learning of graph representations. To alleviate this problem, this article proposes a representation learning model for dynamic graphs, called DynGNN. In contrast to two-stage approaches, it is a single-stage model that embeds an RNN into a graph neural network to produce better representations in a compact form. It accounts for the fusion of temporal and topological correlations from low-level to high-level feature learning, enabling the model to capture more fine-grained evolving patterns. Experimental results on both synthetic and real-world networks show that the proposed DynGNN yields significant improvements over state-of-the-art counterparts on multiple tasks.
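The single-stage idea the abstract describes (graph aggregation and recurrent state update fused at every snapshot, rather than a static GRL model followed by a separate RNN) can be sketched roughly as below. This is a minimal illustrative assumption, not the paper's actual DynGNN architecture: the GCN-style normalization, the GRU parameterization, and all function names and sizes are placeholders.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def normalize_adj(A):
    # symmetrically normalized adjacency with self-loops: D^{-1/2}(A+I)D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    # standard GRU cell applied row-wise to per-node features
    z = sigmoid(x @ Wz + h @ Uz)            # update gate
    r = sigmoid(x @ Wr + h @ Ur)            # reset gate
    h_cand = np.tanh(x @ Wh + (r * h) @ Uh)
    return (1.0 - z) * h + z * h_cand

def dyn_gnn_embed(adjs, X, d_hid, rng):
    # Single-stage dynamic GRL sketch: at each snapshot the graph
    # aggregation feeds the recurrent update directly, so temporal and
    # topological correlations mix inside one model instead of being
    # handled by two separate stages.
    n, d_in = X.shape
    W = 0.1 * rng.standard_normal((d_in, d_hid))                 # GNN weight
    gru = [0.1 * rng.standard_normal((d_hid, d_hid)) for _ in range(6)]
    h = np.zeros((n, d_hid))                                     # recurrent node state
    for A in adjs:
        msg = np.tanh(normalize_adj(A) @ X @ W)                  # aggregation at time t
        h = gru_step(msg, h, *gru)                               # fused temporal update
    return h

# usage: 4 nodes, 3 snapshots of an evolving graph (one edge appears at t=2)
rng = np.random.default_rng(0)
A1 = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
A2 = A1.copy(); A2[0, 3] = A2[3, 0] = 1.0
X = rng.standard_normal((4, 5))
emb = dyn_gnn_embed([A1, A2, A2], X, d_hid=8, rng=rng)
print(emb.shape)  # (4, 8)
```

The design point the sketch illustrates is that the recurrent state `h` is updated per snapshot from freshly aggregated neighborhood messages, so evolving topology influences every level of the learned representation rather than only a post-hoc temporal smoothing stage.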
Pages: 1284-1297
Page count: 14