Dynamic Representation Learning via Recurrent Graph Neural Networks

Cited by: 11
Authors
Zhang, Chun-Yang [1 ]
Yao, Zhi-Liang [1 ]
Yao, Hong-Yu [1 ]
Huang, Feng [2 ]
Chen, C. L. Philip [3 ]
Affiliations
[1] Fuzhou Univ, Sch Comp & Data Sci, Fuzhou 350025, Peoples R China
[2] Fuzhou Univ, Sch Mech Engn & Automat, Fuzhou 350025, Peoples R China
[3] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510006, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Representation learning; Recurrent neural networks; Matrix decomposition; Feature extraction; Computational modeling; Biological system modeling; Task analysis; Dynamic graphs; graph neural networks (GNNs); graph representation learning (GRL); node embeddings; recurrent neural network (RNN); COMMUNITY;
DOI
10.1109/TSMC.2022.3196506
Chinese Library Classification (CLC)
TP [Automation and Computer Technology];
Discipline Classification Code
0812;
Abstract
A large number of real-world systems generate graphs, which are structured data composed of nodes and edges. In many scenarios, graphs are dynamic: nodes or edges keep evolving over time. Recently, graph representation learning (GRL) has achieved great success in network analysis; it aims to produce informative and representative features or low-dimensional embeddings by exploring node attributes and network topology. Most state-of-the-art models for dynamic GRL consist of a static representation learning model and a recurrent neural network (RNN). The former generates the representations of a graph or its nodes from one static snapshot at a discrete time step, while the latter captures the temporal correlation between adjacent snapshots. However, this two-stage design ignores the temporal dynamics between contiguous graphs during the learning of graph representations. To alleviate this problem, this article proposes a representation learning model for dynamic graphs, called DynGNN. In contrast, it is a single-stage model that embeds an RNN into a graph neural network to produce better representations in a compact form. It accounts for the fusion of temporal and topological correlations from low-level to high-level feature learning, enabling the model to capture more fine-grained evolving patterns. Experimental results on both synthetic and real-world networks show that the proposed DynGNN yields significant improvements in multiple tasks compared to state-of-the-art counterparts.
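The abstract contrasts two-stage pipelines (a static GRL encoder followed by a separate RNN) with a single-stage design that embeds a recurrent cell inside the graph neural network. The minimal PyTorch sketch below only illustrates that single-stage idea at a conceptual level; it is not the authors' implementation, and the module names (RecurrentGraphLayer, DynGNNSketch), the mean-aggregation scheme, and all dimensions are illustrative assumptions.

# Minimal sketch (not the paper's code): a GRU cell embedded inside each graph layer,
# so topology and temporal state are fused at every level of feature learning.
import torch
import torch.nn as nn


class RecurrentGraphLayer(nn.Module):
    """One graph layer whose node update is a GRU cell over graph snapshots."""

    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.neigh = nn.Linear(in_dim, hid_dim)   # transform aggregated neighbor features
        self.cell = nn.GRUCell(hid_dim, hid_dim)  # recurrence across consecutive snapshots

    def forward(self, x, adj, h):
        # x:   (N, in_dim)  node attributes of the current snapshot
        # adj: (N, N)       adjacency matrix of the current snapshot
        # h:   (N, hid_dim) hidden node states carried over from the previous snapshot
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        msg = torch.relu(self.neigh(adj @ x / deg))  # mean aggregation + linear transform
        return self.cell(msg, h)                     # fuse topology with temporal state


class DynGNNSketch(nn.Module):
    """Stack of recurrent graph layers applied to an ordered sequence of snapshots."""

    def __init__(self, in_dim: int, hid_dim: int, num_layers: int = 2):
        super().__init__()
        dims = [in_dim] + [hid_dim] * num_layers
        self.layers = nn.ModuleList(
            RecurrentGraphLayer(d_in, hid_dim) for d_in in dims[:-1]
        )

    def forward(self, snapshots):
        # snapshots: list of (x, adj) pairs ordered by time
        num_nodes = snapshots[0][0].shape[0]
        hid = [torch.zeros(num_nodes, layer.cell.hidden_size) for layer in self.layers]
        for x, adj in snapshots:
            inp = x
            for i, layer in enumerate(self.layers):
                hid[i] = layer(inp, adj, hid[i])
                inp = hid[i]
        return hid[-1]  # node embeddings after the last snapshot


if __name__ == "__main__":
    torch.manual_seed(0)
    snaps = [(torch.randn(5, 8), torch.eye(5)) for _ in range(3)]
    emb = DynGNNSketch(in_dim=8, hid_dim=16)(snaps)
    print(emb.shape)  # torch.Size([5, 16])

The point of the sketch is that the recurrent cell sits inside every graph layer, so hidden node states already mix neighborhood structure with history at each level of feature extraction, rather than only after a separate static encoder as in the two-stage designs the abstract criticizes.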
Pages: 1284 - 1297
Page count: 14
Related Papers
50 in total
  • [31] Two pages graph layout via recurrent multivalued neural networks
    Lopez-Rodriguez, Domingo
    Merida-Casermeiro, Enrique
    Ortiz-de-Lazcano-Lobato, Juan M.
    Galan-Marin, Gloria
    COMPUTATIONAL AND AMBIENT INTELLIGENCE, 2007, 4507 : 194 - +
  • [32] GAN-based deep neural networks for graph representation learning
    Zhao, Ming
    Zhang, Yinglong
    ENGINEERING REPORTS, 2022, 4 (11)
  • [33] Rethinking Higher-order Representation Learning with Graph Neural Networks
    Xu, Tuo
    Zou, Lei
    LEARNING ON GRAPHS CONFERENCE, VOL 231, 2023, 231
  • [34] Learning Effective Road Network Representation with Hierarchical Graph Neural Networks
    Wu, Ning
    Zhao, Wayne Xin
    Wang, Jingyuan
    Pan, Dayan
    KDD '20: PROCEEDINGS OF THE 26TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2020, : 6 - 14
  • [35] Hierarchical Representation Learning in Graph Neural Networks With Node Decimation Pooling
    Bianchi, Filippo Maria
    Grattarola, Daniele
    Livi, Lorenzo
    Alippi, Cesare
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (05) : 2195 - 2207
  • [36] Information Diffusion Prediction via Dynamic Graph Neural Networks
    Cao, Zongmai
    Han, Kai
    Zhu, Jianfu
    PROCEEDINGS OF THE 2021 IEEE 24TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN (CSCWD), 2021, : 1099 - 1104
  • [37] DySAT: Deep Neural Representation Learning on Dynamic Graphs via Self-Attention Networks
    Sankar, Aravind
    Wu, Yanhong
    Gou, Liang
    Zhang, Wei
    Yang, Hao
    PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING (WSDM '20), 2020, : 519 - 527
  • [38] Maneuver Prediction Using Traffic Scene Graphs via Graph Neural Networks and Recurrent Neural Networks
    Rama, Petrit
    Bajcinca, Naim
    INTERNATIONAL JOURNAL OF SEMANTIC COMPUTING, 2023, 17 (03) : 349 - 370
  • [39] A Novel Representation Learning for Dynamic Graphs Based on Graph Convolutional Networks
    Gao, Chao
    Zhu, Junyou
    Zhang, Fan
    Wang, Zhen
    Li, Xuelong
    IEEE TRANSACTIONS ON CYBERNETICS, 2023, 53 (06) : 3599 - 3612
  • [40] Generative Representation Learning in Recurrent Neural Networks for Causal Timeseries Forecasting
    Chatziparaskevas, Georgios
    Mademlis, Ioannis
    Pitas, Ioannis
IEEE TRANSACTIONS ON ARTIFICIAL INTELLIGENCE, 2024, 5 (12) : 6412 - 6425