Dynamic Representation Learning via Recurrent Graph Neural Networks

Cited by: 11
Authors
Zhang, Chun-Yang [1 ]
Yao, Zhi-Liang [1 ]
Yao, Hong-Yu [1 ]
Huang, Feng [2 ]
Chen, C. L. Philip [3 ]
Affiliations
[1] Fuzhou Univ, Sch Comp & Data Sci, Fuzhou 350025, Peoples R China
[2] Fuzhou Univ, Sch Mech Engn & Automat, Fuzhou 350025, Peoples R China
[3] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510006, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Representation learning; Recurrent neural networks; Matrix decomposition; Feature extraction; Computational modeling; Biological system modeling; Task analysis; Dynamic graphs; graph neural networks (GNNs); graph representation learning (GRL); node embeddings; recurrent neural network (RNN); COMMUNITY;
DOI
10.1109/TSMC.2022.3196506
CLC number
TP [Automation Technology, Computer Technology];
Subject classification code
0812;
Abstract
A large number of real-world systems generate graphs, i.e., structured data composed of nodes and edges. In many scenarios these graphs are dynamic, with nodes or edges evolving over time. Recently, graph representation learning (GRL) has achieved great success in network analysis; it aims to produce informative, representative features or low-dimensional embeddings by exploiting node attributes and network topology. Most state-of-the-art models for dynamic GRL are composed of a static representation learning model and a recurrent neural network (RNN). The former generates the representations of a graph or its nodes from one static snapshot at a discrete time step, while the latter captures the temporal correlation between adjacent snapshots. However, this two-stage design ignores the temporal dynamics between contiguous graphs during the learning process of graph representations. To alleviate this problem, this article proposes a representation learning model for dynamic graphs, called DynGNN. In contrast, it is a single-stage model that embeds an RNN into a graph neural network to produce better representations in a compact form. It accounts for the fusion of temporal and topological correlations from low-level to high-level feature learning, enabling the model to capture more fine-grained evolving patterns. Experimental results on both synthetic and real-world networks show that the proposed DynGNN yields significant improvements in multiple tasks over state-of-the-art counterparts.
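The single-stage idea described in the abstract, fusing recurrence and graph convolution within one cell rather than stacking a GNN and an RNN sequentially, can be illustrated with a minimal sketch. This is not the authors' actual DynGNN implementation; it is a generic GRU-style graph recurrent cell (all class, function, and variable names here are hypothetical) in which each gate aggregates over the graph, so temporal and topological correlations are mixed at every feature level:

```python
import numpy as np

rng = np.random.default_rng(0)

def graph_conv(A_hat, X, W):
    # One propagation step: aggregate neighbor features, then project.
    return A_hat @ X @ W

class GraphGRUCell:
    """GRU cell whose update, reset, and candidate transforms are graph
    convolutions, so recurrence and topology are fused in a single stage."""
    def __init__(self, in_dim, hid_dim):
        init = lambda *shape: rng.normal(0.0, 0.1, shape)
        self.Wz, self.Uz = init(in_dim, hid_dim), init(hid_dim, hid_dim)
        self.Wr, self.Ur = init(in_dim, hid_dim), init(hid_dim, hid_dim)
        self.Wh, self.Uh = init(in_dim, hid_dim), init(hid_dim, hid_dim)

    def step(self, A_hat, X, H):
        sig = lambda v: 1.0 / (1.0 + np.exp(-v))
        Z = sig(graph_conv(A_hat, X, self.Wz) + graph_conv(A_hat, H, self.Uz))
        R = sig(graph_conv(A_hat, X, self.Wr) + graph_conv(A_hat, H, self.Ur))
        H_cand = np.tanh(graph_conv(A_hat, X, self.Wh)
                         + graph_conv(A_hat, R * H, self.Uh))
        return Z * H + (1.0 - Z) * H_cand  # gated blend of old and new state

# Toy dynamic graph: 4 nodes on a chain, 3 snapshots of node features.
n, d, h = 4, 5, 8
A = np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)  # chain with self-loops
A_hat = A / A.sum(axis=1, keepdims=True)          # row-normalized adjacency
cell = GraphGRUCell(d, h)
H = np.zeros((n, h))                              # initial hidden state
for t in range(3):
    X_t = rng.normal(size=(n, d))                 # node features at step t
    H = cell.step(A_hat, X_t, H)                  # fused spatio-temporal update
print(H.shape)  # (4, 8): one embedding per node
```

A two-stage baseline would instead run a full static GNN on each snapshot and only afterwards pass the pooled outputs through an ordinary RNN; in the sketch above the graph aggregation happens inside every gate of every recurrent step, which is the kind of low-level fusion the abstract attributes to DynGNN.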
Pages: 1284-1297
Page count: 14
Related papers
50 records in total
  • [11] Graph partitioning via recurrent multivalued neural networks
    Mérida-Casermeiro, E
    López-Rodríguez, D
    COMPUTATIONAL INTELLIGENCE AND BIOINSPIRED SYSTEMS, PROCEEDINGS, 2005, 3512 : 1149 - 1156
  • [12] Understanding the Representation Power of Graph Neural Networks in Learning Graph Topology
    Dehmamy, Nima
    Barabasi, Albert-Laszlo
    Yu, Rose
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [13] Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation
    Zheng, Tongya
    Wang, Xinchao
    Feng, Zunlei
    Song, Jie
    Hao, Yunzhi
    Song, Mingli
    Wang, Xingen
    Wang, Xinyu
    Chen, Chun
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (10) : 10151 - 10165
  • [14] DyVGRNN: DYnamic mixture Variational Graph Recurrent Neural Networks
    Niknam, Ghazaleh
    Molaei, Soheila
    Zare, Hadi
    Pan, Shirui
    Jalili, Mahdi
    Zhu, Tingting
    Clifton, David
    NEURAL NETWORKS, 2023, 165 : 596 - 610
  • [15] Interpretable Clustering on Dynamic Graphs with Recurrent Graph Neural Networks
    Yao, Yuhang
    Joe-Wong, Carlee
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 4608 - 4616
  • [16] Graph Neural Networks with Information Anchors for Node Representation Learning
    Liu, Chao
    Li, Xinchuan
    Zhao, Dongyang
    Guo, Shaolong
    Kang, Xiaojun
    Dong, Lijun
    Yao, Hong
    MOBILE NETWORKS & APPLICATIONS, 2022, 27 (01): : 315 - 328
  • [18] Representation Learning of Histopathology Images using Graph Neural Networks
    Adnan, Mohammed
    Kalra, Shivam
    Tizhoosh, Hamid R.
    2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW 2020), 2020, : 4254 - 4261
  • [19] Robust Graph Neural Networks via Ensemble Learning
    Lin, Qi
    Yu, Shuo
    Sun, Ke
    Zhao, Wenhong
    Alfarraj, Osama
    Tolba, Amr
    Xia, Feng
    MATHEMATICS, 2022, 10 (08)
  • [20] Streaming Graph Neural Networks via Continual Learning
    Wang, Junshan
    Song, Guojie
    Wu, Yi
    Wang, Liang
    CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, 2020, : 1515 - 1524