IMCGNN: Information Maximization based Continual Graph Neural Networks for inductive node classification

Cited by: 1
Authors
Yuan, Qiao [1 ,2 ]
Guan, Sheng-Uei [2 ]
Luo, Tianlun [1 ,2 ]
Man, Ka Lok [2 ]
Lim, Eng Gee [2 ]
Affiliations
[1] Univ Liverpool, Liverpool L69 3BX, England
[2] Xian Jiaotong Liverpool Univ, Suzhou 215123, Peoples R China
Keywords
Continual graph learning; Experience replay; Deep learning; HIPPOCAMPUS;
DOI
10.1016/j.neucom.2025.129362
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Continual graph learning (CGL) is an emerging topic that enables models to acquire new knowledge incrementally while retaining prior experience. It adapts efficiently to evolving dynamic graphs, avoiding the computational burden of retraining from scratch. The key distinction between CGL and conventional continual learning is that samples in graph-structured data are interdependent, whereas samples in conventional learning are independent. Consequently, CGL techniques should emphasize consolidating and leveraging the topological information in graph-structured data. Current methods address this need inadequately. Some approaches ignore topological information entirely, resulting in significant information loss; others attempt to preserve all learned information, yielding overly conservative models. Moreover, most of these methods employ graph neural networks (GNNs) as the base model yet fail to fully exploit the topological information the GNNs learn. Additionally, the majority of existing work focuses on the transductive setting, leaving inductive continual graph learning scarcely explored. Our proposed Information Maximization based Continual Graph Neural Network (IMCGNN) targets inductive task-incremental node classification. It comprises a replay module and a regularization module: the former extracts representative subgraphs from previous data and trains them jointly with new data to retain historical experience, whereas the latter preserves topological and loss-related information encoded in the model by imposing elastic penalties on network parameters. Unlike heuristic node selection, our approach uses information theory to guide the selection of nodes forming a subgraph, aiming to preserve information better. Comparative experiments against nine baselines, using two graph learning models on five benchmark datasets, demonstrate the effectiveness and efficiency of our method.
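The abstract does not specify the paper's information-theoretic selection criterion, so the sketch below is only a rough, hypothetical illustration of the replay-buffer idea: it fills a fixed-size buffer by greedily picking the node whose feature vector is farthest from everything already selected (a max-min-distance diversity proxy standing in for the actual information-maximization score). The function name `select_replay_nodes` and the criterion itself are assumptions, not the authors' method.

```python
import math
import random

def select_replay_nodes(features, k, seed=0):
    """Greedily pick k diverse nodes as a replay buffer.

    Hypothetical stand-in for information-maximizing node selection:
    max-min-distance (farthest-point) sampling over node features.
    """
    rng = random.Random(seed)
    n = len(features)

    def dist(a, b):
        # Euclidean distance between two feature vectors.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    chosen = [rng.randrange(n)]  # seed the buffer with a random node
    # d[i] = distance from node i to its nearest already-chosen node
    d = [dist(f, features[chosen[0]]) for f in features]
    while len(chosen) < k:
        nxt = max(range(n), key=lambda i: d[i])  # farthest from buffer
        chosen.append(nxt)
        d = [min(d[i], dist(features[i], features[nxt])) for i in range(n)]
    return chosen

# Toy usage: 50 synthetic 4-dimensional node feature vectors,
# keep an 8-node replay buffer.
feats = [[(i * 7 + j * 3) % 11 / 10.0 for j in range(4)] for i in range(50)]
buffer_ids = select_replay_nodes(feats, k=8, seed=0)
```

In the paper's setting the retained nodes would additionally carry their induced subgraph structure for joint training with new tasks; this sketch only covers the selection step.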
Pages: 11
Related Papers (50 total)
  • [21] GraphSMOTE: Imbalanced Node Classification on Graphs with Graph Neural Networks
    Zhao, Tianxiang
    Zhang, Xiang
    Wang, Suhang
    WSDM '21: PROCEEDINGS OF THE 14TH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING, 2021, : 833 - 841
  • [22] Graph Neural Networks with Information Anchors for Node Representation Learning
    Liu, Chao
    Li, Xinchuan
    Zhao, Dongyang
    Guo, Shaolong
    Kang, Xiaojun
    Dong, Lijun
    Yao, Hong
    MOBILE NETWORKS & APPLICATIONS, 2022, 27 (01): : 315 - 328
  • [24] Sparse Structure Learning via Graph Neural Networks for Inductive Document Classification
    Piao, Yinhua
    Lee, Sangseon
    Lee, Dohoon
    Kim, Sun
    THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 11165 - 11173
  • [25] SLRNode: node similarity-based leading relationship representation layer in graph neural networks for node classification
    Xiang, Fuchuan
    Xiao, Yao
    Cen, Fenglin
    Xu, Ji
    JOURNAL OF SUPERCOMPUTING, 2025, 81 (05):
  • [26] Explanation-based Graph Neural Networks for Graph Classification
    Seo, Sangwoo
    Jung, Seungjun
    Kim, Changick
    2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2022, : 2836 - 2842
  • [27] Label as Equilibrium: A performance booster for Graph Neural Networks on node classification
    Luo, Yi
    Luo, Guangchun
    Duan, Guiduo
    Chen, Aiguo
    NEURAL NETWORKS, 2025, 186
  • [28] Disparity, Inequality, and Accuracy Tradeoffs in Graph Neural Networks for Node Classification
    Merchant, Arpit
    Castillo, Carlos
    PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2023, 2023, : 1818 - 1827
  • [29] Multi-view Heterogeneous Graph Neural Networks for Node Classification
    Zeng, Xi
    Lei, Fang-Yuan
    Wang, Chang-Dong
    Dai, Qing-Yun
    DATA SCIENCE AND ENGINEERING, 2024, 9 (03) : 294 - 308
  • [30] Global-local graph neural networks for node-classification
    Eliasof, Moshe
    Treister, Eran
    PATTERN RECOGNITION LETTERS, 184 : 103 - 110