Towards Few-Shot Self-explaining Graph Neural Networks

Cited by: 0
Authors
Peng, Jingyu [1 ]
Liu, Qi [1 ,2 ]
Yue, Linan [1 ]
Zhang, Zaixi [1 ]
Zhang, Kai [1 ]
Sha, Yunhao [1 ]
Affiliations
[1] Univ Sci & Technol China, State Key Lab Cognit Intelligence, Hefei, Peoples R China
[2] Hefei Comprehens Natl Sci Ctr, Inst Artificial Intelligence, Hefei, Peoples R China
Source
MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES - RESEARCH TRACK, PT VI, ECML PKDD 2024 | 2024, Vol. 14946
Keywords
Explainability; Graph Neural Network; Meta Learning
DOI
10.1007/978-3-031-70365-2_7
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Recent advancements in Graph Neural Networks (GNNs) have spurred an upsurge of research dedicated to enhancing the explainability of GNNs, particularly in critical domains such as medicine. A promising approach is the self-explaining method, which outputs explanations along with predictions. However, existing self-explaining models require a large amount of training data, rendering them inapplicable in few-shot scenarios. To address this challenge, in this paper, we propose a Meta-learned Self-Explaining GNN (MSE-GNN), a novel framework that generates explanations to support predictions in few-shot settings. MSE-GNN adopts a two-stage self-explaining structure consisting of an explainer and a predictor. Specifically, the explainer first imitates the human attention mechanism to select the explanation subgraph, whereby attention is naturally paid to regions containing important characteristics. Subsequently, the predictor mimics the human decision-making process, making predictions based on the generated explanation. Moreover, with a novel meta-training process and a mechanism designed to exploit task information, MSE-GNN achieves remarkable performance on new few-shot tasks. Extensive experimental results on four datasets demonstrate that MSE-GNN achieves superior performance on prediction tasks while generating higher-quality explanations than existing methods. The code is publicly available at https://github.com/jypeng28/MSE-GNN.
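For intuition, below is a minimal sketch of the two-stage explainer-predictor structure the abstract describes, written in plain PyTorch. It is not the authors' released implementation (see the GitHub link above for that); all class and variable names are illustrative, and the meta-training loop for few-shot adaptation is omitted. The explainer produces a soft node mask marking the explanation subgraph; the predictor classifies the graph from the masked features only.

import torch
import torch.nn as nn


class DenseGCNLayer(nn.Module):
    # One graph-convolution step on a dense adjacency matrix:
    # aggregate neighbor features, then apply a linear transform.
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.lin(adj @ x))


class Explainer(nn.Module):
    # Stage 1: score every node; high scores form the explanation subgraph.
    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.gnn = DenseGCNLayer(in_dim, hid_dim)
        self.score = nn.Linear(hid_dim, 1)

    def forward(self, x, adj):
        h = self.gnn(x, adj)
        return torch.sigmoid(self.score(h))  # soft node mask, shape (n, 1)


class Predictor(nn.Module):
    # Stage 2: classify the graph using only the masked (explained) nodes.
    def __init__(self, in_dim: int, hid_dim: int, num_classes: int):
        super().__init__()
        self.gnn = DenseGCNLayer(in_dim, hid_dim)
        self.out = nn.Linear(hid_dim, num_classes)

    def forward(self, x, adj, node_mask):
        h = self.gnn(x * node_mask, adj)  # suppress unselected nodes
        return self.out(h.mean(dim=0))    # mean-pool to a graph embedding


class SelfExplainingGNN(nn.Module):
    # Two-stage self-explaining structure: explanation, then prediction.
    def __init__(self, in_dim=16, hid_dim=32, num_classes=2):
        super().__init__()
        self.explainer = Explainer(in_dim, hid_dim)
        self.predictor = Predictor(in_dim, hid_dim, num_classes)

    def forward(self, x, adj):
        mask = self.explainer(x, adj)
        logits = self.predictor(x, adj, mask)
        return logits, mask  # the prediction and its explanation


if __name__ == "__main__":
    n = 8  # toy graph with 8 nodes and random features/edges
    x = torch.randn(n, 16)
    adj = (torch.rand(n, n) > 0.7).float()
    logits, mask = SelfExplainingGNN()(x, adj)
    print(logits.shape, mask.squeeze(-1))

A full implementation along these lines would also regularize the mask toward sparsity and wrap training in a MAML-style meta-learning loop over few-shot tasks; the sketch only shows how a prediction and its explanation are produced jointly.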
Pages: 109-126
Number of pages: 18
Related Papers
50 in total
  • [1] Graph Neural Networks With Triple Attention for Few-Shot Learning
    Cheng, Hao
    Zhou, Joey Tianyi
    Tay, Wee Peng
    Wen, Bihan
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25: 8225-8239
  • [2] Research Progress of Few-Shot Learning Methods Based on Graph Neural Networks
    Yang J.
    Dong Y.
    Qian J.
    Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2024, 61 (04): 856-876
  • [3] Combining graph neural networks and transformers for few-shot nuclear receptor binding activity prediction
    Torres, Luis H. M.
    Arrais, Joel P.
    Ribeiro, Bernardete
    JOURNAL OF CHEMINFORMATICS, 2024, 16 (01)
  • [4] Adaptive Transfer of Graph Neural Networks for Few-Shot Molecular Property Prediction
    Zhang, Baoquan
    Luo, Chuyao
    Jiang, Hao
    Feng, Shanshan
    Li, Xutao
    Zhang, Bowen
    Ye, Yunming
    IEEE-ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS, 2023, 20 (06): 3863-3875
  • [5] Concept-Oriented Self-Explaining Neural Networks
    Park, Min Sue
    Hwang, Hyung Ju
    NEURAL PROCESSING LETTERS, 2023, 55 (08): 10873-10904
  • [6] Local feature graph neural network for few-shot learning
    Weng P.
    Dong S.
    Ren L.
    Zou K.
    Journal of Ambient Intelligence and Humanized Computing, 2023, 14 (04): 4343-4354
  • [7] An Overview of Deep Neural Networks for Few-Shot Learning
    Zhao, Juan
    Kong, Lili
    Lv, Jiancheng
    BIG DATA MINING AND ANALYTICS, 2025, 8 (01): 145-188
  • [8] A unified transductive and inductive learning framework for Few-Shot Learning using Graph Neural Networks
    Chang, Jie
    Ren, Haodong
    Li, Zuoyong
    Xu, Yinlong
    Lai, Taotao
    APPLIED SOFT COMPUTING, 2025, 173
  • [9] Dongba Painting Few-Shot Classification Based on Graph Neural Network
    Li K.
    Qian W.
    Wang C.
    Xu D.
    Jisuanji Fuzhu Sheji Yu Tuxingxue Xuebao/Journal of Computer-Aided Design and Computer Graphics, 2021, 33 (07): 1073-1083