Adaptively Denoising Graph Neural Networks for Knowledge Distillation

Cited: 0
|
Authors
Guo, Yuxin [1 ]
Yang, Cheng [1 ]
Shi, Chuan [1 ]
Tu, Ke [2 ]
Wu, Zhengwei [2 ]
Zhang, Zhiqiang [2 ]
Zhou, Jun [2 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Beijing, Peoples R China
[2] Ant Financial, Hangzhou, Peoples R China
Source
MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES-RESEARCH TRACK AND DEMO TRACK, PT VIII, ECML PKDD 2024 | 2024 / Vol. 14948
Funding
National Natural Science Foundation of China;
Keywords
Graph Neural Networks; Knowledge Distillation;
DOI
10.1007/978-3-031-70371-3_15
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph Neural Networks (GNNs) have excelled in various graph-based applications. Recently, knowledge distillation (KD) has provided a new approach to further boost GNN performance. However, in the KD process, the GNN student may encounter noise while learning from the GNN teacher and the input graph. GNN teachers may carry noise, as deep models inevitably introduce noise during training, leading to error propagation in GNN students. Besides, noisy structures in the input graph may also disrupt information during message passing in GNNs. Hence, we propose DKDG to adaptively remove noise in the GNN teacher and the graph structure for better distillation. DKDG comprises two modules: (1) a teacher knowledge denoising module, which separates the GNN teacher's knowledge into noise knowledge and label knowledge, and removes the GNN student's parameters that fit the noise knowledge; and (2) a graph structure denoising module, designed to enhance the discrimination of node representations. Specifically, we propose a discrimination-preserving objective based on total variation loss and update edge weights between adjacent nodes to minimize this objective. The two modules are integrated through the GNN's forward propagation and trained iteratively. Experiments on five benchmark datasets and three GNNs demonstrate that the GNN student distilled by DKDG achieves a 1.86% relative improvement over the best baseline among recent state-of-the-art GNN-based KD methods.
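The graph structure denoising idea in the abstract can be illustrated with a small sketch. The code below is an assumption-laden toy, not the paper's exact formulation: it computes a weighted total variation of node features over edges, and reweights edges with a softmax over negated squared feature distances so that edges joining dissimilar nodes (likely noisy structure) are down-weighted. The function names, the softmax scheme, and the temperature `tau` are all hypothetical choices for illustration.

```python
import numpy as np

def total_variation(x, edges, w):
    """Weighted total variation of node features over a graph.

    x:     (n, d) node feature matrix
    edges: list of (i, j) node-index pairs
    w:     (m,) nonnegative edge weights, one per edge
    Returns sum_e w_e * ||x_i - x_j||^2.
    """
    diffs = np.array([np.sum((x[i] - x[j]) ** 2) for i, j in edges])
    return float(np.sum(w * diffs))

def reweight_edges(x, edges, tau=1.0):
    """Hypothetical denoising step: softmax over negated squared
    feature distances, so dissimilar (noisy) edges get small weight.
    tau controls how sharply dissimilar edges are suppressed.
    """
    diffs = np.array([np.sum((x[i] - x[j]) ** 2) for i, j in edges])
    logits = -diffs / tau
    logits -= logits.max()          # numerical stability
    w = np.exp(logits)
    return w / w.sum()              # weights sum to 1
```

Reweighting this way necessarily lowers the total variation relative to uniform weights whenever edge distances differ, which matches the abstract's goal of minimizing the discrimination-preserving objective by adjusting edge weights rather than node features.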
Pages: 253 / 269
Page count: 17
Related Papers
50 records total
  • [21] Knowledge-Aware Dual-Channel Graph Neural Networks For Denoising Recommendation
    Zhang, Hanwen
    Wang, Li-e
    Sun, Zhigang
    Li, Xianxian
    COMPUTER JOURNAL, 2023, 67 (05): : 1607 - 1618
  • [23] A Unified View on Graph Neural Networks as Graph Signal Denoising
    Ma, Yao
    Liu, Xiaorui
    Zhao, Tong
    Liu, Yozen
    Tang, Jiliang
    Shah, Neil
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 1202 - 1211
  • [24] Online cross-layer knowledge distillation on graph neural networks with deep supervision
    Guo, Jiongyu
    Chen, Defang
    Wang, Can
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (30): : 22359 - 22374
  • [25] The Devil is in the Data: Learning Fair Graph Neural Networks via Partial Knowledge Distillation
    Zhu, Yuchang
    Li, Jintang
    Chen, Liang
    Zheng, Zibin
    PROCEEDINGS OF THE 17TH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING, WSDM 2024, 2024, : 1012 - 1021
  • [26] Double Wins: Boosting Accuracy and Efficiency of Graph Neural Networks by Reliable Knowledge Distillation
    Tan, Qiaoyu
    Zhu, Daochen
    Liu, Ninghao
    Choi, Soo-Hyun
    Li, Li
    Chen, Rui
    Hu, Xia
    23RD IEEE INTERNATIONAL CONFERENCE ON DATA MINING, ICDM 2023, 2023, : 1343 - 1348
  • [28] Protein Engineering with Lightweight Graph Denoising Neural Networks
    Zhou, Bingxin
    Zheng, Lirong
    Wu, Banghao
    Tan, Yang
    Lv, Outongyi
    Yi, Kai
    Fan, Guisheng
    Hong, Liang
    JOURNAL OF CHEMICAL INFORMATION AND MODELING, 2024, 64 (09) : 3650 - 3661
  • [29] IMAGE DENOISING WITH GRAPH-CONVOLUTIONAL NEURAL NETWORKS
    Valsesia, Diego
    Fracastoro, Giulia
    Magli, Enrico
    2019 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2019, : 2399 - 2403
  • [30] Fine-Grained Learning Behavior-Oriented Knowledge Distillation for Graph Neural Networks
    Liu, Kang
    Huang, Zhenhua
    Wang, Chang-Dong
    Gao, Beibei
    Chen, Yunwen
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024,