Alignahead: Online Cross-Layer Knowledge Extraction on Graph Neural Networks

Cited by: 2
Authors
Guo, Jiongyu [1 ]
Chen, Defang [1 ]
Wang, Can [1 ]
Affiliations
[1] Zhejiang Univ, ZJU Bangsun Joint Res Ctr, Shanghai Inst Adv Study, Hangzhou, Zhejiang, Peoples R China
Source
2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | 2022
Funding
National Natural Science Foundation of China;
Keywords
Online Knowledge Distillation; Graph Neural Networks; Cross-Layer Alignment;
DOI
10.1109/IJCNN55064.2022.9892159
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Existing knowledge distillation methods on graph neural networks (GNNs) are almost all offline, where the student model extracts knowledge from a powerful teacher model to improve its performance. However, a pre-trained teacher model is not always accessible due to training cost, privacy, etc. In this paper, we propose a novel online knowledge distillation framework to resolve this problem. Specifically, each student GNN model learns the extracted local structure from another simultaneously trained counterpart in an alternating training procedure. We further develop a cross-layer distillation strategy that aligns one layer of a student ahead with a layer at a different depth of the other student model, which theoretically spreads the structure information over all layers. Experimental results on five datasets, including PPI, Coauthor-CS/Physics and Amazon-Computer/Photo, demonstrate that student performance is consistently boosted in our collaborative training framework without the supervision of a pre-trained teacher model. In addition, we find that our alignahead technique accelerates model convergence and that its effectiveness generally improves as the number of students in training increases. Code is available at: https://github.com/GuoJY-eatsTG/Alignahead
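The abstract describes the alignahead idea only at a high level. Below is a minimal, hedged sketch of how such a cross-layer alignment term could look in PyTorch; it is not the authors' implementation (see the linked repository for that). The cosine-similarity-based "local structure", the symmetric KL penalty, the fixed one-layer-ahead offset between the two students, and the helper names local_structure and alignahead_loss are all illustrative assumptions.

    # Hedged sketch of a cross-layer "alignahead" alignment term between two
    # jointly trained student GNNs. All names and design choices below are
    # illustrative assumptions, not the paper's actual implementation.
    import torch
    import torch.nn.functional as F

    def local_structure(h, edge_index):
        # h: (num_nodes, dim) node embeddings from one GNN layer.
        # edge_index: (2, num_edges) LongTensor of (source, target) node indices.
        src, dst = edge_index
        sim = F.cosine_similarity(h[src], h[dst], dim=-1)    # per-edge similarity
        exp = (sim - sim.max()).exp()                         # stable softmax numerator
        denom = torch.zeros(h.size(0), device=h.device).scatter_add_(0, src, exp)
        return exp / (denom[src] + 1e-12)                     # softmax over each node's edges

    def kl(p, q, num_nodes):
        # Mean per-node KL divergence between two per-edge distributions.
        return (p * (p.clamp_min(1e-12).log() - q.clamp_min(1e-12).log())).sum() / num_nodes

    def alignahead_loss(feats_a, feats_b, edge_index):
        # feats_a / feats_b: lists of per-layer node embeddings of students A and B.
        # Layer l of student A is aligned with layer l+1 of student B ("ahead"),
        # so structure information can propagate across depths during joint training.
        n = feats_a[0].size(0)
        loss = 0.0
        for l in range(len(feats_a) - 1):
            p = local_structure(feats_a[l], edge_index)
            q = local_structure(feats_b[l + 1], edge_index)
            loss = loss + kl(p, q, n) + kl(q, p, n)           # symmetric penalty (assumption)
        return loss

    # Toy usage: two 3-layer students on a 4-node ring graph.
    feats_a = [torch.randn(4, 16) for _ in range(3)]
    feats_b = [torch.randn(4, 16) for _ in range(3)]
    edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
    print(alignahead_loss(feats_a, feats_b, edge_index))

In an actual training loop this term would be added to each student's task loss, with the roles of the two students alternated, as described in the abstract.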
Pages: 8
Related Papers
50 in total
  • [1] Guo, Jiongyu; Chen, Defang; Wang, Can. Online cross-layer knowledge distillation on graph neural networks with deep supervision. Neural Computing and Applications, 2023, 35(30): 22359-22374.
  • [2] Wang, Can; Wang, Zhe; Chen, Defang; Zhou, Sheng; Feng, Yan; Chen, Chun. Online adversarial knowledge distillation for graph neural networks. Expert Systems with Applications, 2024, 237.
  • [3] Su, Tongtong; Liang, Qiyu; Zhang, Jinsong; Yu, Zhaoyang; Xu, Ziyue; Wang, Gang; Liu, Xiaoguang. Deep Cross-Layer Collaborative Learning Network for Online Knowledge Distillation. IEEE Transactions on Circuits and Systems for Video Technology, 2023, 33(5): 2075-2087.
  • [4] Zhao, Huiying; Li, Hongwu; Wu, Bin; Liu, Ruiqi; Xu, Lexi; Huang, Bingming; Xie, Zhipu; Cheng, Xinzhou. Cross-Layer Alarm Association Rules Discovery of Cloud-Network based on Knowledge Graph. 20th International Wireless Communications & Mobile Computing Conference (IWCMC), 2024: 400-405.
  • [5] Acharya, Deepak Bhaskar; Zhang, Huaming. Feature Selection and Extraction for Graph Neural Networks. ACMSE 2020: Proceedings of the 2020 ACM Southeast Conference, 2020: 252-255.
  • [6] Joshi, Chaitanya K.; Liu, Fayao; Xun, Xu; Lin, Jie; Foo, Chuan Sheng. On Representation Knowledge Distillation for Graph Neural Networks. IEEE Transactions on Neural Networks and Learning Systems, 2024, 35(4): 4656-4667.
  • [7] Dong, Yushun; Zhang, Binchi; Yuan, Yiling; Zou, Na; Wang, Qi; Li, Jundong. RELIANT: Fair Knowledge Distillation for Graph Neural Networks. Proceedings of the 2023 SIAM International Conference on Data Mining (SDM), 2023: 154+.
  • [8] Tang, Bisheng; Chen, Xiaojun; Wang, Dakui; Zhao, Zhendong. KAFNN: A Knowledge Augmentation Framework to Graph Neural Networks. 2022 International Joint Conference on Neural Networks (IJCNN), 2022.
  • [9] Ye, Zi; Kumar, Yogan Jaya; Sing, Goh Ong; Song, Fengyan; Wang, Junsong. A Comprehensive Survey of Graph Neural Networks for Knowledge Graphs. IEEE Access, 2022, 10: 75729-75741.