Generalized Few-Shot Node Classification With Graph Knowledge Distillation

Cited by: 2
Authors
Wang, Jialong [1]
Zhou, Mengting [1]
Zhang, Shilong [1]
Gong, Zhiguo [1]
Affiliation
[1] Univ Macau, State Key Lab Internet Things Smart City, Taipa 999078, Macao, Peoples R China
Keywords
Adaptation models; Training; Metalearning; Task analysis; Data models; Manifolds; Graph neural networks; Few-shot learning (FSL); graph neural networks (GNNs); knowledge distillation; node classification
DOI
10.1109/TCSS.2024.3382471
Chinese Library Classification
TP3 [Computing Technology, Computer Technology]
Discipline Code
0812
Abstract
Generalized few-shot node classification (GFS-NC) is an important challenge for graph-based algorithms, as it requires identifying novel classes and base classes simultaneously. Although several methods combine metalearning or metric learning with graph neural networks to address the few-shot problem, most of them assume that test samples come only from the novel classes, which is impractical in reality. They also overlook the relationships among classes, which can provide additional information for classifying the novel classes. In this article, we propose a graph-based knowledge distillation network (GraphKD) to extract class relationships and learn better representations for nodes from novel classes in the GFS-NC task. GraphKD consists of two modules: a balanced pretraining module and a class-relation transferring module. Balanced pretraining optimizes the network parameters toward a manifold suitable for subsequent initialization. The class-relation transferring module leverages knowledge distillation, in which a teacher model generates soft labels containing interclass relationships and transfers them to a student model. The student model is optimized to fit both the soft labels and the hard labels concurrently. This relational information helps the student model better understand the similarities and differences between classes, thereby improving its classification performance. In addition, we employ information entropy to identify samples located at the boundary between a base class and a novel class and assign them larger weights in the student model, enhancing its expressive capacity for novel nodes. Our experiments show that the proposed method outperforms state-of-the-art baselines on various few-shot node classification datasets.
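The distillation objective described in the abstract (a teacher's soft labels carrying interclass relations, ground-truth hard labels, and entropy-based upweighting of boundary nodes) can be sketched as follows. This is an illustrative PyTorch sketch under assumed choices of temperature, weighting scheme, and loss balance (alpha), not the paper's exact GraphKD formulation.

    # Hypothetical sketch of a distillation loss combining a soft-label KL term
    # with an entropy-weighted hard-label cross-entropy term. The temperature,
    # the weighting function, and alpha are illustrative assumptions.
    import torch
    import torch.nn.functional as F

    def entropy_weights(student_logits: torch.Tensor) -> torch.Tensor:
        """Weight each node by the entropy of the student's prediction,
        normalized to [0, 1]; high-entropy (boundary) nodes get larger weights."""
        probs = F.softmax(student_logits, dim=-1)
        ent = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
        return ent / torch.log(torch.tensor(float(student_logits.size(-1))))

    def distillation_loss(student_logits, teacher_logits, hard_labels,
                          temperature: float = 2.0, alpha: float = 0.5):
        """Fit the student to the teacher's softened class distribution (soft
        labels) and to the ground-truth labels (hard labels) concurrently."""
        # Soft-label term: match the teacher's tempered probability distribution.
        soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
        log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
        kd = F.kl_div(log_soft_student, soft_teacher,
                      reduction="batchmean") * temperature ** 2

        # Hard-label term: cross-entropy, upweighting high-entropy boundary nodes.
        w = 1.0 + entropy_weights(student_logits.detach())
        ce = F.cross_entropy(student_logits, hard_labels, reduction="none")
        ce = (w * ce).mean()

        return alpha * kd + (1.0 - alpha) * ce

In this sketch, the teacher and student would be two graph neural networks producing node-level logits; only the loss composition is shown.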
Pages: 1-11
Page count: 11