Hierarchical Graph Neural Networks for Few-Shot Learning

Cited by: 113
Authors
Chen, Cen [1 ]
Li, Kenli [2 ]
Wei, Wei [3 ]
Zhou, Joey Tianyi [4 ]
Zeng, Zeng [1 ]
Affiliations
[1] Infocomm Res Inst, Singapore 138632, Singapore
[2] Hunan Univ, Coll Informat Sci & Engn, Changsha 410082, Peoples R China
[3] Xian Univ Technol, Sch Comp Sci & Engn, Xian 710048, Peoples R China
[4] Inst High Performance Comp, Singapore 138632, Singapore
Funding
National Natural Science Foundation of China;
Keywords
Cognition; Feature extraction; Graph neural networks; Training; Task analysis; Deep learning; Predictive models; Few-shot learning; graph neural networks; hierarchical structure;
DOI
10.1109/TCSVT.2021.3058098
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Discipline codes
0808; 0809;
Abstract
Recent graph neural network (GNN) based methods for few-shot learning (FSL) represent the samples of interest as a fully-connected graph and conduct reasoning on the nodes flatly, which ignores the hierarchical correlations among nodes. However, real-world categories may have hierarchical structures, and for FSL, it is important to extract the distinguishing features of the categories from individual samples. To explore this, we propose a novel hierarchical graph neural network (HGNN) for FSL, which consists of three parts, i.e., bottom-up reasoning, top-down reasoning, and skip connections, to enable the efficient learning of multi-level relationships. For the bottom-up reasoning, we design intra-class k-nearest neighbor pooling (intra-class knnPool) and inter-class knnPool layers to conduct hierarchical learning for both the intra- and inter-class nodes. For the top-down reasoning, we propose to utilize graph unpooling (gUnpool) layers to restore the down-sampled graph to its original size. Skip connections are proposed to fuse multi-level features for the final node classification. The parameters of HGNN are learned by episodic training with the signal of node losses, which aims to train a well-generalizable model for recognizing unseen classes with few labeled data. Experimental results on benchmark datasets have demonstrated that HGNN significantly outperforms other state-of-the-art GNN based methods, for both transductive and non-transductive FSL tasks. The dataset as well as the source code can be downloaded online.(1)
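The pooling/unpooling pair described in the abstract can be illustrated with a minimal sketch. The code below is a hypothetical simplification, not the paper's implementation: `knn_pool` smooths each node's feature over its k nearest same-class neighbors and keeps one coarse node per class (the bottom-up step), while `g_unpool` broadcasts each coarse node's feature back to the original node positions recorded during pooling (the top-down step). Function names and the aggregation choices are assumptions for illustration.

```python
import numpy as np

def knn_pool(features, labels, k):
    """Sketch of intra-class kNN pooling: average each node's feature with
    its k nearest same-class neighbors, then keep one pooled node per class."""
    pooled, idx = [], []
    for c in np.unique(labels):
        mask = np.where(labels == c)[0]
        X = features[mask]                               # same-class node features
        d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
        order = np.argsort(d, axis=1)[:, :k]             # k nearest (incl. self)
        smoothed = X[order].mean(axis=1)                 # aggregate over neighbors
        pooled.append(smoothed.mean(axis=0))             # one coarse node per class
        idx.append(mask)                                 # remember positions for unpooling
    return np.stack(pooled), idx

def g_unpool(pooled, idx, n_nodes):
    """Sketch of gUnpool: restore the down-sampled graph to its original size
    by placing each coarse feature at the node positions saved during pooling."""
    out = np.zeros((n_nodes, pooled.shape[1]))
    for vec, mask in zip(pooled, idx):
        out[mask] = vec
    return out
```

In the paper's architecture the unpooled features would then be fused with the pre-pooling features via skip connections before the final node classification; here the index lists `idx` play the role of the pooling record that makes exact size restoration possible.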
Pages: 240-252
Number of pages: 13