Hierarchical Graph Neural Networks for Few-Shot Learning

Cited by: 120
Authors
Chen, Cen [1 ]
Li, Kenli [2 ]
Wei, Wei [3 ]
Zhou, Joey Tianyi [4 ]
Zeng, Zeng [1 ]
Affiliations
[1] Infocomm Res Inst, Singapore 138632, Singapore
[2] Hunan Univ, Coll Informat Sci & Engn, Changsha 410082, Peoples R China
[3] Xian Univ Technol, Sch Comp Sci & Engn, Xian 710048, Peoples R China
[4] Inst High Performance Comp, Singapore 138632, Singapore
Funding
National Natural Science Foundation of China
Keywords
Cognition; Feature extraction; Graph neural networks; Training; Task analysis; Deep learning; Predictive models; Few-shot learning; graph neural networks; hierarchical structure;
DOI
10.1109/TCSVT.2021.3058098
CLC Classification Number
TM [electrical engineering]; TN [electronic technology, communication technology]
Subject Classification Code
0808; 0809
Abstract
Recent graph neural network (GNN) based methods for few-shot learning (FSL) represent the samples of interest as a fully-connected graph and conduct reasoning on the nodes flatly, which ignores the hierarchical correlations among nodes. However, real-world categories may have hierarchical structures, and for FSL, it is important to extract the distinguishing features of the categories from individual samples. To explore this, we propose a novel hierarchical graph neural network (HGNN) for FSL, which consists of three parts, i.e., bottom-up reasoning, top-down reasoning, and skip connections, to enable the efficient learning of multi-level relationships. For the bottom-up reasoning, we design intra-class k-nearest neighbor pooling (intra-class knnPool) and inter-class knnPool layers to conduct hierarchical learning for both the intra- and inter-class nodes. For the top-down reasoning, we propose to utilize graph unpooling (gUnpool) layers to restore the down-sampled graph to its original size. Skip connections are proposed to fuse multi-level features for the final node classification. The parameters of HGNN are learned by episodic training with the signal of node losses, which aims to train a well-generalizable model for recognizing unseen classes with few labeled data. Experimental results on benchmark datasets demonstrate that HGNN significantly outperforms other state-of-the-art GNN based methods on both transductive and non-transductive FSL tasks. The dataset and the source code can be downloaded online.(1)
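The bottom-up pooling, top-down unpooling, and skip-connection pipeline described in the abstract can be illustrated with a generic sketch. This is not the authors' implementation: class-wise mean pooling stands in for the learned intra-class knnPool, and an assignment-based unpool mirrors gUnpool's restoration of the original graph size. All function names here are hypothetical.

```python
import numpy as np

def intra_class_knn_pool(x, labels):
    """Down-sample: collapse each class's nodes into one prototype node
    by averaging their features (a crude stand-in for the paper's
    learned intra-class knnPool layer)."""
    classes = np.unique(labels)
    prototypes = np.stack([x[labels == c].mean(axis=0) for c in classes])
    assign = np.searchsorted(classes, labels)  # original node -> prototype index
    return prototypes, assign

def g_unpool(prototypes, assign):
    """Restore the down-sampled graph to its original size by copying
    each prototype back to the nodes it summarizes (gUnpool sketch)."""
    return prototypes[assign]

# Toy episode: 4 support nodes, 2 classes, 2-D features.
x = np.array([[0.0, 0.0], [2.0, 0.0], [10.0, 10.0], [12.0, 10.0]])
labels = np.array([0, 0, 1, 1])

prototypes, assign = intra_class_knn_pool(x, labels)
restored = g_unpool(prototypes, assign)   # same shape as x again
fused = x + restored                      # skip connection: multi-level fusion
```

The skip connection in the last line mirrors the abstract's fusion of multi-level features: each node sees both its own embedding and the class-level summary recovered by unpooling.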
Pages: 240-252
Page count: 13