Graph Few-shot Learning with Task-specific Structures

Cited by: 0
Authors
Wang, Song [1 ]
Chen, Chen [1 ]
Li, Jundong [1 ]
Affiliations
[1] Univ Virginia, Charlottesville, VA 22903 USA
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022 | 2022
Funding
U.S. National Science Foundation (NSF);
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Graph few-shot learning is of great importance among various graph learning tasks. Under the few-shot scenario, models are often required to conduct classification given limited labeled samples. Existing graph few-shot learning methods typically leverage Graph Neural Networks (GNNs) and perform classification across a series of meta-tasks. Nevertheless, these methods generally rely on the original graph (i.e., the graph that the meta-task is sampled from) to learn node representations. Consequently, the graph structure used in each meta-task is identical. Since the class sets are different across meta-tasks, node representations should be learned in a task-specific manner to promote classification performance. Therefore, to adaptively learn node representations across meta-tasks, we propose a novel framework that learns a task-specific structure for each meta-task. To handle the variety of nodes across meta-tasks, we extract relevant nodes and learn task-specific structures based on node influence and mutual information. In this way, we can learn node representations with the task-specific structure tailored for each meta-task. We further conduct extensive experiments on five node classification datasets under both single- and multiple-graph settings to validate the superiority of our framework over the state-of-the-art baselines. Our code is provided at https://github.com/SongW-SW/GLITTER.
Pages: 12
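
For intuition, below is a minimal sketch of how a task-specific structure might be built for a single meta-task, in the spirit of the abstract above. It is not the GLITTER implementation: cosine-similarity edge weights stand in for the paper's node-influence and mutual-information criteria, and all function names, shapes, and hyperparameters are hypothetical.

```python
# A minimal sketch (not the GLITTER code): build a task-specific graph for
# one meta-task and propagate node features over it. Cosine similarity is
# used only as a stand-in for the paper's node-influence /
# mutual-information edge weighting; all names are hypothetical.
import torch
import torch.nn.functional as F


def task_specific_adjacency(features: torch.Tensor, top_k: int = 5) -> torch.Tensor:
    """Build a normalized adjacency over the nodes sampled for one meta-task.

    features: (n, d) embeddings of the nodes deemed relevant to this task.
    Only the top_k strongest connections per node are kept, plus self-loops.
    """
    z = F.normalize(features, dim=-1)
    sim = z @ z.t()                                   # (n, n) pairwise similarity
    k = min(top_k, sim.size(0))
    vals, idx = sim.topk(k, dim=1)                    # keep strongest edges per node
    adj = torch.zeros_like(sim).scatter_(1, idx, vals)
    adj = (adj + adj.t()) / 2                         # symmetrize
    adj.fill_diagonal_(1.0)                           # self-loops
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1e-6)
    return adj / deg                                  # row-normalize


def propagate(features: torch.Tensor, adj: torch.Tensor, weight: torch.Tensor) -> torch.Tensor:
    """One graph-convolution-style step on the task-specific structure."""
    return F.relu(adj @ features @ weight)


if __name__ == "__main__":
    torch.manual_seed(0)
    n, d_in, d_out = 15, 32, 16                       # e.g., a 3-way 5-shot support set
    x = torch.randn(n, d_in)                          # node embeddings for this meta-task
    w = torch.randn(d_in, d_out) * 0.1                # toy GNN weight
    h = propagate(x, task_specific_adjacency(x), w)   # task-specific representations
    print(h.shape)                                    # torch.Size([15, 16])
```

In the actual framework described in the abstract, the task-specific edge weights would be derived from node influence and mutual information and learned jointly with the meta-learning objective, rather than fixed by a similarity heuristic as in this toy sketch.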