ScatterSample: Diversified Label Sampling for Data Efficient Graph Neural Network Learning

Cited by: 0
Authors
Dai, Zhenwei [1 ]
Ioannidis, Vasileios [2 ]
Adeshina, Soji [2 ]
Jost, Zak [2 ]
Faloutsos, Christos [3 ]
Karypis, George [2 ]
Affiliations
[1] Rice Univ, Dept Stat, Houston, TX 77251 USA
[2] Amazon Web Serv, Seattle, WA USA
[3] Carnegie Mellon Univ, Dept Comp Sci, Pittsburgh, PA 15213 USA
Source
LEARNING ON GRAPHS CONFERENCE, 2022, Vol. 198
Keywords: none listed
DOI: not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
What target labels are most effective for graph neural network (GNN) training? In some applications where GNNs excel, such as drug design or fraud detection, labeling new instances is expensive. We develop a data-efficient active sampling framework, ScatterSample, to train GNNs in an active learning setting. ScatterSample employs a sampling module termed DiverseUncertainty to collect instances with large uncertainty from different regions of the sample space for labeling. To ensure diversification of the selected nodes, DiverseUncertainty clusters the high-uncertainty nodes and selects a representative node from each cluster. Our ScatterSample algorithm is further supported by rigorous theoretical analysis demonstrating its advantage over standard active sampling methods that simply maximize uncertainty without diversifying the samples. In particular, we show that ScatterSample efficiently reduces the model uncertainty over the whole sample space. Our experiments on five datasets show that ScatterSample significantly outperforms other GNN active learning baselines; specifically, it reduces the sampling cost by up to 50% while achieving the same test accuracy.
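A minimal sketch of the DiverseUncertainty selection step described in the abstract: score unlabeled nodes by uncertainty, keep the most uncertain candidates, cluster them, and label one representative per cluster. This is not the authors' released implementation; the choice of predictive entropy as the uncertainty score, k-means over node embeddings, and the closest-to-centroid selection rule are illustrative assumptions.

import numpy as np
from sklearn.cluster import KMeans

def diverse_uncertainty_select(probs, embeddings, budget, pool_size=1000, seed=0):
    # probs:      (N, C) predicted class probabilities for the unlabeled nodes
    # embeddings: (N, d) node representations, e.g. a GNN hidden layer
    # budget:     number of nodes to send for labeling in this round

    # Uncertainty score: predictive entropy (an assumed choice for illustration).
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)

    # Keep only the most uncertain candidates.
    pool = np.argsort(entropy)[::-1][:pool_size]

    # Cluster the uncertain candidates so the selected nodes come from
    # different regions of the sample space.
    km = KMeans(n_clusters=budget, random_state=seed, n_init=10)
    cluster_ids = km.fit_predict(embeddings[pool])

    # From each cluster, pick the candidate closest to its center as the representative.
    selected = []
    for c in range(budget):
        members = pool[cluster_ids == c]
        if len(members) == 0:
            continue
        dists = np.linalg.norm(embeddings[members] - km.cluster_centers_[c], axis=1)
        selected.append(members[int(np.argmin(dists))])
    return np.array(selected)

In an active learning loop, the returned node indices would be labeled, added to the training set, and the GNN retrained before the next selection round.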
Pages: 15