Long-tailed graph neural networks via graph structure learning for node classification

Cited by: 3
Authors
Lin, Junchao [1 ]
Wan, Yuan [1 ]
Xu, Jingwen [1 ]
Qi, Xingchen [2 ]
Affiliations
[1] Wuhan Univ Technol, Coll Sci, 122 Luoshi Rd, Wuhan 430070, Hubei, Peoples R China
[2] Univ Texas Austin, Dept Elect & Comp Engn, 1616 Guadalupe St,Suite 4-202, Austin, TX 78701 USA
Keywords
Graph neural networks; Graph perturbation; Tail node embedding enhancement; Graph structure learning
DOI
10.1007/s10489-023-04534-3
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Long-tailed methods have gained increasing attention and achieved excellent performance because real-world graphs follow a long-tailed degree distribution, i.e., many small-degree tail nodes have limited structural connectivity. However, real-world graphs are inevitably noisy or incomplete owing to error-prone data acquisition or perturbations, which may violate the assumption of long-tailed methods that the raw graph structure is ideal. To address this issue, we study the impact of graph perturbation on the performance of long-tailed methods and propose a novel GNN-based framework, LTSL-GNN, for graph structure learning and tail node embedding enhancement. LTSL-GNN iteratively learns the graph structure and the tail node embedding enhancement parameters: information-rich head nodes optimize the graph structure through multi-metric learning, and the learned structure in turn enhances the embeddings of the tail nodes. Experimental results on six real-world datasets demonstrate that LTSL-GNN outperforms other state-of-the-art baselines, especially when the graph structure is perturbed.
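The abstract only outlines the mechanism. As a rough, hypothetical illustration (not the authors' implementation), the PyTorch sketch below shows one common way to realize multi-metric graph structure learning with tail-node smoothing: node similarities are computed under several learnable weighted-cosine metrics, averaged into a learned adjacency, and low-degree (tail) nodes are then smoothed over that learned structure. All class, function, and parameter names (MultiMetricGSL, num_metrics, deg_threshold) are assumptions chosen for illustration.

```python
# Hypothetical sketch of multi-metric graph structure learning and
# tail-node embedding enhancement, loosely following the abstract.
# Not the paper's actual LTSL-GNN implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiMetricGSL(nn.Module):
    """Learn a refined adjacency from node embeddings with several
    learnable similarity metrics, then smooth tail-node embeddings."""

    def __init__(self, dim, num_metrics=4):
        super().__init__()
        # One learnable weight vector per metric (weighted-cosine style).
        self.metric_weights = nn.Parameter(torch.ones(num_metrics, dim))

    def learned_adjacency(self, h):
        # h: (N, dim) node embeddings; returns an (N, N) learned adjacency.
        sims = []
        for w in self.metric_weights:
            hw = F.normalize(h * w, dim=-1)   # metric-specific scaling
            sims.append(hw @ hw.t())          # cosine similarity under this metric
        adj = torch.stack(sims).mean(0)       # combine the metrics
        return F.relu(adj)                    # keep non-negative edge weights

    def enhance_tails(self, h, adj, degrees, deg_threshold=5):
        # Mix tail-node embeddings with neighbours aggregated over the
        # *learned* structure; head nodes are left unchanged.
        adj = adj / adj.sum(-1, keepdim=True).clamp(min=1e-8)
        smoothed = adj @ h
        tail_mask = (degrees < deg_threshold).float().unsqueeze(-1)
        return tail_mask * (0.5 * h + 0.5 * smoothed) + (1 - tail_mask) * h

# Example usage on random data:
# gsl = MultiMetricGSL(dim=64)
# h = torch.randn(100, 64); degrees = torch.randint(1, 20, (100,))
# adj = gsl.learned_adjacency(h)
# h_enhanced = gsl.enhance_tails(h, adj, degrees)
```

In an iterative scheme like the one the abstract describes, steps of this kind would alternate: the learned adjacency is re-estimated from the current embeddings, and the tail-node embeddings are re-enhanced using the updated structure.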
Pages: 20206-20222
Page count: 17