Balancing Augmentation With Edge Utility Filter for Signed Graph Neural Networks

Cited by: 0
Authors
Chen, Ke-Jia [1 ,2 ,3 ]
Ji, Yaming [2 ]
Mu, Wenhui [2 ]
Qu, Youran [2 ]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Jiangsu Key Lab Big Data Secur & Intelligent Proc, Nanjing 210023, Peoples R China
[2] Nanjing Univ Posts & Telecommun, Sch Comp Sci, Jiangsu Key Lab Big Data Secur & Intelligent Proc, Nanjing 210023, Peoples R China
[3] Nanjing Univ, State Key Lab Novel Software Technol, Nanjing 210093, Peoples R China
Source
IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING | 2024, Vol. 11, No. 6
Funding
National Natural Science Foundation of China;
Keywords
Graph neural networks; Semantics; Perturbation methods; Noise; Data augmentation; Vectors; Telecommunications; Regulators; Filtering theory; Deep learning; Graph augmentation; graph embedding; link prediction; signed network; unbalanced structure;
DOI
10.1109/TNSE.2024.3475379
Chinese Library Classification
T [Industrial Technology];
Subject Classification Code
08;
Abstract
Many real-world networks are signed networks containing positive and negative edges. The presence of negative edges poses two challenges for signed graph neural networks. One is semantic imbalance: negative edges are harder to obtain, even though they may carry more useful information. The other is structural unbalance, e.g., unbalanced triangles, which indicate incompatible relationships among nodes. This paper proposes a balancing augmentation to address both challenges. First, the utility of each negative edge is determined by counting its occurrences in balanced structures. Second, the original signed graph is selectively augmented using (1) an edge perturbation regulator, which balances the numbers of positive and negative edges and determines the ratio of perturbed edges, and (2) an edge utility filter, which removes negative edges with low utility. Finally, a signed graph neural network is trained on the augmented graph. A theoretical analysis proves the effectiveness of each module, and experiments on five real-world datasets demonstrate that the proposed method significantly improves the performance of three backbone models on the link sign prediction task, with gains of up to 22.8% in AUC and 19.7% in F1 score.
Pages: 5903-5915
Number of pages: 13
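
To illustrate the edge utility idea described in the abstract, the sketch below computes, for each negative edge, how many balanced triangles contain it and drops negative edges whose count falls below a threshold. This is only a minimal interpretation under stated assumptions: the graph is a NetworkX graph with a `sign` attribute (+1 or -1) on every edge, the threshold `tau` is a hypothetical parameter, and the paper's exact utility definition and its edge perturbation regulator are not reproduced here.

```python
# Minimal sketch of a balanced-triangle utility score and an edge utility filter.
# Assumptions: each edge carries a "sign" attribute in {+1, -1}; the threshold
# `tau` and the function names are illustrative, not the paper's exact method.
import networkx as nx


def negative_edge_utility(G):
    """Count, for every negative edge, the balanced triangles that contain it.

    A triangle is balanced when the product of its three edge signs is +1.
    """
    utility = {}
    for u, v, d in G.edges(data=True):
        if d["sign"] != -1:
            continue
        count = 0
        for w in set(G[u]) & set(G[v]):  # common neighbours close a triangle
            if d["sign"] * G[u][w]["sign"] * G[v][w]["sign"] == 1:
                count += 1                # balanced triangle found
        utility[(u, v)] = count
    return utility


def edge_utility_filter(G, tau=1):
    """Return a copy of G without negative edges whose utility is below `tau`."""
    H = G.copy()
    util = negative_edge_utility(G)
    H.remove_edges_from([e for e, c in util.items() if c < tau])
    return H


# Toy usage on a small signed graph.
G = nx.Graph()
G.add_edge(0, 1, sign=1)
G.add_edge(1, 2, sign=-1)
G.add_edge(0, 2, sign=-1)   # triangle 0-1-2 is balanced: (+1)(-1)(-1) = +1
G.add_edge(2, 3, sign=-1)   # negative edge in no triangle, utility 0
print(edge_utility_filter(G, tau=1).edges())  # edge (2, 3) is filtered out
```

In this reading, the filter keeps negative edges that participate in balanced structures (and thus carry consistent semantic signal) and discards isolated, low-utility ones before the signed GNN is trained on the augmented graph.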