Signed Graph Neural Network with Latent Groups

Cited by: 17
Authors
Liu, Haoxin [1 ]
Zhang, Ziwei [1 ]
Cui, Peng [1 ]
Zhang, Yafeng [2 ]
Cui, Qiang [2 ]
Liu, Jiashuo [1 ]
Zhu, Wenwu [1 ]
Affiliations
[1] Tsinghua Univ, Beijing, Peoples R China
[2] Meituan, Beijing, Peoples R China
Source
KDD '21: PROCEEDINGS OF THE 27TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING | 2021
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
Signed Graphs; Graph Neural Networks; Balance Theory;
DOI
10.1145/3447548.3467355
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Signed graph representation learning is an effective approach to analyzing the complex patterns in real-world signed graphs, where positive and negative links co-exist. Most previous signed graph representation learning methods rely on balance theory, a classic social theory originating in psychology, as their core assumption. However, since balance theory has been shown to be equivalent to the simple assumption that nodes can be divided into two conflicting groups, it fails to model the structure of real-world signed graphs. To solve this problem, we propose the Group Signed Graph Neural Network (GS-GNN) model for signed graph representation learning beyond the balance-theory assumption. GS-GNN adopts a dual-GNN architecture consisting of a global module and a local module. In the global module, we adopt a more general assumption that nodes can be divided into multiple latent groups and that the groups can have arbitrary relations, and we propose a novel prototype-based GNN to learn node representations under this assumption. In the local module, to give the model enough flexibility in modeling other factors, we make no prior assumptions: we treat positive links and negative links as two independent relations and adopt a relational GNN to learn node representations. The two modules complement each other, and the concatenation of their outputs is fed into downstream tasks. Extensive experimental results demonstrate the effectiveness of our GS-GNN model on both synthetic and real-world signed graphs, where it greatly and consistently outperforms all baselines and achieves new state-of-the-art results. Our implementation is available in PyTorch.
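The abstract describes the dual-module design only in prose. Below is a minimal PyTorch sketch of how such an architecture could be wired together; it is not the authors' implementation. The module and parameter names (GlobalPrototypeModule, LocalRelationalModule, DualSignedGNN, num_groups), the use of dense adjacency matrices, and the specific message-passing rules are all assumptions made purely for illustration of the two ideas named in the abstract: a prototype-based global module over latent groups with arbitrary group relations, and a local module treating positive and negative links as two independent relations.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GlobalPrototypeModule(nn.Module):
    """Hypothetical global module: each node is softly assigned to K latent
    group prototypes, and propagated assignments are re-weighted by a learned
    group-to-group relation matrix (arbitrary relations between groups)."""

    def __init__(self, in_dim, hid_dim, num_groups):
        super().__init__()
        self.prototypes = nn.Parameter(torch.randn(num_groups, hid_dim))
        # Learned pairwise relations between latent groups (not restricted
        # to the two-conflicting-groups structure of balance theory).
        self.group_rel = nn.Parameter(torch.randn(num_groups, num_groups))
        self.proj = nn.Linear(in_dim, hid_dim)

    def forward(self, x, adj_signed):
        h = self.proj(x)                                       # (N, d)
        # Soft assignment of each node to the K latent groups.
        assign = F.softmax(h @ self.prototypes.t(), dim=-1)    # (N, K)
        # Propagate group memberships over the signed adjacency and
        # re-weight them by the group-relation matrix.
        group_msg = adj_signed @ assign @ self.group_rel       # (N, K)
        return torch.cat([h, group_msg @ self.prototypes], dim=-1)


class LocalRelationalModule(nn.Module):
    """Hypothetical local module: positive and negative links are treated as
    two independent relations with separate weights (an R-GCN-style layer),
    with no prior assumption about their interaction."""

    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.w_pos = nn.Linear(in_dim, hid_dim)
        self.w_neg = nn.Linear(in_dim, hid_dim)
        self.w_self = nn.Linear(in_dim, hid_dim)

    def forward(self, x, adj_pos, adj_neg):
        return F.relu(self.w_self(x)
                      + adj_pos @ self.w_pos(x)
                      + adj_neg @ self.w_neg(x))


class DualSignedGNN(nn.Module):
    """Concatenates the global and local representations, as the abstract
    describes, before feeding them to a downstream task head."""

    def __init__(self, in_dim, hid_dim, num_groups):
        super().__init__()
        self.global_mod = GlobalPrototypeModule(in_dim, hid_dim, num_groups)
        self.local_mod = LocalRelationalModule(in_dim, hid_dim)

    def forward(self, x, adj_pos, adj_neg):
        z_global = self.global_mod(x, adj_pos - adj_neg)   # signed adjacency
        z_local = self.local_mod(x, adj_pos, adj_neg)
        return torch.cat([z_global, z_local], dim=-1)


# Toy usage with random features and dense positive/negative adjacency.
N, in_dim = 6, 8
x = torch.randn(N, in_dim)
adj_pos = (torch.rand(N, N) > 0.7).float()
adj_neg = (torch.rand(N, N) > 0.8).float()
model = DualSignedGNN(in_dim, hid_dim=16, num_groups=4)
z = model(x, adj_pos, adj_neg)   # (N, 48): global (32) + local (16)
print(z.shape)
```

The sketch only illustrates the division of labor between the two modules and how their outputs are concatenated; the paper's actual propagation rules, losses, and training procedure are in the authors' PyTorch release.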
Pages: 1066-1075
Number of pages: 10