Graph Autoencoder with Preserving Node Attribute Similarity

Cited by: 9
Authors
Lin, Mugang [1 ,2 ]
Wen, Kunhui [1 ]
Zhu, Xuanying [1 ]
Zhao, Huihuang [1 ,2 ]
Sun, Xianfang [3 ]
Affiliations
[1] Hengyang Normal Univ, Coll Comp Sci & Technol, Hengyang 421002, Peoples R China
[2] Hunan Prov Key Lab Intelligent Informat Proc & App, Hengyang 421002, Peoples R China
[3] Cardiff Univ, Sch Comp Sci & Informat, Cardiff CF24 4AG, Wales
Funding
National Natural Science Foundation of China;
Keywords
graph representation learning; graph autoencoder; unsupervised learning; k-nearest neighbor;
DOI
10.3390/e25040567
Chinese Library Classification
O4 [Physics];
Discipline Classification Code
0702;
Abstract
The graph autoencoder (GAE) is a powerful tool for unsupervised representation learning on graph data. However, most existing GAE-based methods focus on preserving the graph topological structure by reconstructing the adjacency matrix while ignoring the node attribute information. As a result, the node attributes are not fully exploited, which weakens the GAE's ability to learn high-quality representations. To address this issue, this paper proposes a novel GAE model that preserves node attribute similarity. The structural graph and the attribute neighbor graph, which is constructed from the attribute similarity between nodes, are integrated as the encoder input using an effective fusion strategy. In the encoder, the attributes of the nodes can be aggregated both in their structural neighborhood and, by attribute similarity, in their attribute neighborhood. This allows the structural and attribute information to be fused into the node representations through a single shared encoder. In the decoder module, the adjacency matrix and the node attribute similarity matrix are reconstructed using dual decoders. The cross-entropy loss of the reconstructed adjacency matrix and the mean-squared error loss of the reconstructed node attribute similarity matrix are used to update the model parameters and ensure that the node representations preserve the original structural and attribute similarity information. Extensive experiments on three citation networks show that the proposed method outperforms state-of-the-art algorithms in link prediction and node clustering tasks.
Pages: 22
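To make the pipeline described in the abstract concrete, the following is a minimal sketch, not the authors' code: it assumes cosine similarity for building the k-nearest-neighbor attribute graph, a simple weighted-sum fusion of the structural and attribute graphs, a shared two-layer GCN encoder, and inner-product dual decoders trained with the cross-entropy and mean-squared-error losses mentioned in the abstract. All function names, dimensions, and hyperparameters are illustrative assumptions.

# Minimal sketch (not the authors' code) of a GAE that preserves node attribute
# similarity. Assumptions: cosine similarity for the k-NN attribute graph, a
# weighted-sum fusion of the two graphs, a shared two-layer GCN encoder, and
# inner-product dual decoders. k, the fusion weight, and all dims are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

def knn_attribute_graph(X, k=5):
    """Build a k-nearest-neighbor graph from cosine similarity of node attributes."""
    S = F.cosine_similarity(X.unsqueeze(1), X.unsqueeze(0), dim=-1)  # (N, N) similarity
    S.fill_diagonal_(0.0)
    idx = S.topk(k, dim=1).indices                      # k most similar nodes per row
    A_knn = torch.zeros_like(S)
    A_knn.scatter_(1, idx, 1.0)
    return ((A_knn + A_knn.t()) > 0).float(), S         # symmetrized 0/1 graph, similarity matrix

def normalize_adj(A):
    """Symmetric normalization D^{-1/2}(A + I)D^{-1/2} used by GCN layers."""
    A_hat = A + torch.eye(A.size(0))
    D_inv_sqrt = torch.diag(A_hat.sum(1).pow(-0.5))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

class DualDecoderGAE(nn.Module):
    """Shared GCN encoder plus dual decoders (adjacency and attribute similarity)."""
    def __init__(self, in_dim, hid_dim=64, z_dim=16):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim, bias=False)
        self.w2 = nn.Linear(hid_dim, z_dim, bias=False)

    def encode(self, A_norm, X):
        h = F.relu(A_norm @ self.w1(X))
        return A_norm @ self.w2(h)                      # node embeddings Z

    def forward(self, A_norm, X):
        Z = self.encode(A_norm, X)
        A_rec = torch.sigmoid(Z @ Z.t())                # adjacency decoder
        S_rec = Z @ Z.t()                               # attribute-similarity decoder (assumed form)
        return Z, A_rec, S_rec

# Toy data: 6 nodes with 8-dimensional attributes on a small ring graph.
torch.manual_seed(0)
N, F_in = 6, 8
X = torch.rand(N, F_in)
A = torch.zeros(N, N)
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0

A_knn, S = knn_attribute_graph(X, k=2)
A_fused = normalize_adj(0.5 * A + 0.5 * A_knn)          # assumed fusion: simple weighted sum

model = DualDecoderGAE(F_in)
opt = torch.optim.Adam(model.parameters(), lr=0.01)
for epoch in range(50):
    opt.zero_grad()
    Z, A_rec, S_rec = model(A_fused, X)
    # Cross-entropy on the reconstructed adjacency + MSE on the reconstructed similarity,
    # matching the two losses named in the abstract.
    loss = F.binary_cross_entropy(A_rec, A) + F.mse_loss(S_rec, S)
    loss.backward()
    opt.step()
print("final loss:", loss.item())

The main places where the actual model may differ from this sketch are the choice of k, the fusion strategy for combining the two graphs, and the form of the attribute-similarity decoder; the sketch only fixes them to the simplest plausible options.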