Multi-Scale Self-Supervised Graph Contrastive Learning With Injective Node Augmentation

Cited by: 4
Authors
Zhang, Haonan [1 ]
Ren, Yuyang [1 ]
Fu, Luoyi [1 ]
Wang, Xinbing [1 ]
Chen, Guihai [1 ]
Zhou, Chenghu [2 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Sch Elect Informat & Elect Engn, Shanghai 200240, Peoples R China
[2] Chinese Acad Sci, Inst Geog Sci & Nat Resources Res, Beijing 100045, Peoples R China
Keywords
Graph contrastive learning; graph representation learning; node augmentation; self-supervised learning;
DOI
10.1109/TKDE.2023.3278463
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Graph Contrastive Learning (GCL) with Graph Neural Networks (GNNs) has emerged as a promising method for learning latent node representations in a self-supervised manner. Most existing GCL methods employ random sampling for graph view augmentation and maximize the agreement of node representations between views. However, random augmentation is likely to produce very similar graph views and may therefore capture incomplete nodal contextual information, weakening the discrimination of node representations. To this end, this paper proposes a novel trainable node-augmentation scheme that is theoretically proven to be injective and uses the subgraph consisting of each node and its neighbors to enhance the distinguishability of the nodal view. Notably, the proposed scheme enriches node representations via multi-scale contrastive training that integrates three levels of training granularity, i.e., subgraph-, graph-, and node-level contextual information. In particular, a subgraph-level objective between the augmented and original node views is constructed to enhance the discrimination of node representations, while graph- and node-level objectives with global and local information from the original graph improve the generalization ability of the representations. Experimental results demonstrate that our framework outperforms existing state-of-the-art baselines and even surpasses several supervised counterparts on four real-world datasets for node classification.
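The abstract describes a weighted combination of subgraph-, graph-, and node-level contrastive objectives. The sketch below illustrates how such a multi-scale loss could be assembled, assuming InfoNCE for the subgraph- and node-level terms and a DGI-style node-summary discriminator for the graph-level term; all function and parameter names (`info_nce`, `multi_scale_loss`, `w_sub`, etc.) are hypothetical, since the paper's exact loss forms are not given in the abstract.

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """InfoNCE loss: row i of z1 and row i of z2 form the positive pair;
    all other rows of z2 act as negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                       # pairwise cosine similarities
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))          # -log softmax of positives

def _sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def multi_scale_loss(h, h_aug, h_sub, h_corrupt, summary,
                     w_sub=1.0, w_graph=1.0, w_node=1.0):
    """Combine the three granularities described in the abstract.

    h         : original node representations, shape (n, d)
    h_aug     : representations of the augmented node views, shape (n, d)
    h_sub     : embeddings of each node's ego-subgraph, shape (n, d)
    h_corrupt : node representations from a corrupted graph (negatives)
    summary   : global graph summary vector, shape (d,)
    """
    # subgraph level: augmented node view vs. its ego-subgraph embedding
    l_sub = info_nce(h_aug, h_sub)
    # node level: original vs. augmented node representations
    l_node = info_nce(h, h_aug)
    # graph level (DGI-style stand-in): score real node-summary pairs
    # high and corrupted pairs low via binary cross-entropy
    pos = _sigmoid(h @ summary)
    neg = _sigmoid(h_corrupt @ summary)
    l_graph = -(np.log(pos) + np.log(1.0 - neg)).mean()
    return w_sub * l_sub + w_graph * l_graph + w_node * l_node
```

In practice the three weights would be tuned per dataset, and the embeddings would come from a GNN encoder applied to the original, augmented, and corrupted graphs; random arrays stand in for them here.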
Pages: 261-274 (14 pages)