Asymmetric Graph Contrastive Learning

Cited by: 1
Authors
Chang, Xinglong [1 ,2 ]
Wang, Jianrong [1 ,3 ]
Guo, Rui [3 ]
Wang, Yingkui [4 ]
Li, Weihao [5 ]
Affiliations
[1] Tianjin Univ, Sch New Media & Commun, Tianjin 300350, Peoples R China
[2] Qijia Youdao Network Technol Beijing Co Ltd, Beijing 100012, Peoples R China
[3] Tianjin Univ, Coll Intelligence & Comp, Tianjin 300350, Peoples R China
[4] Tianjin Renai Coll, Dept Comp Sci & Technol, Tianjin 301636, Peoples R China
[5] Data61 CSIRO, Black Mt Labs, Canberra, ACT 2601, Australia
Funding
National Natural Science Foundation of China;
Keywords
contrastive learning; graph neural networks; graph representation learning; neural network
DOI
10.3390/math11214505
Chinese Library Classification
O1 [Mathematics]
Discipline codes
0701; 070101
Abstract
Learning effective graph representations in an unsupervised manner is a popular research topic in graph data analysis. Recently, contrastive learning has shown success in unsupervised graph representation learning. However, avoiding collapsed solutions remains a critical challenge for contrastive learning methods. In this paper, a simple method is proposed to solve this problem for graph representation learning, differing from commonly used existing techniques (such as negative samples or a predictor network). The proposed model mainly relies on an asymmetric design consisting of two graph neural networks (GNNs) of unequal depth that learn node representations from two augmented views, and it defines the contrastive loss using only positive sample pairs. This simple method has lower computational and memory complexity than existing methods. Furthermore, a theoretical analysis proves that the asymmetric design avoids collapsed solutions when trained together with a stop-gradient operation. Our method is compared to nine state-of-the-art methods on six real-world datasets to demonstrate its validity and superiority. Ablation experiments further validate the essential role of the asymmetric architecture.
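The core idea in the abstract — two GNN encoders of unequal depth, two augmented views, and a loss over positive pairs only, with a stop-gradient on one branch — can be illustrated with a minimal NumPy sketch. The specific depths (2 vs. 1 layers), the symmetric adjacency normalization, and the cosine-based loss form are illustrative assumptions, not the paper's exact implementation; the stop-gradient is implicit here because this forward-only sketch uses no autograd.

```python
import numpy as np

def gcn_layer(A_hat, X, W):
    # One graph-convolution layer: propagate over the normalized
    # adjacency, apply a linear transform, then ReLU.
    return np.maximum(A_hat @ X @ W, 0.0)

def encode(A_hat, X, weights):
    # Stack as many layers as there are weight matrices; the two
    # branches differ only in how many layers they stack.
    H = X
    for W in weights:
        H = gcn_layer(A_hat, H, W)
    return H

def positive_pair_loss(z1, z2):
    # Cosine-similarity loss over positive pairs only (no negatives).
    # In the paper's design z2 would be treated as a constant via
    # stop-gradient during backpropagation.
    z1 = z1 / (np.linalg.norm(z1, axis=1, keepdims=True) + 1e-8)
    z2 = z2 / (np.linalg.norm(z2, axis=1, keepdims=True) + 1e-8)
    return float(np.mean(2.0 - 2.0 * np.sum(z1 * z2, axis=1)))

rng = np.random.default_rng(0)
n, d = 5, 8

# Toy undirected graph with self-loops, symmetrically normalized.
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.maximum(A, A.T)
np.fill_diagonal(A, 1.0)
deg = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(deg, deg))

X = rng.standard_normal((n, d))

# Asymmetric branches: a deeper encoder (2 layers) vs. a shallower one (1 layer).
W_deep = [0.1 * rng.standard_normal((d, d)), 0.1 * rng.standard_normal((d, d))]
W_shallow = [0.1 * rng.standard_normal((d, d))]

z_online = encode(A_hat, X, W_deep)     # view 1 through the deeper GNN
z_target = encode(A_hat, X, W_shallow)  # view 2 through the shallower GNN
loss = positive_pair_loss(z_online, z_target)
```

In a full training loop the two inputs would be two augmented views of the same graph rather than the same `X`, and gradients would flow only through the deeper (online) branch.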
Pages: 13
Related References
55 records total
  • [31] Page, L., 1999, PAGERANK CITATION RA
  • [32] Park, Chanyoung; Han, Jiawei; Yu, Hwanjo. Deep multiplex graph infomax: Attentive multiplex network embedding using global information. Knowledge-Based Systems, 2020, 197.
  • [33] Peng, Zhen; Huang, Wenbing; Luo, Minnan; Zheng, Qinghua; Rong, Yu; Xu, Tingyang; Huang, Junzhou. Graph Representation Learning via Graphical Mutual Information Maximization. Proceedings of the Web Conference 2020 (WWW 2020), 2020: 259-270.
  • [34] Perozzi, Bryan; Al-Rfou, Rami; Skiena, Steven. DeepWalk: Online Learning of Social Representations. Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '14), 2014: 701-710.
  • [35] Qiu, Jiezhong; Chen, Qibin; Dong, Yuxiao; Zhang, Jing; Yang, Hongxia; Ding, Ming; Wang, Kuansan; Tang, Jie. GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training. Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (KDD '20), 2020: 1150-1160.
  • [36] Scarselli, Franco; Gori, Marco; Tsoi, Ah Chung; Hagenbuchner, Markus; Monfardini, Gabriele. The Graph Neural Network Model. IEEE Transactions on Neural Networks, 2009, 20(1): 61-80.
  • [37] Sun, F.-Y., 2019, arXiv
  • [38] Tang, Jian; Qu, Meng; Wang, Mingzhe; Zhang, Ming; Yan, Jun; Mei, Qiaozhu. LINE: Large-scale Information Network Embedding. Proceedings of the 24th International Conference on World Wide Web (WWW 2015), 2015: 1067-1077.
  • [39] Thakoor, S., 2023, arXiv, DOI arXiv:2102.06514
  • [40] Ullah, Asad; Zaman, Shahid; Hamraz, Anila; Muzammal, Muniba. On the construction of some bioconjugate networks and their structural modeling via irregularity topological indices. European Physical Journal E, 2023, 46(8).