SimGRACE: A Simple Framework for Graph Contrastive Learning without Data Augmentation

Cited by: 179
Authors
Xia, Jun [1 ,2 ]
Wu, Lirong [1 ,2 ]
Chen, Jintao [3 ]
Hu, Bozhen [1 ,2 ]
Li, Stan Z. [1 ,2 ]
Affiliations
[1] Westlake Univ, Sch Engn, Hangzhou 310030, Peoples R China
[2] Westlake Inst Adv Study, Inst Adv Technol, Hangzhou 310030, Peoples R China
[3] Zhejiang Univ, Hangzhou 310058, Peoples R China
Source
PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22) | 2022
Funding
National Natural Science Foundation of China;
关键词
Graph neural networks; graph self-supervised learning; contrastive learning; graph representation learning; robustness;
DOI
10.1145/3485447.3512156
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Graph contrastive learning (GCL) has emerged as a dominant technique for graph representation learning; it maximizes the mutual information between paired graph augmentations that share the same semantics. Unfortunately, given the diverse nature of graph data, it is difficult to preserve semantics well during augmentation. Current data augmentations in GCL broadly fall into three unsatisfactory categories: augmentations picked manually per dataset by trial and error, augmentations selected via cumbersome search, and augmentations obtained with expensive domain knowledge as guidance. All of these limit the efficiency and broader applicability of existing GCL methods. To circumvent these issues, we propose a Simple framework for GRAph Contrastive lEarning, SimGRACE for brevity, which does not require data augmentation. Specifically, we take the original graph as input and use a GNN model together with its perturbed version as two encoders to obtain two correlated views for contrast. SimGRACE is inspired by the observation that graph data preserve their semantics well under encoder perturbations, without requiring manual trial and error, cumbersome search, or expensive domain knowledge for augmentation selection. We also explain why SimGRACE succeeds. Furthermore, we devise an adversarial training scheme, dubbed AT-SimGRACE, to enhance the robustness of graph contrastive learning, and we explain the reasons theoretically. Albeit simple, we show that SimGRACE yields competitive or better performance than state-of-the-art methods in terms of generalizability, transferability, and robustness, while enjoying an unprecedented degree of flexibility and efficiency. The code is available at: https://github.com/junxia97/SimGRACE.
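The core idea described in the abstract (contrasting the representations produced by a GNN encoder and a noise-perturbed copy of it, rather than by two augmented graphs) can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the per-tensor noise scaling, the `eta` magnitude, the simplified NT-Xent loss, and the `encoder` / `proj_head` placeholders are assumptions for illustration; see the official repository linked above for the actual code.

```python
import copy
import torch
import torch.nn.functional as F

def perturbed_copy(encoder, eta=1.0):
    """Return a deep copy of the GNN encoder whose parameters are perturbed
    by Gaussian noise scaled by each tensor's own spread (assumed scheme)."""
    vice = copy.deepcopy(encoder)
    for p, vp in zip(encoder.parameters(), vice.parameters()):
        # unbiased=False avoids NaN for single-element tensors (e.g. scalar biases)
        vp.data = p.data + eta * torch.randn_like(p) * p.data.std(unbiased=False)
    return vice

def nt_xent(z1, z2, temperature=0.2):
    """Simplified NT-Xent loss: matching rows of z1/z2 are positives,
    all other pairs in the batch serve as negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature             # (B, B) cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)        # positives lie on the diagonal

# One training step would roughly look like (hypothetical encoder/proj_head):
#   z  = proj_head(encoder(batch))                   # view 1: original encoder
#   z_ = proj_head(perturbed_copy(encoder)(batch))   # view 2: perturbed encoder
#   loss = nt_xent(z, z_)
```

Because both views come from the same input graph, no augmentation needs to be chosen, searched for, or designed with domain knowledge; the perturbed encoder alone supplies the second view.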
Pages: 1070-1079
Number of pages: 10