CLEAR: Cluster-Enhanced Contrast for Self-Supervised Graph Representation Learning

Cited by: 30
Authors
Luo, Xiao [1 ]
Ju, Wei [2 ]
Qu, Meng [3 ]
Gu, Yiyang [2 ]
Chen, Chong [4 ]
Deng, Minghua [1 ]
Hua, Xian-Sheng [4 ]
Zhang, Ming [2 ]
Affiliations
[1] Peking Univ, Sch Math Sci, Beijing 100871, Peoples R China
[2] Peking Univ, Sch Comp Sci, Beijing 100871, Peoples R China
[3] Univ Montreal, Mila Quebec AI Inst, Montreal, PQ H3T 1J4, Canada
[4] Alibaba Grp, Discovery Adventure Momentum & Outlook DAMO Acad, Hangzhou 311100, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Contrastive learning (CL); graph clustering; graph representation learning; self-supervised learning; PREDICTION; NETWORK; CUTS;
DOI
10.1109/TNNLS.2022.3177775
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This article studies self-supervised graph representation learning, which is critical to various tasks, such as protein property prediction. Existing methods typically aggregate representations of each individual node as graph representations, but fail to comprehensively explore local substructures (i.e., motifs and subgraphs), which also play important roles in many graph mining tasks. In this article, we propose a self-supervised graph representation learning framework named Cluster-Enhanced Contrast (CLEAR) that models the structural semantics of a graph from graph-level and substructure-level granularities, i.e., global semantics and local semantics, respectively. Specifically, we use graph-level augmentation strategies followed by a graph neural network-based encoder to explore global semantics. As for local semantics, we first use graph clustering techniques to partition each whole graph into several subgraphs while preserving as much semantic information as possible. We further employ a self-attention interaction module to aggregate the semantics of all subgraphs into a local-view graph representation. Moreover, we integrate both global semantics and local semantics into a multiview graph contrastive learning framework, enhancing the semantic-discriminative ability of graph representations. Extensive experiments on various real-world benchmarks demonstrate the efficacy of the proposed CLEAR over current graph self-supervised representation learning approaches on both graph classification and transfer learning tasks.
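The multiview contrastive step described in the abstract, pulling together the global-view and local-view embeddings of the same graph while pushing apart those of different graphs, can be sketched as a standard InfoNCE-style loss. The NumPy function below is an illustrative sketch under that assumption, not the authors' implementation; the names `info_nce`, `tau`, and the toy embeddings are hypothetical.

```python
import numpy as np

def info_nce(global_z, local_z, tau=0.5):
    """InfoNCE between paired views: row i of global_z and row i of
    local_z come from the same graph (a positive pair); all other rows
    in the batch serve as negatives. Illustrative only."""
    # L2-normalize rows so dot products become cosine similarities
    g = global_z / np.linalg.norm(global_z, axis=1, keepdims=True)
    l = local_z / np.linalg.norm(local_z, axis=1, keepdims=True)
    logits = g @ l.T / tau                       # similarity / temperature
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # negative log-probability of matching each graph with its own pair
    return -float(np.mean(np.diag(log_prob)))

# Toy demo: two slightly perturbed views of the same 8 graph embeddings.
rng = np.random.default_rng(0)
z_global = rng.normal(size=(8, 16))
z_local = z_global + 0.05 * rng.normal(size=(8, 16))
loss = info_nce(z_global, z_local)
```

Minimizing this loss encourages each graph's global-view embedding to be most similar to its own local-view embedding within the batch, which is the contrastive behavior the framework relies on.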
Pages: 899-912
Number of pages: 14