Graph Contrastive Multi-view Learning: A Pre-training Framework for Graph Classification

Cited by: 4
Authors
Adjeisah, Michael [1 ,2 ]
Zhu, Xinzhong [2 ,3 ]
Xu, Huiying [2 ,3 ]
Ayall, Tewodros Alemu [4 ]
Affiliations
[1] Bournemouth Univ, Natl Ctr Comp Animat, Bournemouth BH12 5BB, England
[2] Zhejiang Normal Univ, Coll Comp Sci & Technol, Jinhua 321004, Zhejiang, Peoples R China
[3] Artificial Intelligence Res Inst Beijing Geekplus, Beijing 100101, Peoples R China
[4] Univ Aberdeen, Sch Nat & Comp Sci, Aberdeen AB24 3UE, Scotland
Keywords
Contrastive learning; Graph classification; Graph neural network; Multi-view representation learning; Pre-trained embeddings;
DOI
10.1016/j.knosys.2024.112112
CLC number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recent advancements in node and graph classification tasks can be attributed to contrastive learning and similarity search. Despite considerable progress, these approaches present challenges: integrating similarity search adds complexity to the model, while applying contrastive learning to non-transferable domains or out-of-domain datasets yields less competitive results. In this work, we propose maintaining domain specificity for these tasks, which can improve performance while eliminating the need for additional similarity searches. We use a fraction of domain-specific datasets for pre-training, generating augmented pairs that retain structural similarity to the original graph and thereby increase the number of views. This strategy involves a comprehensive search for optimal augmentations from which to build multi-view embeddings. An evaluation protocol focused on error minimization, accuracy enhancement, and overfitting prevention guides this process toward learning inherent, transferable structural representations that span diverse datasets. We combine the pre-trained embeddings with the source graph as input, leveraging local and global graph information to enrich downstream tasks. Furthermore, to maximize the utility of negative samples in contrastive learning, we extend the training mechanism during the pre-training stage. Our method consistently outperforms baseline approaches in comprehensive experiments on benchmark graph datasets of varying sizes and characteristics, establishing new state-of-the-art results.
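The pre-training idea described in the abstract — generate structurally similar augmented views of each graph, encode them, and contrast positive pairs against the other graphs in the batch as negatives — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the edge-dropping augmentation, the toy mean-pooling encoder, and the NT-Xent loss are common stand-ins for the components the paper leaves to its method section.

```python
import numpy as np

rng = np.random.default_rng(0)

def drop_edges(adj, p=0.2):
    """Randomly drop a fraction p of edges to create an augmented view
    that remains structurally similar to the original graph."""
    keep = rng.random(adj.shape) > p
    keep = np.triu(keep, 1)                 # decide per undirected edge
    return adj * (keep + keep.T)

def encode(adj, feats):
    """Toy graph encoder: one mean-aggregation step plus mean pooling.
    Stands in for the GNN encoder trained during pre-training."""
    deg = adj.sum(1, keepdims=True) + 1.0
    h = (adj @ feats + feats) / deg         # aggregate neighbours + self
    g = h.mean(0)                           # graph-level readout
    return g / (np.linalg.norm(g) + 1e-8)   # L2-normalised embedding

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent contrastive loss over a batch of paired view embeddings:
    a graph's two views are positives; all other graphs are negatives."""
    z = np.concatenate([z1, z2], axis=0)    # (2N, d), unit vectors
    sim = z @ z.T / tau                     # temperature-scaled cosine sims
    np.fill_diagonal(sim, -np.inf)          # exclude self-similarity
    n = len(z1)
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    logits = sim - sim.max(1, keepdims=True)            # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

# Batch of 4 random undirected graphs: 6 nodes, 8-dim node features each
graphs = []
for _ in range(4):
    a = np.triu((rng.random((6, 6)) > 0.6).astype(float), 1)
    graphs.append((a + a.T, rng.standard_normal((6, 8))))

# Two augmented views per graph -> two batches of embeddings
z1 = np.stack([encode(drop_edges(a), x) for a, x in graphs])
z2 = np.stack([encode(drop_edges(a), x) for a, x in graphs])
loss = nt_xent(z1, z2)
print(float(loss))
```

In a real pipeline the encoder parameters would be optimised to minimise this loss over many domain-specific graphs, and the resulting pre-trained embeddings would then be concatenated with the source graph's features as input to the downstream classifier.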
Pages: 11
Related Papers
50 records in total
  • [1] A Multi-view Molecular Pre-training with Generative Contrastive Learning
    Liu, Yunwu
    Zhang, Ruisheng
    Yuan, Yongna
    Ma, Jun
    Li, Tongfeng
    Yu, Zhixuan
    INTERDISCIPLINARY SCIENCES-COMPUTATIONAL LIFE SCIENCES, 2024, 16 (03) : 741 - 754
  • [2] Multi-view graph contrastive learning for social recommendation
    Chen, Rui
    Chen, Jialu
    Gan, Xianghua
    SCIENTIFIC REPORTS, 2024, 14 (01):
  • [3] Heterogeneous Graph Contrastive Multi-view Learning
    Wang, Zehong
    Li, Qi
    Yu, Donghua
    Han, Xiaolong
    Gao, Xiao-Zhi
    Shen, Shigen
    PROCEEDINGS OF THE 2023 SIAM INTERNATIONAL CONFERENCE ON DATA MINING, SDM, 2023, : 136 - 144
  • [4] Contrastive and attentive graph learning for multi-view clustering
    Wang, Ru
    Li, Lin
    Tao, Xiaohui
    Wang, Peipei
    Liu, Peiyu
    INFORMATION PROCESSING & MANAGEMENT, 2022, 59 (04)
  • [5] A Multi-View Graph Contrastive Learning Framework for Defending Against Adversarial Attacks
    Cao, Feilong
    Yee, Xing
    Yee, Hailiang
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024, 8 (06): : 4022 - 4032
  • [6] SMGCL: Semi-supervised Multi-view Graph Contrastive Learning
    Zhou, Hui
    Gong, Maoguo
    Wang, Shanfeng
    Gao, Yuan
    Zhao, Zhongying
    KNOWLEDGE-BASED SYSTEMS, 2023, 260
  • [7] Enhancing Multi-view Contrastive Learning for Graph Anomaly Detection
    Lu, Qingcheng
    Wu, Nannan
    Zhao, Yiming
    Wang, Wenjun
    Zu, Quannan
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, PT VI, DASFAA 2024, 2024, 14855 : 236 - 251
  • [8] Multi-View Contrastive Enhanced Heterogeneous Graph Structure Learning
    Bing R.
    Yuan G.
    Meng F.
    Wang S.
    Qiao S.
    Wang Z.
    Ruan Jian Xue Bao/Journal of Software, 2023, 34 (10):
  • [9] GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training
    Qiu, Jiezhong
    Chen, Qibin
    Dong, Yuxiao
    Zhang, Jing
    Yang, Hongxia
    Ding, Ming
    Wang, Kuansan
    Tang, Jie
    KDD '20: PROCEEDINGS OF THE 26TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2020, : 1150 - 1160
  • [10] A Multi-view Graph Contrastive Learning Framework for Cross-Domain Sequential Recommendation
    Xu, Zitao
    Pan, Weike
    Ming, Zhong
    PROCEEDINGS OF THE 17TH ACM CONFERENCE ON RECOMMENDER SYSTEMS, RECSYS 2023, 2023, : 491 - 501