Graph Contrastive Multi-view Learning: A Pre-training Framework for Graph Classification

Cited by: 4
Authors
Adjeisah, Michael [1 ,2 ]
Zhu, Xinzhong [2 ,3 ]
Xu, Huiying [2 ,3 ]
Ayall, Tewodros Alemu [4 ]
Affiliations
[1] Bournemouth Univ, Natl Ctr Comp Animat, Bournemouth BH12 5BB, England
[2] Zhejiang Normal Univ, Coll Comp Sci & Technol, Jinhua 321004, Zhejiang, Peoples R China
[3] Artificial Intelligence Res Inst Beijing Geekplus, Beijing 100101, Peoples R China
[4] Univ Aberdeen, Sch Nat & Comp Sci, Aberdeen AB24 3UE, Scotland
Keywords
Contrastive learning; Graph classification; Graph neural network; Multi-view representation learning; Pre-trained embeddings
DOI
10.1016/j.knosys.2024.112112
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Recent advances in node and graph classification can be attributed largely to contrastive learning and similarity search. Despite considerable progress, both approaches have drawbacks: integrating similarity search adds a layer of complexity to the model, and applying contrastive learning to non-transferable or out-of-domain datasets yields less competitive results. In this work, we propose maintaining domain specificity for these tasks, which has the potential to improve performance while eliminating the need for an additional similarity search. We adopt a fraction of each domain-specific dataset for pre-training, generating augmented pairs that retain structural similarity to the original graph and thereby broaden the number of views. This strategy involves a comprehensive exploration of candidate augmentations to devise multi-view embeddings, guided by an evaluation protocol that focuses on minimizing error, enhancing accuracy, and preventing overfitting, so that the model learns inherent, transferable structural representations spanning diverse datasets. We then combine the pre-trained embeddings with the source graph as input, leveraging local and global graph information to enrich downstream tasks. Furthermore, to maximize the utility of negative samples in contrastive learning, we extend the training mechanism during the pre-training stage. In comprehensive experiments on benchmark graph datasets of varying sizes and characteristics, our method consistently outperforms baseline approaches, establishing new state-of-the-art results.
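The abstract describes the pre-training pipeline only at a high level. As a rough illustration of the general pattern it names (structure-preserving augmentation producing multiple views of each graph, a shared GNN encoder, and a contrastive objective in which the other graphs in a batch act as negative samples), here is a minimal PyTorch sketch. Every name and hyperparameter in it (TinyGNN, edge_drop, the NT-Xent loss, tau, the toy random graphs) is an assumption made for illustration, not the authors' implementation.

```python
# Hedged sketch of contrastive multi-view graph pre-training.
# Not the paper's code: encoder, augmentation, and loss are all
# illustrative stand-ins using plain PyTorch and dense adjacencies.
import torch
import torch.nn as nn
import torch.nn.functional as F


def edge_drop(adj: torch.Tensor, p: float = 0.2) -> torch.Tensor:
    """Randomly drop edges so the augmented view stays structurally
    close to the source graph (the property the abstract requires)."""
    mask = (torch.rand_like(adj) > p).float()
    mask = torch.maximum(mask, mask.t())  # drop symmetrically (undirected)
    return adj * mask


class TinyGNN(nn.Module):
    """Two rounds of mean-neighborhood aggregation plus mean pooling,
    standing in for whatever GNN encoder the paper actually uses."""

    def __init__(self, in_dim: int, hid_dim: int = 64):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hid_dim)
        self.lin2 = nn.Linear(hid_dim, hid_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        a = adj + torch.eye(adj.size(0))       # add self-loops
        a = a / a.sum(dim=1, keepdim=True)     # row-normalize
        h = F.relu(self.lin1(a @ x))
        h = F.relu(self.lin2(a @ h))
        return h.mean(dim=0)                   # graph-level embedding


def nt_xent(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """NT-Xent: a graph's two views are positives; every other graph
    in the batch serves as a negative sample."""
    z = F.normalize(torch.cat([z1, z2]), dim=1)            # (2N, d)
    sim = z @ z.t() / tau                                  # cosine / tau
    sim = sim.masked_fill(torch.eye(len(z), dtype=torch.bool), float("-inf"))
    n = z1.size(0)
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)])
    return F.cross_entropy(sim, targets)


# Toy usage: a batch of random undirected graphs, two augmented views each.
torch.manual_seed(0)
encoder = TinyGNN(in_dim=8)
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
graphs = []
for _ in range(16):
    x = torch.randn(10, 8)                                 # node features
    a = (torch.rand(10, 10) > 0.7).float()
    graphs.append((x, ((a + a.t()) > 0).float()))          # symmetric adjacency

opt.zero_grad()
z1 = torch.stack([encoder(x, edge_drop(adj)) for x, adj in graphs])
z2 = torch.stack([encoder(x, edge_drop(adj)) for x, adj in graphs])
loss = nt_xent(z1, z2)
loss.backward()
opt.step()
print(f"pre-training NT-Xent loss: {loss.item():.4f}")
```

The in-batch negatives in nt_xent correspond only loosely to the abstract's point about maximizing the utility of negative samples; the paper extends the pre-training mechanism beyond this basic form, and it additionally feeds the learned embeddings together with the source graph into the downstream classifier, a step the sketch omits.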
Pages: 11