LMACL: Improving Graph Collaborative Filtering with Learnable Model Augmentation Contrastive Learning

Cited by: 2
Authors
Liu, Xinru [1 ]
Hao, Yongjing [2 ]
Zhao, Lei [2 ]
Liu, Guanfeng [3 ]
Sheng, Victor S. [4 ]
Zhao, Pengpeng [2 ]
Affiliations
[1] Soochow Univ, Sch Comp Sci & Technol, Suzhou, Jiangsu, Peoples R China
[2] Soochow Univ, Suzhou, Jiangsu, Peoples R China
[3] Macquarie Univ, Sydney, NSW, Australia
[4] Texas Tech Univ, Lubbock, TX USA
Keywords
Recommender systems; collaborative filtering; graph neural network; contrastive learning;
DOI
10.1145/3657302
Chinese Library Classification (CLC)
TP [Automation technology; computer technology];
Subject Classification Code
0812;
Abstract
Graph collaborative filtering (GCF) has achieved exciting recommendation performance thanks to its ability to aggregate high-order graph structure information. Recently, contrastive learning (CL) has been incorporated into GCF to alleviate data sparsity and noise issues. However, most existing methods employ random or manual augmentation to produce contrastive views, which may destroy the original topology and amplify noise effects. We argue that such augmentation is insufficient to produce the optimal contrastive view, leading to suboptimal recommendation results. In this article, we propose a Learnable Model Augmentation Contrastive Learning (LMACL) framework for recommendation, which effectively combines graph-level and node-level collaborative relations to enhance the expressiveness of the collaborative filtering (CF) paradigm. Specifically, we first use a graph convolutional network (GCN) as the backbone encoder to incorporate multi-hop neighbors into graph-level original node representations by leveraging the high-order connectivity in user-item interaction graphs. At the same time, we treat a multi-head graph attention network (GAT) as an augmentation-view generator to adaptively produce high-quality node-level augmented views. Finally, joint learning enables end-to-end training, in which the mutual supervision and cooperation of the GCN and GAT achieve learnable model augmentation. Extensive experiments on several benchmark datasets demonstrate that LMACL improves over the strongest baseline by 2.5%-3.8% in Recall and 1.6%-4.0% in NDCG. Our model implementation code is available at https://github.com/LiuHsinx/LMACL.
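The abstract outlines a two-view architecture: a GCN backbone that propagates over the user-item graph to form the graph-level view, a multi-head GAT that acts as a learnable augmentation generator for the node-level view, and a contrastive objective trained jointly with the ranking loss. The PyTorch sketch below is only a rough illustration of that pipeline under stated assumptions; the class name LMACLSketch, the LightGCN-style propagation, the neighbor-sampling attention stand-in for the GAT, the InfoNCE contrastive term, and the loss weighting are assumptions for illustration, not the authors' released implementation (see the linked repository for that).

```python
# Minimal sketch of the described pipeline (assumed details, not the official code):
# a LightGCN-style GCN view, a multi-head-attention "GAT" view over sampled
# neighbors, an InfoNCE contrastive loss between the two views, and a BPR loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LMACLSketch(nn.Module):
    def __init__(self, n_users, n_items, dim=64, n_layers=3, n_heads=4, temp=0.2):
        super().__init__()
        self.n_users, self.n_items = n_users, n_items
        self.emb = nn.Embedding(n_users + n_items, dim)
        nn.init.xavier_uniform_(self.emb.weight)
        self.n_layers, self.temp = n_layers, temp
        # Multi-head attention over sampled neighbors stands in for the GAT view generator.
        self.gat = nn.MultiheadAttention(dim, n_heads, batch_first=True)

    def gcn_view(self, norm_adj):
        """Graph-level view: LightGCN-style propagation, averaging layer outputs."""
        x = self.emb.weight
        outs = [x]
        for _ in range(self.n_layers):
            x = torch.sparse.mm(norm_adj, x)   # norm_adj: sparse [N, N] normalized adjacency
            outs.append(x)
        return torch.stack(outs, dim=0).mean(dim=0)

    def gat_view(self, neighbor_idx):
        """Node-level augmented view: attention-weighted aggregation of K sampled
        neighbors per node (neighbor_idx: LongTensor [N, K] of node ids)."""
        q = self.emb.weight.unsqueeze(1)       # [N, 1, d] query = the node itself
        kv = self.emb(neighbor_idx)            # [N, K, d] keys/values = its neighbors
        out, _ = self.gat(q, kv, kv)
        return out.squeeze(1)

    def info_nce(self, z1, z2, idx):
        """Contrastive alignment of the two views for the nodes in the batch."""
        z1 = F.normalize(z1[idx], dim=-1)
        z2 = F.normalize(z2[idx], dim=-1)
        logits = z1 @ z2.t() / self.temp       # positives on the diagonal
        labels = torch.arange(z1.size(0), device=z1.device)
        return F.cross_entropy(logits, labels)

    def bpr(self, z, users, pos_items, neg_items):
        """Standard BPR ranking loss on one view's embeddings."""
        u = z[users]
        pi = z[self.n_users + pos_items]
        ni = z[self.n_users + neg_items]
        return -F.logsigmoid((u * pi).sum(-1) - (u * ni).sum(-1)).mean()

    def loss(self, norm_adj, neighbor_idx, users, pos_items, neg_items, cl_weight=0.1):
        z_gcn = self.gcn_view(norm_adj)
        z_gat = self.gat_view(neighbor_idx)
        batch_nodes = torch.cat([users, self.n_users + pos_items])
        return (self.bpr(z_gcn, users, pos_items, neg_items)
                + cl_weight * self.info_nce(z_gcn, z_gat, batch_nodes))
```

In this sketch the contrastive term aligns each batch node's GCN view with its GAT view while BPR drives the recommendation objective, which is the usual way such joint CL-plus-ranking objectives are combined in graph CF models; both views share one embedding table so the gradients of the two losses supervise each other end to end.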
Pages: 24
Related Papers
50 records in total
  • [41] Graph contrastive learning for recommendation with generative data augmentation
    Li, Xiaoge
    Wang, Yin
    Wang, Yihan
    An, Xiaochun
    MULTIMEDIA SYSTEMS, 2024, 30 (04)
  • [42] Contrastive learning for fair graph representations via counterfactual graph augmentation
    Li, Chengyu
    Cheng, Debo
    Zhang, Guixian
    Zhang, Shichao
    KNOWLEDGE-BASED SYSTEMS, 2024, 305
  • [43] On the Vulnerability of Graph Learning-based Collaborative Filtering
    Xu, Senrong
    Li, Liangyue
    Li, Zenan
    Yao, Yuan
    Xu, Feng
    Chen, Zulong
    Lu, Quan
    Tong, Hanghang
    ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2023, 41 (04)
  • [44] Improved Collaborative Recommendation Model: Integrating Knowledge Embedding and Graph Contrastive Learning
    Jiang, Liwei
    Yan, Guanghui
    Luo, Hao
    Chang, Wenwen
    ELECTRONICS, 2023, 12 (20)
  • [45] Improving Graph Collaborative Filtering via Spike Signal Embedding Perturbation
    Ma, Ying
    Chen, Gang
    Li, Guoqi
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2024, 16 (05) : 1688 - 1697
  • [47] Encoder augmentation for multi-task graph contrastive learning
    Wang, Xiaoyu
    Zhang, Qiqi
    Liu, Gen
    Zhao, Zhongying
    Cui, Hongzhi
    NEUROCOMPUTING, 2025, 630
  • [48] Adaptive graph contrastive learning with joint optimization of data augmentation and graph encoder
    Wu, Zhenpeng
    Chen, Jiamin
    Al-Sabri, Raeed
    Oloulade, Babatounde Moctard
    Gao, Jianliang
    KNOWLEDGE AND INFORMATION SYSTEMS, 2024, 66 (03) : 1657 - 1681
  • [49] Improving Structural and Semantic Global Knowledge in Graph Contrastive Learning with Distillation
    Wen, Mi
    Wang, Hongwei
    Xue, Yunsheng
    Wu, Yi
    Wen, Hong
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PT II, PAKDD 2024, 2024, 14646 : 364 - 375
  • [50] Time-Series Representation Feature Refinement with a Learnable Masking Augmentation Framework in Contrastive Learning
    Lee, Junyeop
    Ham, Insung
    Kim, Yongmin
    Ko, Hanseok
    SENSORS, 2024, 24 (24)