Are Graph Augmentations Necessary? Simple Graph Contrastive Learning for Recommendation

Cited by: 586
Authors
Yu, Junliang [1 ]
Yin, Hongzhi [1 ]
Xia, Xin [1 ]
Chen, Tong [1 ]
Cui, Lizhen [2 ]
Nguyen, Quoc Viet Hung [3 ]
Affiliations
[1] Univ Queensland, Brisbane, Qld, Australia
[2] Shandong Univ, Jinan, Peoples R China
[3] Griffith Univ, Gold Coast, Australia
Source
PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22) | 2022
Funding
Australian Research Council
Keywords
Self-Supervised Learning; Recommendation; Contrastive Learning; Data Augmentation;
DOI
10.1145/3477495.3531937
CLC Classification Number
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Contrastive learning (CL) has recently spurred a fruitful line of research in recommendation, since its ability to extract self-supervised signals from raw data is well aligned with recommender systems' need to tackle data sparsity. A typical pipeline of CL-based recommendation models first augments the user-item bipartite graph with structural perturbations, and then maximizes the consistency of node representations across the different graph augmentations. Although this paradigm is effective, what underlies the performance gains remains a mystery. In this paper, we first show experimentally that, in CL-based recommendation models, CL operates by learning more uniform user/item representations, which implicitly mitigates popularity bias. Meanwhile, we reveal that the graph augmentations, previously considered necessary, play only a trivial role. Based on these findings, we propose a simple CL method that discards graph augmentations and instead adds uniform noise to the embedding space to create contrastive views. A comprehensive experimental study on three benchmark datasets demonstrates that, although strikingly simple, the proposed method can smoothly adjust the uniformity of the learned representations and has distinct advantages over its graph-augmentation-based counterparts in both recommendation accuracy and training efficiency. The code is released at https://github.com/Coder-Yu/QRec.
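The noise-based view construction described in the abstract can be sketched roughly as follows. This is an illustrative NumPy sketch, not the paper's actual implementation: the function names, the `eps` scale, and the InfoNCE formulation are assumptions based on the abstract's description of adding scaled uniform noise to embeddings to form contrastive views.

```python
import numpy as np

def perturb(emb, eps=0.1, rng=None):
    # Create a contrastive view by adding small random noise to each
    # embedding row instead of perturbing the graph structure.
    # The noise is drawn uniformly, L2-normalized per row, kept in the
    # same orthant as the embedding via sign(), and scaled by eps,
    # so every row is moved by exactly eps in L2 distance.
    rng = rng or np.random.default_rng()
    noise = rng.uniform(0.0, 1.0, emb.shape)
    noise /= np.linalg.norm(noise, axis=1, keepdims=True)
    return emb + np.sign(emb) * noise * eps

def info_nce(view1, view2, tau=0.2):
    # InfoNCE contrastive loss between the two perturbed views:
    # matching rows are positives, all other rows serve as negatives.
    z1 = view1 / np.linalg.norm(view1, axis=1, keepdims=True)
    z2 = view2 / np.linalg.norm(view2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

In training, this contrastive term would typically be added to the usual recommendation loss; maximizing agreement between the two noisy views pushes the representations toward the uniformity the paper identifies as the real source of CL's gains.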
Pages: 1294-1303
Page count: 10