Self-Guided Learning to Denoise for Robust Recommendation

Cited by: 44
Authors
Gao, Yunjun [1 ]
Du, Yuntao [1 ]
Hu, Yujia [1 ]
Chen, Lu [1 ]
Zhu, Xinjun [1 ]
Fang, Ziquan [1 ]
Zheng, Baihua [2 ]
Affiliations
[1] Zhejiang Univ, Coll Comp Sci, Hangzhou, Zhejiang, Peoples R China
[2] Singapore Management Univ, Sch Comp & Informat Syst, Singapore, Singapore
Source
PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22) | 2022
Keywords
Recommender System; Denoising Recommendation; Implicit Feedback; Robust Learning
DOI
10.1145/3477495.3532059
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
The ubiquity of implicit feedback makes it the default choice for building modern recommender systems. Generally speaking, observed interactions are treated as positive samples, while unobserved interactions are treated as negative ones. However, implicit feedback is inherently noisy because of the ubiquitous presence of noisy-positive and noisy-negative interactions. Recently, some studies have recognized the importance of denoising implicit feedback for recommendation and have improved the robustness of recommendation models to some extent. Nonetheless, they typically fail to (1) capture hard yet clean interactions for learning comprehensive user preferences, and (2) provide a universal denoising solution that can be applied to various kinds of recommendation models. In this paper, we thoroughly investigate the memorization effect of recommendation models and propose a new denoising paradigm, i.e., Self-Guided Denoising Learning (SGDL), which collects memorized interactions at the early stage of training (i.e., the "noise-resistant" period) and leverages those data as denoising signals to guide the subsequent training (i.e., the "noise-sensitive" period) of the model in a meta-learning manner. Moreover, our method can automatically switch its learning phase at the memorization point from memorization to self-guided learning, and can select clean and informative memorized data via a novel adaptive denoising scheduler to further improve robustness. We incorporate SGDL into four representative recommendation models (i.e., NeuMF, CDAE, NGCF, and LightGCN) and different loss functions (i.e., binary cross-entropy and BPR loss). Experimental results on three benchmark datasets demonstrate the effectiveness of SGDL over state-of-the-art denoising methods such as T-CE, IR, and DeCA, and even over state-of-the-art robust graph-based methods such as SGCN and SGL.
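The two-phase paradigm described in the abstract (memorization followed by self-guided learning once the memorization point is reached) can be sketched as follows. This is a minimal, illustrative sketch and not the authors' implementation: the MFModel scorer, the 0.3 per-sample-loss threshold used as a proxy for "memorized" interactions, the fixed 1.0/0.2 re-weighting that stands in for the paper's meta-learned sample weights, and the loss-plateau rule that stands in for the adaptive denoising scheduler are all assumptions introduced here for illustration.

# Illustrative sketch of SGDL-style two-phase training (assumptions noted above).
import torch
import torch.nn as nn

class MFModel(nn.Module):
    """Plain matrix-factorization scorer with a sigmoid output (hypothetical stand-in)."""
    def __init__(self, n_users, n_items, dim=32):
        super().__init__()
        self.user = nn.Embedding(n_users, dim)
        self.item = nn.Embedding(n_items, dim)

    def forward(self, u, i):
        return torch.sigmoid((self.user(u) * self.item(i)).sum(-1))

def train_sgdl_like(model, batches, epochs=20, patience=2):
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    bce = nn.BCELoss(reduction="none")
    memorized = set()          # interactions fit during the noise-resistant period
    losses, phase = [], "memorization"

    for epoch in range(epochs):
        epoch_loss = 0.0
        for u, i, y in batches:          # (user ids, item ids, float 0/1 labels)
            pred = model(u, i)
            loss_each = bce(pred, y)

            if phase == "memorization":
                # Record interactions the model already fits well
                # (a crude proxy for "memorized", assumed threshold 0.3).
                fitted = loss_each.detach() < 0.3
                memorized.update((int(a), int(b))
                                 for a, b, f in zip(u, i, fitted) if f)
                loss = loss_each.mean()
            else:
                # Self-guided phase: up-weight memorized (assumed clean) interactions
                # and down-weight the rest. The paper learns these weights via
                # meta-learning; the fixed 1.0 / 0.2 weights are a stand-in heuristic.
                w = torch.tensor([1.0 if (int(a), int(b)) in memorized else 0.2
                                  for a, b in zip(u, i)])
                loss = (w * loss_each).mean()

            opt.zero_grad()
            loss.backward()
            opt.step()
            epoch_loss += float(loss)

        losses.append(epoch_loss)
        # Crude memorization-point detector: switch phases once the
        # training loss stops improving for `patience` epochs.
        if (phase == "memorization" and len(losses) > patience
                and losses[-1] >= losses[-1 - patience]):
            phase = "self-guided"
    return model

Here, batches would be mini-batches of (user, item, label) tensors drawn from implicit feedback; the only point being illustrated is the automatic phase switch and the reuse of early-memorized interactions as denoising signals, not the paper's actual scheduler or meta-optimization.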
Pages: 1412-1422
Number of Pages: 11