A Preference Learning Decoupling Framework for User Cold-Start Recommendation

Cited by: 6
Authors
Wang, Chunyang [1 ]
Zhu, Yanmin [1 ]
Sun, Aixin [2 ]
Wang, Zhaobo [1 ]
Wang, Ke [1 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Shanghai, Peoples R China
[2] Nanyang Technol Univ, Singapore, Singapore
Source
PROCEEDINGS OF THE 46TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2023 | 2023
Funding
US National Science Foundation;
Keywords
Recommendation; Cold-start; Meta-Learning; Task Augmentation;
DOI
10.1145/3539618.3591627
CLC classification
TP [Automation technology, computer technology];
Discipline code
0812;
Abstract
User cold-start poses a long-standing challenge to recommender systems, owing to the scarcity of interactions from new users. Recently, meta-learning based studies treat each cold-start user as a user-specific few-shot task and derive meta-knowledge about fast model adaptation across training users. However, existing solutions mostly do not clearly distinguish the concept of new users from the concept of novel preferences, leading to over-reliance on meta-learning based adaptability to novel patterns. In addition, we argue that the existing meta-training task construction inherently suffers from the memorization overfitting issue, which inevitably hinders meta-generalization to new users. In response to these issues, we propose a preference learning decoupling framework, enhanced with meta-augmentation (PDMA), for user cold-start recommendation. To spare meta-learning from unnecessary adaptation to common patterns, our framework decouples preference learning for a cold-start user into two complementary aspects: common preference transfer and novel preference adaptation. To handle the memorization overfitting issue, we further propose to augment meta-training users by injecting attribute-based noises, so as to achieve mutually-exclusive tasks. Extensive experiments on benchmark datasets demonstrate that our framework achieves substantial performance improvements over state-of-the-art methods. We also show that the proposed framework is effective in alleviating memorization overfitting.
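The abstract's meta-augmentation idea, creating mutually-exclusive meta-training tasks by injecting attribute-based noise, can be illustrated with a minimal sketch. The function names (`augment_user_attributes`, `build_augmented_tasks`), the Gaussian noise model, and the flat attribute-vector representation are illustrative assumptions; the paper's actual noise mechanism and task construction may differ.

```python
import random


def augment_user_attributes(attributes, noise_scale=0.1, seed=None):
    # Inject small Gaussian noise into a user's attribute vector to
    # create a perturbed copy of that user (attribute-based noise is
    # the augmentation strategy named in the abstract; the exact noise
    # distribution here is an assumption).
    rng = random.Random(seed)
    return [a + rng.gauss(0.0, noise_scale) for a in attributes]


def build_augmented_tasks(users, copies=2, noise_scale=0.1):
    # For each meta-training user, emit the original task plus `copies`
    # noise-perturbed variants. Because variants share interactions but
    # not identical attribute inputs, a meta-learner cannot memorize a
    # fixed attribute-to-preference mapping, which is the intuition
    # behind mutually-exclusive tasks.
    tasks = []
    for user_id, attrs in users.items():
        tasks.append((user_id, list(attrs)))
        for k in range(copies):
            tasks.append(
                (f"{user_id}_aug{k}",
                 augment_user_attributes(attrs, noise_scale, seed=k))
            )
    return tasks
```

In this sketch, each augmented task would then be treated as an ordinary few-shot user task during meta-training; only the attribute inputs differ between a user and its augmented copies.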
Pages: 1168-1177
Page count: 10