DisenCDR: Learning Disentangled Representations for Cross-Domain Recommendation

Cited by: 78
Authors
Cao, Jiangxia [1 ]
Lin, Xixun [1 ]
Cong, Xin [1 ]
Ya, Jing [1 ]
Liu, Tingwen [1 ]
Wang, Bin [2 ]
Affiliations
[1] Chinese Acad Sci, Inst Informat Engn, Sch Cyber Secur, Univ Chinese Acad Sci, Beijing, Peoples R China
[2] Xiaomi Inc, Xiaomi AI Lab, Beijing, Peoples R China
Source
PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22) | 2022
Keywords
Cross-Domain Recommendation; Variational Autoencoder; Disentangled Representation Learning
DOI
10.1145/3477495.3531967
Chinese Library Classification
TP [Automation technology; computer technology]
Subject Classification Code
0812
Abstract
Data sparsity is a long-standing problem in recommender systems. To alleviate it, Cross-Domain Recommendation (CDR) has attracted a surge of interest: it exploits the rich user-item interaction information of a related source domain to improve performance on a sparse target domain. Recent CDR approaches aggregate source-domain information to generate better user representations for the target domain. However, they focus on designing more powerful interaction encoders that learn both domains simultaneously, and fail to model the different user preferences of different domains. In particular, the domain-specific preferences of the source domain usually provide little useful information for the target domain, and directly aggregating domain-shared and domain-specific information together may hurt target-domain performance. This work considers a key challenge of CDR: How do we transfer shared information across domains? Grounded in information theory, we propose DisenCDR, a novel model that disentangles domain-shared from domain-specific information. To reach this goal, we propose two mutual-information-based disentanglement regularizers. Specifically, an exclusive regularizer enforces that a user's domain-shared and domain-specific representations encode mutually exclusive information, while an information regularizer encourages the domain-shared representations to encode information that is predictive for both domains. Based on these, we further derive a tractable bound on our disentanglement objective to learn the desired disentangled representations. Extensive experiments on four real-world datasets show that DisenCDR achieves significant improvements over state-of-the-art baselines.
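The two regularizers in the abstract can be made concrete with a toy mutual-information estimate. The sketch below is illustrative only: DisenCDR derives its own variational bounds, whereas here we substitute a generic InfoNCE-style lower bound (our assumption, not the paper's estimator), and all variable names are hypothetical. It shows numerically that a domain-shared code stays predictive of a domain signal it generates (information regularizer) while sharing almost no information with an independent domain-specific code (exclusive regularizer).

```python
import numpy as np

# Hypothetical toy setup; not the paper's model or estimator.
rng = np.random.default_rng(0)

def info_nce_lower_bound(z, y):
    """InfoNCE-style lower bound on I(z; y) from paired samples.

    Matched rows of z and y act as positive pairs; the other rows in
    the batch serve as in-batch negatives."""
    scores = z @ y.T                                   # (n, n) pair scores
    scores -= scores.max(axis=1, keepdims=True)        # numerical stability
    log_softmax = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return float(np.diag(log_softmax).mean() + np.log(len(z)))

n, d = 128, 16
z_shared = rng.normal(size=(n, d))                     # domain-shared user factor
x_source = z_shared + 0.1 * rng.normal(size=(n, d))    # source-domain signal driven by it
z_specific = rng.normal(size=(n, d))                   # independent domain-specific factor

# Information regularizer: the shared representation should be
# predictive of both domains, i.e. I(z_shared; x_domain) is large.
bound_predictive = info_nce_lower_bound(z_shared, x_source)

# Exclusive regularizer: shared and specific representations should
# encode exclusive information, i.e. I(z_shared; z_specific) is near 0.
bound_exclusive = info_nce_lower_bound(z_shared, z_specific)

print(bound_predictive, bound_exclusive)
```

In a training loop one would maximize the first quantity and minimize (an upper bound on) the second; here the gap between the two estimates simply visualizes what "predictive" versus "exclusive" means.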
Pages: 267-277
Page count: 11