Cross-domain recommendation via knowledge distillation

Cited by: 1
Authors
Li, Xiuze [1]
Huang, Zhenhua [1,2]
Wu, Zhengyang [2]
Wang, Changdong [3]
Chen, Yunwen [4]
Affiliations
[1] South China Normal Univ, Sch Artificial Intelligence, Foshan 528225, Guangdong, Peoples R China
[2] South China Normal Univ, Sch Comp Sci, Guangzhou 510631, Guangdong, Peoples R China
[3] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou 510006, Guangdong, Peoples R China
[4] DataGrand Inc, Res & Dev Dept, Shanghai 201203, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Cross-domain recommendation; Knowledge distillation; Variational graph autoencoder;
DOI
10.1016/j.knosys.2025.113112
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recommendation systems frequently suffer from data sparsity, resulting in suboptimal recommendations. A prominent remedy is Cross-Domain Recommendation (CDR), which leverages data from multiple domains to mitigate data sparsity and cold-start issues. Nevertheless, current mainstream methods, such as feature mapping and co-training approaches that explore domain relationships, overlook the latent user-user and user-item similarities in the shared user-item interaction graph. Motivated by these deficiencies, this paper introduces KDCDR, a novel cross-domain recommendation framework that relies on knowledge distillation to exploit the information in this graph. KDCDR aims to improve recommendation performance in both domains by efficiently utilizing information from the shared interaction graph. Furthermore, we strengthen user and item representations by jointly modeling user-user similarity, item-item similarity, and user-item interactions. The developed scheme treats each domain's inner-domain graph as a teacher and the cross-domain graph as a student, where the student learns by distilling knowledge from the two teachers through a high-temperature distillation process. In addition, we introduce a dynamic weighting mechanism that regulates the learning process, preventing the student network from overly favoring one domain and from absorbing knowledge that the teachers have taught incorrectly. Through extensive experiments on four real-world datasets, KDCDR demonstrates significant improvements over state-of-the-art methods, confirming its effectiveness in addressing data sparsity and enhancing cross-domain recommendation performance. Our code and data are available at https://github.com/pandas-bondage/KDCDR.
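The abstract does not spell out the loss, but the setup it describes (two inner-domain teachers, one cross-domain student, high-temperature distillation, dynamic weights balancing the two domains) can be sketched roughly as follows. The function names, the T² scaling, and the softmax-based weighting are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, T=1.0):
    """Numerically stable softmax over the last axis, softened by temperature T."""
    z = x / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distill_from_two_teachers(student_logits, teacher_a_logits, teacher_b_logits, T=4.0):
    """Temperature-scaled KL distillation from two domain teachers.

    A high temperature T softens both distributions; each teacher's KL term is
    scaled by T^2 (standard in distillation) and combined with dynamic weights
    so the student does not overly favor one domain.
    """
    s = softmax(student_logits, T)
    losses = []
    for t_logits in (teacher_a_logits, teacher_b_logits):
        t = softmax(t_logits, T)
        # KL(teacher || student), averaged over the batch.
        kl = (t * (np.log(t + 1e-12) - np.log(s + 1e-12))).sum(axis=-1).mean()
        losses.append(kl * T * T)
    losses = np.array(losses)
    # Dynamic weights: emphasize the teacher the student currently fits worse
    # (one possible scheme); subtract the max before exponentiating for stability.
    w = np.exp(losses - losses.max())
    w = w / w.sum()
    return float((w * losses).sum())
```

In this sketch the weights are recomputed every step from the current losses, so the balance between the two domains adapts as training progresses; the paper's mechanism may differ in how the weights are derived.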
Pages: 14