Dual Contrastive Network for Sequential Recommendation

Cited by: 17
Authors
Lin, Guanyu [1 ]
Gao, Chen [1 ]
Li, Yinfeng [1 ]
Zheng, Yu [1 ]
Li, Zhiheng [2 ]
Jin, Depeng [1 ]
Li, Yong [1 ]
Affiliations
[1] Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol, Dept Elect Engn, Beijing, Peoples R China
[2] Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol, Dept Automat, Beijing, Peoples R China
Source
PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22) | 2022
Funding
National Natural Science Foundation of China
Keywords
Sequential recommendation; Self-Supervised Learning; Contrastive Learning;
DOI
10.1145/3477495.3531918
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Widely applied in today's recommender systems, sequential recommendation predicts the next item a given user will interact with from his/her historical item sequence. However, like most recommenders, sequential recommendation suffers from the data sparsity issue. To extract auxiliary signals from the data, some recent works exploit self-supervised learning to generate augmented data via a dropout strategy, which, however, leads to even sparser sequential data and obscured signals. In this paper, we propose the Dual Contrastive Network (DCN) to boost sequential recommendation from a new perspective: integrating the auxiliary user sequence of each item. Specifically, we propose two kinds of contrastive learning. The first is dual representation contrastive learning, which minimizes the distances between embeddings and sequence representations of users/items. The second is dual interest contrastive learning, which aims to self-supervise the static interest with the dynamic interest of next-item prediction via auxiliary training. We also incorporate the auxiliary task of predicting the next user for a given item's historical user sequence, which can capture the trends of items preferred by certain types of users. Experiments on benchmark datasets verify the effectiveness of our proposed method. A further ablation study also illustrates the boosting effect of the proposed components upon different sequential models.
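The dual representation contrastive learning described in the abstract aligns static user/item embeddings with their sequence-derived representations. The paper's exact loss is not reproduced in this record; below is a minimal, hypothetical sketch of a generic InfoNCE-style alignment objective with in-batch negatives (the names `info_nce`, `user_emb`, and `seq_repr` are illustrative, not from the paper):

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.1):
    """InfoNCE-style contrastive loss: each anchor should match its own
    positive against all other positives in the batch (in-batch negatives)."""
    # L2-normalize so dot products become cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (B, B) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # diagonal entries are the positives

rng = np.random.default_rng(0)
user_emb = rng.normal(size=(4, 8))   # static user embeddings (hypothetical batch)
seq_repr = rng.normal(size=(4, 8))   # representations encoded from item sequences
loss = info_nce(user_emb, seq_repr)
```

Minimizing such a loss pulls each embedding toward the representation of its own sequence while pushing it away from other users' sequences in the batch; the same objective can be applied symmetrically to items and their user sequences.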
Pages: 2686-2691
Page count: 6