A Recommendation Algorithm Based on a Self-supervised Learning Pretrain Transformer

Authors
Yu-Hao Xu
Zhen-Hai Wang
Zhi-Ru Wang
Rong Fan
Xing Wang
Affiliations
[1] Linyi University,College of Information Science and Engineering
Source
Neural Processing Letters | 2023, Vol. 55
Keywords
Self-supervised learning; Pretraining; Sequential recommendation; Transformer;
Abstract
Click-through rate (CTR) prediction is a crucial research direction in recommendation, with the goal of predicting the probability that a user will click on a candidate item. Studies indicate that a user's next click is influenced by their last few clicks; therefore, effectively modeling user behavior sequences to extract user interest representations is an important research topic in CTR prediction. Various networks, such as RNNs and transformers, have been applied to implicitly extract user interest from these sequences. However, these studies focus on designing complex network structures for better user behavior modeling while ignoring the fact that the training methods used in current CTR prediction models may limit model performance. Specifically, because the CTR prediction model is trained with a single objective that overemphasizes the final prediction accuracy, the sequence interest extractor component of the model is not fully trained. To address this issue, this paper proposes a recommendation model based on self-supervised learning to pretrain the transformer (SSPT4Rec), which divides training into two phases: pretraining and fine-tuning. In the pretraining phase, the transformer is trained on a four-class classification pretext task; in the fine-tuning phase, the weights obtained from pretraining are used to initialize the transformer, which is then fine-tuned on the recommendation task. Extensive experiments on four publicly available datasets show that SSPT4Rec improves the feature extraction capability of the transformer as an interest extractor and outperforms existing models.
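The two-phase scheme described above can be sketched as follows. This is a minimal illustration in PyTorch, not the authors' implementation: the abstract does not specify the four pretext classes, the architecture sizes, or the fine-tuning head, so the pretext labels, hyperparameters, and pooling/prediction layers below are all placeholder assumptions. The point it shows is only the weight hand-off: the same transformer "interest extractor" is trained on a four-class pretext task in phase 1 and then reused, weights intact, inside the CTR model in phase 2.

```python
import torch
import torch.nn as nn

class InterestExtractor(nn.Module):
    """Transformer encoder over a user's clicked-item sequence."""
    def __init__(self, n_items, d_model=32, n_heads=2, n_layers=1):
        super().__init__()
        self.embed = nn.Embedding(n_items, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=64, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, seq):                   # seq: (batch, seq_len) item ids
        return self.encoder(self.embed(seq))  # (batch, seq_len, d_model)

class PretextModel(nn.Module):
    """Phase 1: a 4-way classification head on the pooled sequence
    (the actual pretext classes are not specified in the abstract)."""
    def __init__(self, extractor):
        super().__init__()
        self.extractor = extractor
        self.head = nn.Linear(32, 4)

    def forward(self, seq):
        return self.head(self.extractor(seq).mean(dim=1))

class CTRModel(nn.Module):
    """Phase 2: CTR prediction on (behavior sequence, candidate item) pairs,
    reusing the pretrained extractor as initialization."""
    def __init__(self, extractor):
        super().__init__()
        self.extractor = extractor
        self.head = nn.Linear(32 * 2, 1)

    def forward(self, seq, candidate):
        interest = self.extractor(seq).mean(dim=1)   # pooled interest vector
        cand = self.extractor.embed(candidate)       # candidate-item embedding
        return torch.sigmoid(self.head(torch.cat([interest, cand], dim=-1)))

torch.manual_seed(0)
extractor = InterestExtractor(n_items=100)

# Phase 1: one pretraining step on the (placeholder) pretext task.
pre = PretextModel(extractor)
opt = torch.optim.Adam(pre.parameters(), lr=1e-3)
seq = torch.randint(0, 100, (8, 10))                 # 8 users, 10 clicks each
loss = nn.CrossEntropyLoss()(pre(seq), torch.randint(0, 4, (8,)))
loss.backward()
opt.step()

# Phase 2: the CTR model is built around the now-pretrained extractor
# and would be fine-tuned on click/no-click labels from here.
ctr = CTRModel(extractor)
probs = ctr(seq, torch.randint(0, 100, (8,)))        # one probability per user
```

In this sketch the transfer happens simply by passing the same `extractor` object into both wrappers; an equivalent alternative is saving its `state_dict()` after pretraining and loading it into a fresh fine-tuning model.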
Pages: 4481-4497 (16 pages)