Transformer-Based Rating-Aware Sequential Recommendation

Cited by: 0
Authors
Li, Yang [1 ]
Li, Qianmu [1 ]
Meng, Shunmei [1 ]
Hou, Jun [2 ,3 ]
Affiliations
[1] Nanjing Univ Sci & Technol, Nanjing, Peoples R China
[2] Nanjing Vocat Univ Ind Technol, Sch Social Sci, Nanjing, Peoples R China
[3] Wuyi Univ, Intelligent Mfg Dept, Jiangmen, Peoples R China
Source
ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING, ICA3PP 2021, PT I | 2022, Vol. 13155
Keywords
Recommender system; Sequential recommendation; Transformer; Attention mechanism; Big data application; Data analytics
DOI
10.1007/978-3-030-95384-3_47
Chinese Library Classification (CLC)
TP31 [Computer Software]
Discipline codes
081202; 0835
Abstract
As a significant application of Big Data, recommender systems effectively mitigate information overload. User behavior sequences constitute massive data with great mining value. Sequential recommendation extracts user features from this sequential data and predicts the next interaction from the user's recent temporal behavior. Currently, recurrent neural networks (RNNs) and graph neural networks (GNNs) are widely used to learn item embeddings for sequential recommendation and have shown adequate performance. However, RNN- and GNN-based models cannot deeply mine complex behavior sequences, and they neglect explicit preference signals such as rating information. Inspired by the Transformer, we adopt Transformer encoder layers to process the sequence and represent items with multi-head attention. Meanwhile, rating information is integrated into the weight calculation when user preference is represented with self-attention: the rating-aware weights retain the structural information of the sequence while incorporating the user's preferences. Furthermore, we combine global and local preferences into a hybrid representation and produce Top-N recommendations. We conduct experiments on large real-world datasets, and our model outperforms state-of-the-art methods in most cases on two datasets.
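To make the pipeline in the abstract concrete, below is a minimal PyTorch sketch of a rating-aware sequential recommender in the same spirit: Transformer-encoded item sequences, attention pooling whose weights are shifted by the user's ratings, and a hybrid of global and local preferences scored against all items for Top-N recommendation. The class name RatingAwareSeqRec, the layer sizes, the additive way normalized ratings enter the attention scores, and the linear fusion of global and local preferences are illustrative assumptions, not the paper's exact design.

import torch
import torch.nn as nn
import torch.nn.functional as F

class RatingAwareSeqRec(nn.Module):
    def __init__(self, num_items, d_model=64, n_heads=2, n_layers=2, max_len=50):
        super().__init__()
        self.item_emb = nn.Embedding(num_items + 1, d_model, padding_idx=0)  # id 0 = padding
        self.pos_emb = nn.Embedding(max_len, d_model)
        enc_layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(enc_layer, n_layers)
        self.attn_score = nn.Linear(d_model, 1)      # per-position attention score
        self.fuse = nn.Linear(2 * d_model, d_model)  # fuse global and local preferences

    def forward(self, items, ratings):
        # items:   (B, L) right-padded item ids, 0 = padding
        # ratings: (B, L) explicit ratings aligned with the items
        B, L = items.shape
        pos = torch.arange(L, device=items.device).unsqueeze(0)
        h = self.item_emb(items) + self.pos_emb(pos)
        pad_mask = items.eq(0)
        h = self.encoder(h, src_key_padding_mask=pad_mask)          # (B, L, d_model)

        # Rating-aware attention pooling: per-sequence-normalized ratings shift the
        # position scores, so highly rated items weigh more in the global preference.
        norm_ratings = ratings / ratings.amax(dim=1, keepdim=True).clamp(min=1.0)
        scores = self.attn_score(h).squeeze(-1) + norm_ratings
        scores = scores.masked_fill(pad_mask, float("-inf"))
        weights = F.softmax(scores, dim=-1)                          # (B, L)
        global_pref = torch.bmm(weights.unsqueeze(1), h).squeeze(1)  # (B, d_model)

        # Local preference: hidden state of the most recent (last non-padded) item.
        last_idx = (~pad_mask).sum(dim=1).clamp(min=1) - 1
        local_pref = h[torch.arange(B), last_idx]

        user = self.fuse(torch.cat([global_pref, local_pref], dim=-1))
        return user @ self.item_emb.weight.t()                       # scores over all items

# Usage: Top-10 recommendations for a toy batch of rated interaction sequences.
model = RatingAwareSeqRec(num_items=1000)
items = torch.randint(1, 1001, (4, 50))          # 4 users, 50 interactions each
ratings = torch.randint(1, 6, (4, 50)).float()   # assumed 1-5 star rating scale
top10 = model(items, ratings).topk(10, dim=-1).indices

The sketch keeps the two preference views separate on purpose: the attention-pooled vector captures the whole sequence with rating-boosted weights (global preference), while the last hidden state captures recent intent (local preference), and a single linear layer fuses them before scoring all items.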
Pages: 759-774
Number of pages: 16