Enhanced Self-Attention Mechanism for Long and Short Term Sequential Recommendation Models

Cited by: 2
Authors
Zheng, Xiaoyao [1 ,2 ]
Li, Xingwang [1 ,2 ]
Chen, Zhenghua [1 ,2 ,3 ]
Sun, Liping [1 ,2 ]
Yu, Qingying [1 ,2 ]
Guo, Liangmin [1 ,2 ]
Luo, Yonglong [1 ,2 ]
Affiliations
[1] Anhui Normal Univ, Anhui Prov Key Lab Network & Informat Secur, Wuhu 241002, Peoples R China
[2] Anhui Normal Univ, Sch Comp & Informat, Wuhu 241002, Peoples R China
[3] Agcy Sci Technol Infocomm Res I2R & Res ASTAR, Inst Infocomm Res, Singapore 138632, Singapore
Source
IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE | 2024, Vol. 8, No. 3
Keywords
Sequential recommendation; enhanced self-attention mechanism; gated recurrent unit; position weight; NETWORK;
DOI
10.1109/TETCI.2024.3366771
CLC number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Compared with traditional recommendation algorithms based on collaborative filtering and content, sequential recommendation better captures changes in user interests and recommends the items a user is likely to interact with next, based on the user's historical interaction behaviors. Two traditional approaches to sequential recommendation, Markov Chains (MC) and Deep Neural Networks (DNN), both ignore the relationships among different behaviors and the dynamic changes of user interest in items over time. Furthermore, earlier methods usually process the user's historical interactions strictly in chronological order, which may lose part of the preference information. Starting from the view that user preferences change over time, this paper proposes RP-SANRec, a long- and short-term sequential recommendation model with an enhanced self-attention network. The short-term intent module of RP-SANRec uses a Gated Recurrent Unit (GRU) to learn the user's full historical interaction sequence and compute position-weight information in temporal order, which is then used to enhance the input of the self-attention mechanism. The long-term module captures the user's general preferences through a bidirectional long short-term memory network (Bi-LSTM). Finally, the user's dynamic interests and general preferences are fused, and the next recommendation is predicted. We evaluate RP-SANRec on three public datasets under the HR@10 and NDCG@10 metrics. Extensive experiments show that RP-SANRec outperforms existing models.
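The pipeline described in the abstract (GRU-derived position weights enhancing the self-attention input, fused with a long-term preference vector) can be sketched in plain NumPy. This is a minimal illustration under stated assumptions, not the paper's exact formulation: the additive position-weight enhancement, the mean-pooled stand-in for the Bi-LSTM preference vector, the additive fusion, and all dimension choices are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def simple_gru(seq, Wz, Uz, Wr, Ur, Wh, Uh):
    """Run a minimal GRU over seq of shape (T, d); return all hidden states (T, d)."""
    h = np.zeros(seq.shape[1])
    states = []
    for x in seq:
        z = 1.0 / (1.0 + np.exp(-(Wz @ x + Uz @ h)))      # update gate
        r = 1.0 / (1.0 + np.exp(-(Wr @ x + Ur @ h)))      # reset gate
        h_tilde = np.tanh(Wh @ x + Uh @ (r * h))          # candidate state
        h = (1 - z) * h + z * h_tilde
        states.append(h)
    return np.stack(states)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over X of shape (T, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(K.shape[-1]), axis=-1)
    return A @ V

T, d = 5, 8                                      # toy sequence length and embedding size
item_emb = rng.normal(scale=0.3, size=(T, d))    # embedded historical interaction sequence
gru_params = [rng.normal(scale=0.3, size=(d, d)) for _ in range(6)]
attn_params = [rng.normal(scale=0.3, size=(d, d)) for _ in range(3)]

# Short-term module: GRU hidden states act as position weights that
# enhance the self-attention input (additive enhancement is an assumption).
pos_w = simple_gru(item_emb, *gru_params)
short_term = self_attention(item_emb + pos_w, *attn_params)

# Long-term module: mean pooling stands in for the Bi-LSTM preference vector.
long_term = item_emb.mean(axis=0)

# Fuse dynamic interest (last attention state) with general preference,
# then score candidate items by dot product.
user_repr = short_term[-1] + long_term
scores = user_repr @ item_emb.T                  # shape (T,): one score per candidate
print(scores.shape)
```

The point of the sketch is the data flow, not the parameterization: the recurrent pass injects order information before attention, so the attention layer no longer has to recover position purely from learned positional embeddings.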
Pages: 2457-2466
Page count: 10
Related papers
50 records in total
  • [41] Modeling Long- and Short-Term User Behaviors for Sequential Recommendation with Deep Neural Networks
    Yan, Cairong
    Wang, Yiwei
    Zhang, Yanting
    Wang, Zijian
    Wang, Pengwei
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [42] A performance degradation prediction model for PEMFC based on bi-directional long short-term memory and multi-head self-attention mechanism
    Jia, Chunchun
    He, Hongwen
    Zhou, Jiaming
    Li, Kunang
    Li, Jianwei
    Wei, Zhongbao
    INTERNATIONAL JOURNAL OF HYDROGEN ENERGY, 2024, 60 : 133 - 146
  • [43] Enhanced Attention Framework for Multi-Interest Sequential Recommendation
    Yin, Dapeng
    Feng, Shuang
    IEEE ACCESS, 2022, 10 : 67703 - 67712
  • [44] Déjà vu: A Contextualized Temporal Attention Mechanism for Sequential Recommendation
    Wu, Jibang
    Cai, Renqin
    Wang, Hongning
    WEB CONFERENCE 2020: PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE (WWW 2020), 2020, : 2199 - 2209
  • [45] A novel approach for remaining useful life prediction of high-reliability equipment based on long short-term memory and multi-head self-attention mechanism
    Al-Dahidi, Sameer
    Rashed, Mohammad
    Abu-Shams, Mohammad
    Mellal, Mohamed Arezki
    Alrbai, Mohammad
    Ramadan, Saleem
    Zio, Enrico
    QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, 2024, 40 (02) : 948 - 969
  • [46] Sequence and graph structure co-awareness via gating mechanism and self-attention for session-based recommendation
    Jingjing Qiao
    Li Wang
    Liguo Duan
    International Journal of Machine Learning and Cybernetics, 2021, 12 : 2591 - 2605
  • [47] Self-Attention Enhanced Recurrent Neural Networks for Sentence Classification
    Kumar, Ankit
    Rastogi, Reshma
    2018 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI), 2018, : 905 - 911
  • [48] Long- and Short-Term Preference Model Based on Graph Embedding for Sequential Recommendation
    Liu, Yu
    Zhu, Haiping
    Chen, Yan
    Tian, Feng
    Ma, Dailusi
    Zeng, Jiangwei
    Zheng, Qinghua
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, DASFAA 2020, 2020, 12115 : 241 - 257
  • [50] Self-Attention ConvLSTM for Spatiotemporal Forecasting of Short-Term Online Car-Hailing Demand
    Ge, Hongxia
    Li, Siteng
    Cheng, Rongjun
    Chen, Zhenlei
    SUSTAINABILITY, 2022, 14 (12)