Long short-term enhanced memory for sequential recommendation

Cited by: 23
Authors
Duan, Jiasheng [1 ]
Zhang, Peng-Fei [1 ]
Qiu, Ruihong [1 ]
Huang, Zi [1 ]
Affiliations
[1] Univ Queensland, Sch Informat Technol & Elect Engn, Brisbane, Qld, Australia
Source
WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS | 2023, Vol. 26, Issue 2
Keywords
Sequential recommendation; LSTM; Memory enhanced;
DOI
10.1007/s11280-022-01056-9
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Sequential recommendation is a stream of research on recommender systems that focuses on predicting the next item a user will interact with by modeling the dynamic sequence of user-item interactions. Because they are designed to capture the dynamics of variable-length temporal sequences, Recurrent Neural Networks (RNNs) have received much attention in this area. However, inherent defects of the RNN network structure limit their application to sequential recommendation, mainly in two respects: RNNs tend to make point-wise predictions and ignore collective dependencies, because they treat the temporal relationships between items as changing monotonically; and RNNs are likely to forget essential information when processing long sequences. To address these problems, much work has been done to enhance the memory mechanism of RNNs. Although previous RNN-based methods achieve promising performance by exploiting external knowledge and other advanced techniques, improving the intrinsic properties of existing RNNs has not been explored and remains challenging. Therefore, in this work we propose Long Short-Term enhanced Memory (LSTeM), a novel architecture for sequential recommendation built on Long Short-Term Memory (LSTM), a widely used RNN variant, which strengthens the memory mechanism of the original LSTM in two ways. First, we design a new gate structure by introducing a "Q-K-V" triplet, a mechanism that accurately models the correlation between the current item and the user's historical behaviors at each time step. Second, we propose a "recover gate" that works with a dynamic global memory embedding to remedy the memory loss caused by the forgetting mechanism. Extensive experiments demonstrate that LSTeM achieves performance comparable to state-of-the-art methods on challenging sequential recommendation datasets.
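To make the architecture described in the abstract concrete, the sketch below shows how such a recurrent cell could look in PyTorch: an LSTM-style cell whose gates are driven by a "Q-K-V" correlation between the current item and the previous hidden state, plus a recover gate that re-injects information from a dynamic global memory. This is a minimal illustration under stated assumptions only; the class name LSTeMCellSketch, the choice of query/key/value inputs, the exact gating equations, and the moving-average global memory update are hypothetical, not the formulation given in the paper.

```python
# Minimal sketch (not the authors' implementation) of an LSTM-like cell with
# "Q-K-V"-style gating and a recover gate over a global memory.
import torch
import torch.nn as nn


class LSTeMCellSketch(nn.Module):
    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        self.hidden_dim = hidden_dim
        # Q-K-V projections: query from the current item embedding,
        # key/value from the previous hidden state (hypothetical choice).
        self.q_proj = nn.Linear(input_dim, hidden_dim)
        self.k_proj = nn.Linear(hidden_dim, hidden_dim)
        self.v_proj = nn.Linear(hidden_dim, hidden_dim)
        # LSTM-style gates driven by the item and the Q-K-V correlation signal.
        self.forget_gate = nn.Linear(input_dim + hidden_dim, hidden_dim)
        self.input_gate = nn.Linear(input_dim + hidden_dim, hidden_dim)
        self.output_gate = nn.Linear(input_dim + hidden_dim, hidden_dim)
        self.cell_cand = nn.Linear(input_dim + hidden_dim, hidden_dim)
        # Recover gate: how much of the global memory to re-inject into the
        # cell state to compensate for what the forget gate dropped.
        self.recover_gate = nn.Linear(input_dim + hidden_dim, hidden_dim)

    def forward(self, x_t, h_prev, c_prev, global_mem):
        # Correlation between the current item (query) and the summary of the
        # user's recent behavior (key/value) via a scaled dot product.
        q = self.q_proj(x_t)                                   # (B, H)
        k = self.k_proj(h_prev)                                # (B, H)
        v = self.v_proj(h_prev)                                # (B, H)
        score = (q * k).sum(-1, keepdim=True) / self.hidden_dim ** 0.5
        corr = torch.sigmoid(score) * v                        # (B, H)

        z = torch.cat([x_t, corr], dim=-1)                     # gate input
        f = torch.sigmoid(self.forget_gate(z))
        i = torch.sigmoid(self.input_gate(z))
        o = torch.sigmoid(self.output_gate(z))
        c_tilde = torch.tanh(self.cell_cand(z))

        # Recover gate re-injects information from a dynamic global memory,
        # modelled here as a running mix of past cell states (assumption).
        r = torch.sigmoid(self.recover_gate(z))
        c_t = f * c_prev + i * c_tilde + r * global_mem
        h_t = o * torch.tanh(c_t)

        # Update the global memory with the new cell state.
        global_mem = 0.9 * global_mem + 0.1 * c_t
        return h_t, c_t, global_mem


if __name__ == "__main__":
    B, D, H = 4, 32, 64
    cell = LSTeMCellSketch(D, H)
    h = c = g = torch.zeros(B, H)
    for _ in range(10):                                        # toy sequence
        x = torch.randn(B, D)
        h, c, g = cell(x, h, c, g)
    print(h.shape)  # torch.Size([4, 64])
```

The recover gate is applied additively to the cell state so that information discarded by the forget gate can still flow back in from the global memory; how that global memory is maintained is a design choice left open in this sketch.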
Pages: 561-583
Number of pages: 23