Improving Sequential Recommendation via Subsequence Extraction

Cited by: 1
Authors
Deng, Hangyu [1 ]
Hu, Jinglu [1 ]
Affiliations
[1] Waseda Univ, Grad Sch Informat Prod & Syst, 2-7 Hibikino, Kitakyushu, Fukuoka 8080135, Japan
Source
2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | 2022
Keywords
recommendation systems; sequential recommendation; sequence model;
DOI
10.1109/IJCNN55064.2022.9892221
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The temporal order of user behaviors, which hints at a user's preferences in the near future, plays a key role in sequential recommendation systems. To capture such patterns from behavior sequences, many recent works borrow ideas from language models and treat the task as a next-item prediction problem. This is reasonable, but it ignores the gap between user behavior data and text data. Generally speaking, user behaviors are more arbitrary than sentences in natural language: a behavior sequence usually carries multiple intentions and its exact order matters little, whereas a sentence tends to express a single meaning and different word orders can convey very different meanings. To address these issues, this study treats a user behavior sequence as a mixture of multiple subsequences. Specifically, we introduce a subsequence extraction module that assigns the items in a sequence to different subsequences according to their relationships. These subsequences are then fed into a downstream sequence model, from which we obtain several user representations. To train the whole system end to end, we design a new training strategy in which only the user representation closest to the target item is supervised. To verify the effectiveness of our method, we conduct extensive experiments on four public datasets; our method is compared with several baselines and achieves better results in most cases. Further experiments explore the properties of our model, and we also visualize the results of the subsequence extraction.
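
As a rough illustration of the pipeline described in the abstract, the following is a minimal PyTorch sketch of the idea, assuming learnable intention prototypes for a soft subsequence assignment, a shared GRU as the downstream sequence model, and per-example selection of the representation that scores the target item highest. The class name SubsequenceRecommender, these design choices, and all hyperparameters are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch (not the paper's code): items in a sequence are softly
# assigned to K subsequences via learnable prototypes, each subsequence is
# summarized by a shared GRU, and only the representation most related to the
# target item is supervised, mirroring the training strategy in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SubsequenceRecommender(nn.Module):
    def __init__(self, num_items: int, dim: int = 64, num_subseq: int = 4):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, dim, padding_idx=0)
        # Learnable "intention" prototypes used to assign items to subsequences.
        self.prototypes = nn.Parameter(torch.randn(num_subseq, dim) * 0.02)
        self.encoder = nn.GRU(dim, dim, batch_first=True)

    def forward(self, seq: torch.Tensor) -> torch.Tensor:
        # seq: (batch, length) of item IDs, 0 = padding
        x = self.item_emb(seq)                                  # (B, L, D)
        # Soft assignment of each item to one of K subsequences.
        assign = F.softmax(x @ self.prototypes.t(), dim=-1)     # (B, L, K)
        assign = assign * (seq > 0).float().unsqueeze(-1)       # mask padding
        user_reps = []
        for k in range(self.prototypes.size(0)):
            # Down-weight items that do not belong to subsequence k, then
            # encode the resulting (soft) subsequence with the shared GRU.
            xk = x * assign[..., k:k + 1]                       # (B, L, D)
            _, hk = self.encoder(xk)                            # (1, B, D)
            user_reps.append(hk.squeeze(0))
        return torch.stack(user_reps, dim=1)                    # (B, K, D)

    def loss(self, seq: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        reps = self.forward(seq)                                # (B, K, D)
        logits = reps @ self.item_emb.weight.t()                # (B, K, |I|)
        # Supervise only the user representation "nearest" to the target item:
        # per example, pick the subsequence scoring the target highest.
        target_scores = logits.gather(
            2, target.view(-1, 1, 1).expand(-1, logits.size(1), 1)
        ).squeeze(-1)                                           # (B, K)
        best = target_scores.argmax(dim=1)                      # (B,)
        chosen = logits[torch.arange(seq.size(0)), best]        # (B, |I|)
        return F.cross_entropy(chosen, target)


# Minimal usage example with random data.
model = SubsequenceRecommender(num_items=1000)
seq = torch.randint(1, 1000, (8, 20))
target = torch.randint(1, 1000, (8,))
print(model.loss(seq, target))

The soft assignment keeps the extraction step differentiable, which is one way to make end-to-end training of the whole system possible, as the abstract requires.
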
Pages: 7
Related Papers
50 records
  • [1] SSE4Rec: Sequential recommendation with subsequence extraction
    Deng, Hangyu
    Hu, Jinglu
    KNOWLEDGE-BASED SYSTEMS, 2024, 285
  • [2] Predictability-Aware Subsequence Modeling for Sequential Recommendation
    Deng, Hangyu
    Hu, Jinglu
    IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, 2024, 19 (08) : 1396 - 1404
  • [3] Improving Sequential Recommendation with Global Item Transitions and Local Subsequences
    Deng, Hangyu
    Hu, Jinglu
    IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, 2024, 19 (01) : 100 - 108
  • [4] Improving current interest with item and review sequential patterns for sequential recommendation
    Zhang, Jinjin
    Mu, Xiaodong
    Zhao, Peng
    Kang, Kai
    Ma, Chenhui
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2021, 104
  • [5] Improving Sequential Recommendation with Knowledge-Enhanced Memory Networks
    Huang, Jin
    Zhao, Wayne Xin
    Dou, Hongjian
    Wen, Ji-Rong
    Chang, Edward Y.
    ACM/SIGIR PROCEEDINGS 2018, 2018 : 505 - 514
  • [6] ENHANCING SEQUENTIAL RECOMMENDATION MODELING VIA ADVERSARIAL TRAINING
    Zhang, Yabin
    Chen, Xu
    2024 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME 2024, 2024
  • [7] Quantifying predictability of sequential recommendation via logical constraints
    Xu, En
    Yu, Zhiwen
    Li, Nuo
    Cui, Helei
    Yao, Lina
    Guo, Bin
    FRONTIERS OF COMPUTER SCIENCE, 2023, 17 (05)
  • [8] Enhancing Sequential Recommendation via Decoupled Knowledge Graphs
    Wu, Bingchao
    Deng, Chenglong
    Guan, Bei
    Wang, Yongji
    Kangyang, Yuxuan
    SEMANTIC WEB, ESWC 2022, 2022, 13261 : 3 - 20
  • [9] CoBERT: Scientific Collaboration Prediction via Sequential Recommendation
    Koopmann, Tobias
    Kobs, Konstantin
    Herud, Konstantin
    Hotho, Andreas
    21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS ICDMW 2021, 2021 : 45 - 54
  • [10] Improving graph neural network for session-based recommendation system via non-sequential interactions
    Gwadabe, Tajuddeen Rabiu
    Liu, Ying
    NEUROCOMPUTING, 2022, 468 : 111 - 122