SanMove: next location recommendation via self-attention network

Cited by: 1
Authors
Wang, Bin [1 ]
Li, Huifeng [1 ]
Tong, Le [1 ]
Zhang, Qian [1 ]
Zhu, Sulei [1 ]
Yang, Tao [2 ]
Affiliations
[1] Shanghai Normal Univ, Shanghai, Peoples R China
[2] Shanghai Urban & Rural Construct & Traff Dev Res I, Shanghai, Peoples R China
Keywords
Next location prediction; Self-attention network; Auxiliary information; PREDICTION;
DOI
10.1108/DTA-03-2022-0093
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Purpose: This paper aims to address the following issues: (1) most existing methods are based on recurrent networks, which are time-consuming to train on long sequences because they do not allow full parallelism; (2) personalized preferences are generally not modeled adequately; (3) existing methods have rarely studied systematically how to efficiently exploit the various auxiliary information in trajectory data (e.g. user ID and timestamp) and the spatiotemporal relations among nonconsecutive locations.
Design/methodology/approach: The authors propose a novel self-attention network-based model named SanMove that predicts the next location by capturing users' long- and short-term mobility patterns. Specifically, SanMove uses a self-attention module to capture each user's long-term preference, which represents her personalized location preference. Meanwhile, the authors use a spatial-temporal guided noninvasive self-attention (STNOVA) module to exploit auxiliary information in the trajectory data and learn the user's short-term preference.
Findings: The authors evaluate SanMove on two real-world datasets. The experimental results demonstrate that SanMove is not only faster to train than state-of-the-art recurrent neural network (RNN)-based prediction models but also outperforms the baselines on next location prediction.
Originality/value: The authors propose a self-attention-based sequential model named SanMove to predict users' trajectories, comprising long-term and short-term preference learning modules. SanMove allows fully parallel processing of trajectories, improving processing efficiency. The authors propose an STNOVA module to capture the sequential transitions of current trajectories. Moreover, a self-attention module processes historical trajectory sequences to capture the personalized location preference of each user. Extensive experiments on two check-in datasets demonstrate that the model trains quickly and performs well compared with existing RNN-based methods for next location prediction.
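The abstract's core mechanism is self-attention over a user's visit sequence, which, unlike an RNN, scores all position pairs at once and so parallelizes over the sequence. As a rough, dependency-free illustration only (this is not the authors' implementation: SanMove's learned projection matrices and its STNOVA auxiliary-information encoding are not specified in this record), scaled dot-product self-attention over location embeddings can be sketched as:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(seq, d):
    """Scaled dot-product self-attention over a visit sequence.

    seq: list of d-dimensional embedding vectors, one per visited location.
    Returns one context vector per position, formed as a weighted mix of all
    positions. Identity projections stand in for the learned Q/K/V matrices
    to keep the sketch dependency-free.
    """
    out = []
    for q in seq:
        # Similarity of this visit to every visit, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in seq]
        weights = softmax(scores)  # attention weights, sum to 1
        # Context vector: convex combination of all location embeddings.
        ctx = [sum(w * v[i] for w, v in zip(weights, seq)) for i in range(d)]
        out.append(ctx)
    return out
```

In a full model, the context vector at the last position would be compared (e.g. by dot product) against candidate location embeddings to rank the next location; SanMove additionally fuses user ID and timestamp signals through its STNOVA module rather than attending over raw embeddings alone.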
Pages: 330-343 (14 pages)
Related Papers
50 records
  • [1] Spiking neural self-attention network for sequence recommendation
    Bai, Xinzhu
    Huang, Yanping
    Peng, Hong
    Yang, Qian
    Wang, Jun
    Liu, Zhicai
    APPLIED SOFT COMPUTING, 2025, 169
  • [2] CSAN: Contextual Self-Attention Network for User Sequential Recommendation
    Huang, Xiaowen
    Qian, Shengsheng
    Fang, Quan
    Sang, Jitao
    Xu, Changsheng
    PROCEEDINGS OF THE 2018 ACM MULTIMEDIA CONFERENCE (MM'18), 2018, : 447 - 455
  • [3] Modeling Periodic Pattern with Self-Attention Network for Sequential Recommendation
    Ma, Jun
    Zhao, Pengpeng
    Liu, Yanchi
    Sheng, Victor S.
    Xu, Jiajie
    Zhao, Lei
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS (DASFAA 2020), PT III, 2020, 12114 : 557 - 572
  • [4] Graph contextualized self-attention network for software service sequential recommendation
    Fu, Zixuan
    Wang, Chenghua
    Xu, Jiajie
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2023, 149 : 509 - 517
  • [5] A Dual-View Knowledge Enhancing Self-Attention Network for Sequential Recommendation
    Tang, Hao
    Zhang, Feng
    Xu, Xinhai
    Zhang, Jieyuan
    Liu, Donghong
    2022 IEEE 34TH INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE, ICTAI, 2022, : 832 - 839
  • [6] Long- and short-term self-attention network for sequential recommendation
    Xu, Chengfeng
    Feng, Jian
    Zhao, Pengpeng
    Zhuang, Fuzhen
    Wang, Deqing
    Liu, Yanchi
    Sheng, Victor S.
    NEUROCOMPUTING, 2021, 423 : 580 - 589
  • [7] Self-Attention Network for Session-Based Recommendation With Streaming Data Input
    Sun, Shiming
    Tang, Yuanhe
    Dai, Zemei
    Zhou, Fu
    IEEE ACCESS, 2019, 7 : 110499 - 110509
  • [8] Feature-Level Deeper Self-Attention Network With Contrastive Learning for Sequential Recommendation
    Hao, Yongjing
    Zhang, Tingting
    Zhao, Pengpeng
    Liu, Yanchi
    Sheng, Victor S.
    Xu, Jiajie
    Liu, Guanfeng
    Zhou, Xiaofang
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (10) : 10112 - 10124
  • [9] Integrating the Pre-trained Item Representations with Reformed Self-attention Network for Sequential Recommendation
    Liang, Guanzhong
    Liao, Jie
    Zhou, Wei
    Wen, Junhao
    2022 IEEE INTERNATIONAL CONFERENCE ON WEB SERVICES (IEEE ICWS 2022), 2022, : 27 - 36
  • [10] In-depth Recommendation Model Based on Self-Attention Factorization
    Ma, Hongshuang
    Liu, Qicheng
    KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS, 2023, 17 (03): : 721 - 739