Personalized multi-head self-attention network for news recommendation

Cited: 1
Authors
Zheng, Cong [1]
Song, Yixuan [2]
Affiliations
[1] Global Energy Interconnect Grp Co Ltd, 8 Xuanwumennei St, Beijing, Peoples R China
[2] Baidu Online Network Technol Beijing Co Ltd, Baidu Campus, 10 Shangdi 10th St, Beijing, Peoples R China
Keywords
News recommendation; Natural language processing; Multi-head self-attention; Neural networks; Embedding
DOI
10.1016/j.neunet.2024.106824
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
With the rapid explosion of online news and the user population, personalized news recommender systems have proven to be an effective way of alleviating information overload by suggesting content that matches users' tastes. Exploring the relationships among words and among news articles is critical for structurally modeling users' latent tastes, including their domains of interest, while selecting informative words and news articles directly reflects users' explicit interests. Most current studies do not provide a framework that systematically combines distilling users' latent interest spaces with identifying their explicit interest points. Moreover, introducing ever more advanced techniques merely to chase accuracy has become a widespread practice. In this study, we design a Personalized Multi-Head Self-Attention Network (PMSN) for news recommendation, which combines a multi-head self-attention network with a personalized attention mechanism at both the word and news levels. The multi-head self-attention mechanism models interactions among words and among news articles to explore latent interests. The personalized attention mechanism embeds users' IDs to highlight informative words and news articles, which enhances the interpretability of personalization. Comprehensive experiments on two real-world datasets demonstrate that PMSN outperforms state-of-the-art methods in recommendation accuracy without a complicated structural design or heavy consumption of external resources. Furthermore, a visualized case study confirms that the attention mechanism indeed improves interpretability.
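The architecture described above (word-level and news-level multi-head self-attention combined with user-ID-conditioned attention pooling) can be sketched compactly. The following PyTorch code is a minimal sketch assembled from the abstract alone: the embedding sizes, the dot-product click score, and all module names (PersonalizedAttention, NewsEncoder, PMSNSketch) are assumptions for illustration, not the authors' implementation.

# Minimal PyTorch sketch of the PMSN idea from the abstract: word-level
# multi-head self-attention followed by personalized (user-ID-conditioned)
# attention pooling, repeated at the news level. Sizes and names are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PersonalizedAttention(nn.Module):
    """Pools a sequence using an attention query derived from the user embedding."""

    def __init__(self, dim, user_dim):
        super().__init__()
        self.query_proj = nn.Linear(user_dim, dim)  # user ID embedding -> query
        self.key_proj = nn.Linear(dim, dim)

    def forward(self, seq, user_vec):
        # seq: (B, L, D); user_vec: (B, U)
        query = torch.tanh(self.query_proj(user_vec)).unsqueeze(1)   # (B, 1, D)
        keys = torch.tanh(self.key_proj(seq))                        # (B, L, D)
        scores = torch.bmm(query, keys.transpose(1, 2)).squeeze(1)   # (B, L)
        weights = F.softmax(scores, dim=-1)                          # highlight informative items
        return torch.bmm(weights.unsqueeze(1), seq).squeeze(1)       # (B, D)


class NewsEncoder(nn.Module):
    """Title words -> multi-head self-attention -> personalized pooling."""

    def __init__(self, vocab_size, dim=128, user_dim=64, heads=8):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, dim, padding_idx=0)
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.pool = PersonalizedAttention(dim, user_dim)

    def forward(self, title_ids, user_vec):
        words = self.word_emb(title_ids)                 # (B, L, D)
        words, _ = self.self_attn(words, words, words)   # interactions among words
        return self.pool(words, user_vec)                # (B, D)


class PMSNSketch(nn.Module):
    """User encoder mirrors the news encoder at the news level; click score is a dot product."""

    def __init__(self, vocab_size, num_users, dim=128, user_dim=64, heads=8):
        super().__init__()
        self.user_id_emb = nn.Embedding(num_users, user_dim)
        self.news_encoder = NewsEncoder(vocab_size, dim, user_dim, heads)
        self.news_self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.news_pool = PersonalizedAttention(dim, user_dim)

    def forward(self, user_ids, clicked_titles, candidate_title):
        # clicked_titles: (B, N, L); candidate_title: (B, L)
        user_vec = self.user_id_emb(user_ids)                                 # (B, U)
        B, N, L = clicked_titles.shape
        flat = clicked_titles.reshape(B * N, L)
        rep_user = user_vec.unsqueeze(1).expand(B, N, -1).reshape(B * N, -1)
        clicked_vecs = self.news_encoder(flat, rep_user).reshape(B, N, -1)    # (B, N, D)
        clicked_vecs, _ = self.news_self_attn(clicked_vecs, clicked_vecs, clicked_vecs)
        user_repr = self.news_pool(clicked_vecs, user_vec)                    # (B, D)
        cand_repr = self.news_encoder(candidate_title, user_vec)              # (B, D)
        return (user_repr * cand_repr).sum(-1)                                # click score

In practice, training such a model would typically pair each clicked article with several sampled non-clicked ones and optimize a classification loss over the resulting scores, but those details are not specified in the abstract.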
Pages: 14
Related Papers
50 records in total
  • [31] Wang, Yue; Yang, Guanci; Li, Shaobo; Li, Yang; He, Ling; Liu, Dan. Arrhythmia classification algorithm based on multi-head self-attention mechanism. BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 79.
  • [32] Zhang, Tianfu; Huang, Heyan; Feng, Chong; Cao, Longbing. Enlivening Redundant Heads in Multi-head Self-attention for Machine Translation. 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021: 3238-3248.
  • [33] Zhang, Yaojie; Xu, Bing; Zhao, Tiejun. Convolutional multi-head self-attention on memory for aspect sentiment classification. IEEE-CAA JOURNAL OF AUTOMATICA SINICA, 2020, 7(04): 1038-1044.
  • [34] Leng, Xue-Liang; Miao, Xiao-Ai; Liu, Tao. Using recurrent neural network structure with Enhanced Multi-Head Self-Attention for sentiment analysis. MULTIMEDIA TOOLS AND APPLICATIONS, 2021, 80: 12581-12600.
  • [35] Zhang, Jiusi; Li, Xiang; Tian, Jilun; Luo, Hao; Yin, Shen. An integrated multi-head dual sparse self-attention network for remaining useful life prediction. RELIABILITY ENGINEERING & SYSTEM SAFETY, 2023, 233.
  • [36] Xu, Yuanxiang; Feng, Yuan; Song, Shengyu; Liu, Jiahao. A Model for Sea Ice Segmentation based on Feature Pyramid Network and Multi-head Self-attention. PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024, 2024: 97-102.
  • [37] Leng, Xue-Liang; Miao, Xiao-Ai; Liu, Tao. Using recurrent neural network structure with Enhanced Multi-Head Self-Attention for sentiment analysis. MULTIMEDIA TOOLS AND APPLICATIONS, 2021, 80(08): 12581-12600.
  • [38] Ma, Shuang; Wang, Haifeng; Yu, Zhihao; Du, Luyao; Zhang, Ming; Fu, Qingxi. AttenEpilepsy: A 2D convolutional network model based on multi-head self-attention. ENGINEERING ANALYSIS WITH BOUNDARY ELEMENTS, 2024, 169.
  • [39] Cao, Yang; Liu, Detian; Yin, Qizheng; Xue, Fei; Tang, Hengliang. MSASGCN: Multi-Head Self-Attention Spatiotemporal Graph Convolutional Network for Traffic Flow Forecasting. JOURNAL OF ADVANCED TRANSPORTATION, 2022, 2022.
  • [40] Nguyet Quang Do; Selamat, Ali; Krejcar, Ondrej; Fujita, Hamido. Detection of malicious URLs using Temporal Convolutional Network and Multi-Head Self-Attention mechanism. APPLIED SOFT COMPUTING, 2025, 169.