Personalized multi-head self-attention network for news recommendation

Cited: 1
Authors
Zheng, Cong [1 ]
Song, Yixuan [2 ]
Affiliations
[1] Global Energy Interconnect Grp Co Ltd, 8 Xuanwumennei St, Beijing, Peoples R China
[2] Baidu Online Network Technol Beijing Co Ltd, Baidu Campus,10 Shangdi 10th St, Beijing, Peoples R China
Keywords
News recommendation; Natural language processing; Multi-head self-attention; Neural networks; Embedding
DOI
10.1016/j.neunet.2024.106824
CLC number
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
With the rapid growth of online news and its user population, personalized news recommender systems have proved to be an efficient way of alleviating information overload by suggesting information that matches users' tastes. Exploring relationships among words and news is critical to structurally model users' latent tastes, including their domains of interest, while selecting informative words and news directly reflects users' interests. Most current studies do not provide an effective framework that systematically combines distilling users' latent interest spaces with their explicit points of interest. Moreover, introducing ever more advanced techniques merely to chase accuracy has become a universal phenomenon. In this study, we design a Personalized Multi-head Self-attention Network (PMSN) for news recommendation, which combines a multi-head self-attention network with a personalized attention mechanism at both the word and news levels. The multi-head self-attention mechanism models interactions among words and news to explore latent interests. The personalized attention mechanism embeds users' IDs to highlight informative words and news, which enhances the interpretability of personalization. Comprehensive experiments on two real-world datasets demonstrate that PMSN outperforms state-of-the-art methods in recommendation accuracy, without complicated structural design or excessive consumption of external resources. Furthermore, a visualized case study validates that the attention mechanism indeed improves interpretability.
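The abstract describes two stages: multi-head self-attention that models interactions among words, and a personalized attention mechanism that derives a query from a user-ID embedding to weight the informative words. A minimal NumPy sketch of these two stages follows; all dimensions, weight matrices (`W_q`, `W_k`, `W_v`, `W_p`), and the `tanh` query projection are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, n_heads, W_q, W_k, W_v):
    """X: (seq_len, d_model) word embeddings; weights: (d_model, d_model)."""
    seq_len, d_model = X.shape
    d_head = d_model // n_heads
    # Project, then split into heads: (n_heads, seq_len, d_head)
    Q = (X @ W_q).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    K = (X @ W_k).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    V = (X @ W_v).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    # Scaled dot-product attention per head: (n_heads, seq_len, seq_len)
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)
    out = softmax(scores, axis=-1) @ V
    # Concatenate heads back to (seq_len, d_model)
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)

def personalized_attention(H, user_emb, W_p):
    """H: (seq_len, d_model) contextual word vectors from self-attention.
    The user-ID embedding is projected into a query that scores each word."""
    q = np.tanh(user_emb @ W_p)        # (d_model,) personalized query
    alpha = softmax(H @ q)             # (seq_len,) per-word attention weights
    return alpha @ H, alpha            # weighted sum -> news representation

# Toy dimensions (assumed, for illustration only)
d_model, d_user, seq_len, n_heads = 8, 4, 5, 2
X = rng.normal(size=(seq_len, d_model))
W_q, W_k, W_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
W_p = rng.normal(size=(d_user, d_model))
user_emb = rng.normal(size=d_user)    # stands in for a learned user-ID embedding

H = multi_head_self_attention(X, n_heads, W_q, W_k, W_v)
news_vec, alpha = personalized_attention(H, user_emb, W_p)
```

In the paper's design the same two-level idea is applied again at the news level, with another self-attention layer over the user's clicked-news vectors and another user-personalized query aggregating them into a user representation; the sketch above shows only the word level.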
Pages: 14
Related papers (50 records)
  • [1] Personalized News Recommendation with CNN and Multi-Head Self-Attention
    Li, Aibin
    He, Tingnian
    Guo, Yi
    Li, Zhuoran
    Rong, Yixuan
    Liu, Guoqi
    2022 IEEE 13TH ANNUAL UBIQUITOUS COMPUTING, ELECTRONICS & MOBILE COMMUNICATION CONFERENCE (UEMCON), 2022, : 102 - 108
  • [2] Personalized News Recommendation Method with Double-Layer Residual Connections and Double Multi-Head Self-Attention Mechanisms
    Zhang, Dehai
    Zhu, Zhaoyang
    Wang, Zhengwu
    Wang, Jianxin
    Xiao, Liang
    Chen, Yin
    Zhao, Di
    APPLIED SCIENCES-BASEL, 2024, 14 (13)
  • [3] Learning Contextual Features with Multi-head Self-attention for Fake News Detection
    Wang, Yangqian
    Han, Hao
    Ding, Ye
    Wang, Xuan
    Liao, Qing
    COGNITIVE COMPUTING - ICCC 2019, 2019, 11518 : 132 - 142
  • [4] Dual-stream fusion network with multi-head self-attention for multi-modal fake news detection
    Yang, Yimei
    Liu, Jinping
    Yang, Yujun
    Cen, Lihui
    APPLIED SOFT COMPUTING, 2024, 167
  • [5] MSnet: Multi-Head Self-Attention Network for Distantly Supervised Relation Extraction
    Sun, Tingting
    Zhang, Chunhong
    Ji, Yang
    Hu, Zheng
    IEEE ACCESS, 2019, 7 : 54472 - 54482
  • [6] DILATED RESIDUAL NETWORK WITH MULTI-HEAD SELF-ATTENTION FOR SPEECH EMOTION RECOGNITION
    Li, Runnan
    Wu, Zhiyong
    Jia, Jia
    Zhao, Sheng
    Meng, Helen
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 6675 - 6679
  • [7] Multi-modal multi-head self-attention for medical VQA
    Joshi, Vasudha
    Mitra, Pabitra
    Bose, Supratik
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 83 (14) : 42585 - 42608
  • [9] An interactive multi-head self-attention capsule network model for aspect sentiment classification
    Lina She
    Hongfang Gong
    Siyu Zhang
    The Journal of Supercomputing, 2024, 80 : 9327 - 9352