Personalized multi-head self-attention network for news recommendation

Cited: 1
Authors
Zheng, Cong [1 ]
Song, Yixuan [2 ]
Affiliations
[1] Global Energy Interconnect Grp Co Ltd, 8 Xuanwumennei St, Beijing, Peoples R China
[2] Baidu Online Network Technol Beijing Co Ltd, Baidu Campus,10 Shangdi 10th St, Beijing, Peoples R China
Keywords
News recommendation; Natural language processing; Multi-head self-attention; Neural networks; Embedding;
DOI
10.1016/j.neunet.2024.106824
CLC classification
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
With the rapid explosion of online news and the user population, personalized news recommender systems have proved to be efficient ways of alleviating information overload by suggesting information that matches users' tastes. Exploring relationships among words and news is critical to structurally modeling users' latent tastes, including their domains of interest, while selecting informative words and news directly reflects users' interests. Most current studies do not provide an effective framework that systematically combines distilling users' latent interest spaces with their explicit points of interest. Moreover, introducing ever more advanced techniques merely to chase accuracy has become a universal phenomenon. In this study, we design a Personalized Multi-Head Self-Attention Network (PMSN) for news recommendation, which combines a multi-head self-attention network with a personalized attention mechanism at both the word and news levels. The multi-head self-attention mechanism models interactions among words and news to explore latent interests. The personalized attention mechanism embeds users' IDs to highlight informative words and news, which enhances the interpretability of personalization. Comprehensive experiments conducted on two real-world datasets demonstrate that PMSN outperforms state-of-the-art methods in recommendation accuracy, without complicated structure design or the consumption of external resources. Furthermore, a visualized case study validates that the attention mechanism indeed increases interpretability.
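The abstract describes two building blocks: multi-head self-attention that models interactions among words (or news) to capture latent interests, and a personalized attention pooling whose query is derived from the user's ID embedding to highlight informative items. The following NumPy sketch illustrates both operations in an unbatched, single-sequence setting; it is not the authors' implementation, and all function names, weight shapes, and initialization choices here are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, Wq, Wk, Wv, n_heads):
    """Model pairwise interactions among the L items (words or news) in X.

    X: (L, d) item representations; Wq, Wk, Wv: (d, d) projections.
    Returns (L, d) contextualized representations (heads concatenated).
    """
    L, d = X.shape
    dh = d // n_heads  # per-head dimension
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    heads = []
    for h in range(n_heads):
        q = Q[:, h * dh:(h + 1) * dh]
        k = K[:, h * dh:(h + 1) * dh]
        v = V[:, h * dh:(h + 1) * dh]
        attn = softmax(q @ k.T / np.sqrt(dh), axis=-1)  # (L, L) item-item weights
        heads.append(attn @ v)
    return np.concatenate(heads, axis=-1)

def personalized_attention_pool(H, user_emb, Wp):
    """Pool item representations H (L, d) into a single (d,) vector.

    The attention query is derived from the user's ID embedding, so the
    weights indicate which items this particular user finds informative.
    """
    query = np.tanh(user_emb @ Wp)   # (d,) user-conditioned query
    scores = softmax(H @ query)      # (L,) per-item attention weights
    return scores @ H                # (d,) weighted sum
```

In the paper's two-level design, this pair of operations would be applied first over the words of each news article (yielding news vectors) and then over the user's clicked news (yielding the user vector); the attention weights are also what the visualized case study inspects for interpretability.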
Pages: 14
Related papers
50 in total
  • [41] MS-Pointer Network: Abstractive Text Summary Based on Multi-Head Self-Attention
    Guo, Qian
    Huang, Jifeng
    Xiong, Naixue
    Wang, Pan
    IEEE ACCESS, 2019, 7 : 138603 - 138613
  • [42] Lane Detection Method Based on Improved Multi-Head Self-Attention
    Ge, Zekun
    Tao, Fazhan
    Fu, Zhumu
    Song, Shuzhong
    Computer Engineering and Applications, 60 (02): 264 - 271
  • [43] MSIN: An Efficient Multi-head Self-attention Framework for Inertial Navigation
    Shi, Gaotao
    Pan, Bingjia
    Ni, Yuzhi
    ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING, ICA3PP 2023, PT I, 2024, 14487 : 455 - 473
  • [44] Local Multi-Head Channel Self-Attention for Facial Expression Recognition
    Pecoraro, Roberto
    Basile, Valerio
    Bono, Viviana
    INFORMATION, 2022, 13 (09)
  • [45] Convolutional Multi-Head Self-Attention on Memory for Aspect Sentiment Classification
    Zhang, Yaojie
    Xu, Bing
    Zhao, Tiejun
    IEEE/CAA Journal of Automatica Sinica, 2020, 7 (04): 1038 - 1044
  • [46] SQL Injection Detection Based on Lightweight Multi-Head Self-Attention
    Lo, Rui-Teng
    Hwang, Wen-Jyi
    Tai, Tsung-Ming
    APPLIED SCIENCES-BASEL, 2025, 15 (02)
  • [47] An Effective Hyperspectral Image Classification Network Based on Multi-Head Self-Attention and Spectral-Coordinate Attention
    Zhang, Minghua
    Duan, Yuxia
    Song, Wei
    Mei, Haibin
    He, Qi
    JOURNAL OF IMAGING, 2023, 9 (07)
  • [48] Speech enhancement method based on the multi-head self-attention mechanism
    Chang X.
    Zhang Y.
    Yang L.
    Kou J.
    Wang X.
    Xu D.
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2020, 47 (01): 104 - 110
  • [49] Hunt for Unseen Intrusion: Multi-Head Self-Attention Neural Detector
    Seo, Seongyun
    Han, Sungmin
    Park, Janghyeon
    Shim, Shinwoo
    Ryu, Han-Eul
    Cho, Byoungmo
    Lee, Sangkyun
    IEEE ACCESS, 2021, 9 : 129635 - 129647
  • [50] A multi-head self-attention deep learning approach for detection and recommendation of neuromagnetic high frequency oscillations in epilepsy
    Zhao, Xiangyu
    Peng, Xueping
    Niu, Ke
    Li, Hailong
    He, Lili
    Yang, Feng
    Wu, Ting
    Chen, Duo
    Zhang, Qiusi
    Ouyang, Menglin
    Guo, Jiayang
    Pan, Yijie
    FRONTIERS IN NEUROINFORMATICS, 2022, 16