Personalized multi-head self-attention network for news recommendation

Cited by: 1
Authors
Zheng, Cong [1 ]
Song, Yixuan [2 ]
Affiliations
[1] Global Energy Interconnect Grp Co Ltd, 8 Xuanwumennei St, Beijing, Peoples R China
[2] Baidu Online Network Technol Beijing Co Ltd, Baidu Campus,10 Shangdi 10th St, Beijing, Peoples R China
Keywords
News recommendation; Natural language processing; Multi-head self-attention; Neural networks; Embedding;
DOI
10.1016/j.neunet.2024.106824
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the rapid growth of online news and its user population, personalized news recommender systems have proved effective at alleviating information overload by suggesting content that matches users' tastes. Exploring relationships among words and news items is critical to structurally modeling users' latent tastes, including the domains they are interested in, while selecting informative words and news items directly reflects users' explicit interests. Most current studies do not provide an effective framework that systematically combines distilling users' latent interest spaces with their explicit interest signals. Moreover, introducing ever more advanced techniques merely to chase accuracy has become a common practice. In this study, we design a Personalized Multi-Head Self-Attention Network (PMSN) for news recommendation, which combines a multi-head self-attention network with a personalized attention mechanism at both the word and news levels. The multi-head self-attention mechanism models interactions among words and among news items, exploring latent interests. The personalized attention mechanism embeds users' IDs to highlight informative words and news items, which enhances the interpretability of personalization. Comprehensive experiments on two real-world datasets demonstrate that PMSN outperforms state-of-the-art methods in recommendation accuracy, without complicated structural design or reliance on external resources. Furthermore, a visualized case study validates that the attention mechanism indeed improves interpretability.
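As a concrete illustration of the architecture outlined in the abstract, below is a minimal PyTorch sketch of a news encoder that combines multi-head self-attention over title words with a user-ID-conditioned (personalized) attention pooling. The class name, dimensions, and the tanh-projected user query are assumptions for illustration only, not the authors' implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class PersonalizedNewsEncoder(nn.Module):
        """Encodes one news title into a user-aware vector (illustrative sketch)."""
        def __init__(self, vocab_size, num_users, embed_dim=256, num_heads=8):
            super().__init__()
            self.word_embed = nn.Embedding(vocab_size, embed_dim)
            self.user_embed = nn.Embedding(num_users, embed_dim)  # user-ID embedding
            # Multi-head self-attention models interactions among title words.
            self.self_attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
            # Personalized attention: the attention query is projected from the user
            # embedding, so different users highlight different words of the same title.
            self.query_proj = nn.Linear(embed_dim, embed_dim)

        def forward(self, title_ids, user_ids):
            # title_ids: (batch, seq_len) word indices; user_ids: (batch,) user indices
            words = self.word_embed(title_ids)                          # (B, L, D)
            h, _ = self.self_attn(words, words, words)                  # word-word interactions
            q = torch.tanh(self.query_proj(self.user_embed(user_ids)))  # (B, D) personalized query
            scores = torch.bmm(h, q.unsqueeze(-1)).squeeze(-1)          # (B, L) word relevance
            alpha = F.softmax(scores, dim=-1)                           # attention over words
            return torch.bmm(alpha.unsqueeze(1), h).squeeze(1)          # (B, D) news vector

    # Example usage (shapes only):
    # encoder = PersonalizedNewsEncoder(vocab_size=30000, num_users=5000)
    # vec = encoder(torch.randint(0, 30000, (4, 20)), torch.randint(0, 5000, (4,)))

The same pattern can, in principle, be stacked at the news level, pooling the vectors of a user's browsed news with another user-conditioned attention layer to form the user representation described in the abstract.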
Pages: 14
Related papers
50 records in total
  • [1] Personalized News Recommendation with CNN and Multi-Head Self-Attention
    Li, Aibin
    He, Tingnian
    Guo, Yi
    Li, Zhuoran
    Rong, Yixuan
    Liu, Guoqi
    2022 IEEE 13TH ANNUAL UBIQUITOUS COMPUTING, ELECTRONICS & MOBILE COMMUNICATION CONFERENCE (UEMCON), 2022, : 102 - 108
  • [2] Neural News Recommendation with Multi-Head Self-Attention
    Wu, Chuhan
    Wu, Fangzhao
    Ge, Suyu
    Qi, Tao
    Huang, Yongfeng
    Xie, Xing
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 6389 - 6394
  • [3] Personalized News Recommendation Method with Double-Layer Residual Connections and Double Multi-Head Self-Attention Mechanisms
    Zhang, Dehai
    Zhu, Zhaoyang
    Wang, Zhengwu
    Wang, Jianxin
    Xiao, Liang
    Chen, Yin
    Zhao, Di
    APPLIED SCIENCES-BASEL, 2024, 14 (13):
  • [4] Multi-head enhanced self-attention network for novelty detection
    Zhang, Yingying
    Gong, Yuxin
    Zhu, Haogang
    Bai, Xiao
    Tang, Wenzhong
    PATTERN RECOGNITION, 2020, 107
  • [5] Multi-head Self-attention Recommendation Model based on Feature Interaction Enhancement
    Yin, Yunfei
    Huang, Caihao
    Sun, Jingqin
    Huang, Faliang
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 1740 - 1745
  • [6] Adaptive Pruning for Multi-Head Self-Attention
    Messaoud, Walid
    Trabelsi, Rim
    Cabani, Adnane
    Abdelkefi, Fatma
    ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, ICAISC 2023, PT II, 2023, 14126 : 48 - 57
  • [7] Research on Enhanced Multi-head Self-Attention Social Recommendation Algorithm Based on Graph Neural Network
    Teng, Yue
    Yang, Kai
    IAENG International Journal of Computer Science, 2024, 51 (07) : 754 - 764
  • [8] Learning Contextual Features with Multi-head Self-attention for Fake News Detection
    Wang, Yangqian
    Han, Hao
    Ding, Ye
    Wang, Xuan
    Liao, Qing
    COGNITIVE COMPUTING - ICCC 2019, 2019, 11518 : 132 - 142
  • [9] MHSAN: Multi-Head Self-Attention Network for Visual Semantic Embedding
    Park, Geondo
    Han, Chihye
    Kim, Daeshik
    Yoon, Wonjun
    2020 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2020, : 1507 - 1515
  • [10] Dual-stream fusion network with multi-head self-attention for multi-modal fake news detection
    Yang, Yimei
    Liu, Jinping
    Yang, Yujun
    Cen, Lihui
    APPLIED SOFT COMPUTING, 2024, 167