Personalized multi-head self-attention network for news recommendation

Cited by: 1
Authors
Zheng, Cong [1 ]
Song, Yixuan [2 ]
Affiliations
[1] Global Energy Interconnect Grp Co Ltd, 8 Xuanwumennei St, Beijing, Peoples R China
[2] Baidu Online Network Technol Beijing Co Ltd, Baidu Campus,10 Shangdi 10th St, Beijing, Peoples R China
Keywords
News recommendation; Natural language processing; Multi-head self-attention; Neural networks; Embedding;
DOI
10.1016/j.neunet.2024.106824
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the rapid explosion of online news and the user population, personalized news recommender systems have proved to be efficient ways of alleviating information overload by suggesting information that matches users' tastes. Exploring relationships among words and news is critical to structurally modeling users' latent tastes, including their domains of interest, while selecting informative words and news can directly reflect users' interests. Most current studies do not provide an effective framework that systematically combines distilling users' latent interest spaces with their explicit points of interest. Moreover, introducing ever more advanced techniques merely to chase accuracy has become a universal phenomenon. In this study, we design a Personalized Multi-Head Self-Attention Network (PMSN) for news recommendation, which combines a multi-head self-attention network with a personalized attention mechanism at both the word and news levels. The multi-head self-attention mechanism is used to model interactions among words and news, exploring latent interests. The personalized attention mechanism is applied by embedding users' IDs to highlight informative words and news, which enhances the interpretability of personalization. Comprehensive experiments conducted on two real-world datasets demonstrate that PMSN outperforms state-of-the-art methods in recommendation accuracy, without complicated structural design or exhaustive consumption of external resources. Furthermore, a visualized case study validates that the attention mechanism indeed improves interpretability.
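The abstract describes two stacked components: multi-head self-attention that models interactions among items (words within a title, or news within a user's history), followed by personalized attention that uses a user-ID embedding as the query to weight the resulting representations. The sketch below is a minimal NumPy illustration of that combination; it is not the authors' implementation, and all names, dimensions, and random projections are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, num_heads, rng):
    """Model pairwise interactions among n items (words or news) of dim d."""
    n, d = X.shape
    assert d % num_heads == 0
    dh = d // num_heads
    heads = []
    for _ in range(num_heads):
        # Random projections stand in for learned Q/K/V weights (assumption).
        Wq, Wk, Wv = (rng.standard_normal((d, dh)) / np.sqrt(d) for _ in range(3))
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        A = softmax(Q @ K.T / np.sqrt(dh), axis=-1)  # (n, n) interaction weights
        heads.append(A @ V)
    return np.concatenate(heads, axis=-1)  # (n, d)

def personalized_attention(H, user_vec):
    """Weight item representations by their relevance to the user-ID embedding."""
    scores = softmax(H @ user_vec)  # (n,) — larger score = more informative item
    return scores @ H               # (d,) user-specific aggregate representation

rng = np.random.default_rng(0)
n_words, d = 6, 8
X = rng.standard_normal((n_words, d))          # word embeddings of one news title
H = multi_head_self_attention(X, num_heads=2, rng=rng)
u = rng.standard_normal(d)                     # embedding looked up from the user's ID
news_vec = personalized_attention(H, u)        # personalized news representation
```

The same two-step pattern would then be repeated at the news level, with news representations in place of word embeddings, before scoring candidate news against the final user vector.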
Pages: 14
Related Papers
50 records
  • [31] A multi-head self-attention autoencoder network for fault detection of wind turbine gearboxes under random loads
    Yu, Xiaoxia
    Zhang, Zhigang
    Tang, Baoping
    Zhao, Minghang
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2024, 35 (08)
  • [32] A novel two-stream multi-head self-attention convolutional neural network for bearing fault diagnosis
    Ren, Hang
    Liu, Shaogang
    Wei, Fengmei
    Qiu, Bo
    Zhao, Dan
    PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART C-JOURNAL OF MECHANICAL ENGINEERING SCIENCE, 2024, 238 (11) : 5393 - 5405
  • [33] Class token and knowledge distillation for multi-head self-attention speaker verification systems
    Mingote, Victoria
    Miguel, Antonio
    Ortega, Alfonso
    Lleida, Eduardo
    DIGITAL SIGNAL PROCESSING, 2023, 133
  • [34] Click-Through Rate Prediction of Multi-Head Self-Attention in Hyperbolic Space
    Han Y.-L.
    Wang X.-Y.
    Beijing Youdian Daxue Xuebao/Journal of Beijing University of Posts and Telecommunications, 2021, 44 (05): : 127 - 132
  • [35] Phishing websites detection via CNN and multi-head self-attention on imbalanced datasets
    Xiao, Xi
    Xiao, Wentao
    Zhang, Dianyan
    Zhang, Bin
    Hu, Guangwu
    Li, Qing
    Xia, Shutao
    COMPUTERS & SECURITY, 2021, 108 (108)
  • [36] CNN-MHSA: A Convolutional Neural Network and multi-head self-attention combined approach for detecting phishing websites
    Xiao, Xi
    Zhang, Dianyan
    Hu, Guangwu
    Jiang, Yong
    Xia, Shutao
    NEURAL NETWORKS, 2020, 125 : 303 - 312
  • [37] Detection for domain generation algorithm (DGA) domain botnet based on neural network with multi-head self-attention mechanisms
    Sarojini, S.
    Asha, S.
INTERNATIONAL JOURNAL OF SYSTEM ASSURANCE ENGINEERING AND MANAGEMENT, 2022
  • [38] Self Multi-Head Attention for Speaker Recognition
    India, Miquel
    Safari, Pooyan
    Hernando, Javier
    INTERSPEECH 2019, 2019, : 4305 - 4309
  • [39] A spatial-spectral fusion convolutional transformer network with contextual multi-head self-attention for hyperspectral image classification
    Wang, Wuli
    Sun, Qi
    Zhang, Li
    Ren, Peng
    Wang, Jianbu
    Ren, Guangbo
    Liu, Baodi
    NEURAL NETWORKS, 2025, 187
  • [40] Automatic segmentation of golden pomfret based on fusion of multi-head self-attention and channel-attention mechanism
    Yu, Guoyan
    Luo, Yingtong
    Deng, Ruoling
    COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2022, 202