Personalized multi-head self-attention network for news recommendation

Times Cited: 1
|
Authors
Zheng, Cong [1 ]
Song, Yixuan [2 ]
Affiliations
[1] Global Energy Interconnect Grp Co Ltd, 8 Xuanwumennei St, Beijing, Peoples R China
[2] Baidu Online Network Technol Beijing Co Ltd, Baidu Campus,10 Shangdi 10th St, Beijing, Peoples R China
Keywords
News recommendation; Natural language processing; Multi-head self-attention; Neural networks; Embedding;
DOI
10.1016/j.neunet.2024.106824
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the rapid growth of online news and user populations, personalized news recommender systems have proved effective at alleviating information overload by suggesting content in line with users' tastes. Exploring relationships among words and news is critical to structurally modeling users' latent tastes, including their domains of interest, while selecting informative words and news directly reflects users' interests. Most current studies do not provide an effective framework that systematically combines distilling users' latent interest spaces with their explicit points of interest. Moreover, introducing ever more advanced techniques merely to chase accuracy has become a universal phenomenon. In this study, we design a Personalized Multi-Head Self-Attention Network (PMSN) for news recommendation, which combines a multi-head self-attention network with a personalized attention mechanism at both the word and news levels. The multi-head self-attention mechanism models interactions among words and news to explore latent interests. The personalized attention mechanism embeds users' IDs to highlight informative words and news, which enhances the interpretability of personalization. Comprehensive experiments on two real-world datasets demonstrate that PMSN outperforms state-of-the-art methods in recommendation accuracy without complicated structural design or excessive consumption of external resources. Furthermore, a visualized case study validates that the attention mechanism indeed improves interpretability.
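To make the described architecture concrete, below is a minimal sketch of a PMSN-style word-level news encoder, assuming PyTorch. All names and dimensions here (`NewsEncoder`, `PersonalizedAttention`, `embed_dim=300`, `num_heads=15`, etc.) are illustrative assumptions, not the authors' implementation: word embeddings pass through multi-head self-attention to model word-word interactions, and a query vector derived from the user's ID embedding pools them with personalized additive attention.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PersonalizedAttention(nn.Module):
    """Additive attention whose query comes from the user's ID embedding,
    so informative words/news are weighted differently per user.
    (Illustrative sketch, not the paper's exact formulation.)"""
    def __init__(self, input_dim, query_dim):
        super().__init__()
        self.proj = nn.Linear(input_dim, query_dim)

    def forward(self, seq, user_query):
        # seq: (batch, seq_len, input_dim); user_query: (batch, query_dim)
        scores = torch.bmm(torch.tanh(self.proj(seq)),
                           user_query.unsqueeze(-1)).squeeze(-1)   # (batch, seq_len)
        weights = F.softmax(scores, dim=-1)
        # Weighted sum of the sequence -> one vector per example
        return torch.bmm(weights.unsqueeze(1), seq).squeeze(1)     # (batch, input_dim)

class NewsEncoder(nn.Module):
    """Word level: multi-head self-attention over word embeddings,
    then personalized attention pooling into a single news vector."""
    def __init__(self, vocab_size, embed_dim=300, num_heads=15,
                 user_dim=64, query_dim=200):
        super().__init__()
        self.word_embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.self_attn = nn.MultiheadAttention(embed_dim, num_heads,
                                               batch_first=True)
        self.user_to_query = nn.Linear(user_dim, query_dim)
        self.pool = PersonalizedAttention(embed_dim, query_dim)

    def forward(self, title_ids, user_embedding):
        words = self.word_embed(title_ids)                      # (B, L, E)
        contextual, _ = self.self_attn(words, words, words)     # word-word interactions
        query = torch.tanh(self.user_to_query(user_embedding))  # user-conditioned query
        return self.pool(contextual, query)                     # (B, E)

# Usage with illustrative shapes: 8 titles of 30 tokens, 10k users
encoder = NewsEncoder(vocab_size=50000)
user_embed = nn.Embedding(10000, 64)
titles = torch.randint(1, 50000, (8, 30))
users = user_embed(torch.randint(0, 10000, (8,)))
news_vectors = encoder(titles, users)                           # (8, 300)
```

Under the same assumptions, the news level would repeat this pattern: the news vectors of a user's click history pass through another self-attention layer to model news-news interactions and are pooled by a second personalized attention query into the final user representation.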
Pages: 14
Related Papers
(50 records)
  • [21] A Multi-tab Webpage Fingerprinting Method Based on Multi-head Self-attention
    Xie, Lixia
    Li, Yange
    Yang, Hongyu
    Hu, Ze
    Wang, Peng
    Cheng, Xiang
    Zhang, Liang
    FRONTIERS IN CYBER SECURITY, FCS 2023, 2024, 1992 : 131 - 140
  • [22] A HYBRID TEXT NORMALIZATION SYSTEM USING MULTI-HEAD SELF-ATTENTION FOR MANDARIN
    Zhang, Junhui
    Pan, Junjie
    Yin, Xiang
    Li, Chen
    Liu, Shichao
    Zhang, Yang
    Wang, Yuxuan
    Ma, Zejun
2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020: 6694 - 6698
  • [23] MASPP and MWASP: multi-head self-attention based modules for UNet network in melon spot segmentation
    Tran, Khoa-Dang
    Ho, Trang-Thi
    Huang, Yennun
    Le, Nguyen Quoc Khanh
    Tuan, Le Quoc
    Ho, Van Lam
    JOURNAL OF FOOD MEASUREMENT AND CHARACTERIZATION, 2024, 18 (5) : 3935 - 3949
  • [24] Chinese CNER Combined with Multi-head Self-attention and BiLSTM-CRF
    Luo X.
    Xia X.
    An Y.
    Chen X.
Hunan Daxue Xuebao/Journal of Hunan University Natural Sciences, 2021, 48 (04): 45 - 55
  • [25] Lip Recognition Based on Bi-GRU with Multi-Head Self-Attention
    Ni, Ran
    Jiang, Haiyang
    Zhou, Lu
    Lu, Yuanyao
    ARTIFICIAL INTELLIGENCE APPLICATIONS AND INNOVATIONS, PT III, AIAI 2024, 2024, 713 : 99 - 110
  • [26] Multi-Head Self-Attention Model for Classification of Temporal Lobe Epilepsy Subtypes
    Gu, Peipei
    Wu, Ting
    Zou, Mingyang
    Pan, Yijie
    Guo, Jiayang
    Xiahou, Jianbing
    Peng, Xueping
    Li, Hailong
    Ma, Junxia
    Zhang, Ling
    FRONTIERS IN PHYSIOLOGY, 2020, 11
  • [27] Deep Bug Triage Model Based on Multi-head Self-attention Mechanism
    Yu, Xu
    Wan, Fayang
    Tang, Bin
    Zhan, Dingjia
    Peng, Qinglong
    Yu, Miao
    Wang, Zhaozhe
    Cui, Shuang
    COMPUTER SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING, CHINESECSCW 2021, PT II, 2022, 1492 : 107 - 119
  • [28] Fast Neural Chinese Named Entity Recognition with Multi-head Self-attention
    Qi, Tao
    Wu, Chuhan
    Wu, Fangzhao
    Ge, Suyu
    Liu, Junxin
    Huang, Yongfeng
    Xie, Xing
    KNOWLEDGE GRAPH AND SEMANTIC COMPUTING: KNOWLEDGE COMPUTING AND LANGUAGE UNDERSTANDING, 2019, 1134 : 98 - 110
  • [29] MBGAN: An improved generative adversarial network with multi-head self-attention and bidirectional RNN for time series imputation
    Ni, Qingjian
    Cao, Xuehan
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2022, 115
  • [30] EEG-Based Emotion Recognition Using Convolutional Recurrent Neural Network with Multi-Head Self-Attention
    Hu, Zhangfang
    Chen, Libujie
    Luo, Yuan
    Zhou, Jingfan
APPLIED SCIENCES-BASEL, 2022, 12 (21)