In-depth Recommendation Model Based on Self-Attention Factorization

Cited by: 4
Authors
Ma, Hongshuang [1 ]
Liu, Qicheng [1 ]
Affiliations
[1] Yantai Univ, Sch Comp & Control Engn, Yantai 264000, Shandong, Peoples R China
Source
KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS | 2023, Vol. 17, No. 3
Funding
National Natural Science Foundation of China;
Keywords
Self-attention network; deep learning; recommendation model; review text;
DOI
10.3837/tiis.2023.03.003
CLC Classification
TP [Automation Technology; Computer Technology];
Subject Classification
0812 ;
Abstract
Rating prediction is an important problem in recommender systems, and its accuracy affects both the user experience and the company's revenue. Traditional recommender systems use Factorization Machines for rating prediction, where every feature is assigned the same weight, leading to inaccurate ratings and limited data representation. This study proposes a deep recommendation model based on self-attention Factorization (SAFMR) to address these problems. The model uses Convolutional Neural Networks to extract features from user and item reviews. The extracted features are fed into a Factorization Machine equipped with a self-attention mechanism, where the self-attention network automatically learns dependencies among the features and distinguishes the weights of different features, thereby reducing the prediction error. The model was experimentally evaluated on six categories of datasets, comparing MSE, NDCG, and running time across several real-world datasets. The experiments demonstrate that the SAFMR model achieves excellent rating prediction accuracy and recommendation relevance, verifying the effectiveness of the model.
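The pipeline the abstract describes (review features extracted by a CNN, reweighted by scaled dot-product self-attention, then scored by a second-order Factorization Machine) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' SAFMR implementation: the random "CNN features", the projection matrices, the mean-pooling step, and all dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a set of feature vectors."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])   # pairwise feature dependencies
    return softmax(scores, axis=-1) @ V      # reweighted features

def fm_predict(x, w0, w, V):
    """Second-order Factorization Machine score for one feature vector x."""
    linear = w0 + w @ x
    # pairwise interactions via the standard O(d*k) FM identity
    inter = 0.5 * np.sum((V.T @ x) ** 2 - (V ** 2).T @ (x ** 2))
    return linear + inter

# toy dimensions: n review-derived features, embedding size d, FM rank k
n, d, k = 6, 8, 4
X = rng.normal(size=(n, d))             # stand-in for CNN-extracted review features
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
H = self_attention(X, Wq, Wk, Wv)       # attention learns feature dependencies
x = H.mean(axis=0)                      # pool attended features into one vector
w0, w, V = 0.0, rng.normal(size=d), rng.normal(size=(d, k))
rating = fm_predict(x, w0, w, V)
print(float(rating))
```

In the paper's formulation the attention weights replace the uniform feature weighting of a plain FM; here that shows up as the softmax-weighted recombination in `self_attention` before the FM scoring step.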
Pages: 721-739
Page count: 19
Related Papers
50 items total
  • [41] A Self-Attention Feature Fusion Model for Rice Pest Detection
    Li, Shuaifeng
    Wang, Heng
    Zhang, Cong
    Liu, Jie
    IEEE ACCESS, 2022, 10 : 84063 - 84077
  • [42] Generation Model of Character Posture Transfer Based on Self-attention Mechanism
    Zhao Ning
    Liu Libo
    LASER & OPTOELECTRONICS PROGRESS, 2022, 59 (04)
  • [43] Mineral Prospectivity Mapping Using Deep Self-Attention Model
    Yin, Bojun
    Zuo, Renguang
    Sun, Siquan
    NATURAL RESOURCES RESEARCH, 2023, 32 (01) : 37 - 56
  • [44] Integrating the Pre-trained Item Representations with Reformed Self-attention Network for Sequential Recommendation
    Liang, Guanzhong
    Liao, Jie
    Zhou, Wei
    Wen, Junhao
    2022 IEEE INTERNATIONAL CONFERENCE ON WEB SERVICES (IEEE ICWS 2022), 2022, : 27 - 36
  • [45] LwSANet: Light Weight Self-Attention Network Model to Recognize Fruits from Images
    Sathyadhas, Gracia Nissi
    Gladston, Angelin
    Nehemiah, Khanna H.
    TRAITEMENT DU SIGNAL, 2025, 42 (01) : 183 - 200
  • [46] Attention-based multi attribute matrix factorization for enhanced recommendation performance
    Jang, Dongsoo
    Li, Qinglong
    Lee, Chaeyoung
    Kim, Jaekyeong
    INFORMATION SYSTEMS, 2024, 121
  • [47] A novel self-attention model based on cosine self-similarity for cancer classification of protein mass spectrometry
    Tang, Long
    Xu, Ping
    Xue, Lingyun
    Liu, Yian
    Yan, Ming
    Chen, Anqi
    Hu, Shundi
    Wen, Luhong
    INTERNATIONAL JOURNAL OF MASS SPECTROMETRY, 2023, 494
  • [48] Pedestrian Attribute Recognition Based on Dual Self-attention Mechanism
    Fan, Zhongkui
    Guan, Ye-peng
    COMPUTER SCIENCE AND INFORMATION SYSTEMS, 2023, 20 (02) : 793 - 812
  • [49] SELF-ATTENTION ALIGNER: A LATENCY-CONTROL END-TO-END MODEL FOR ASR USING SELF-ATTENTION NETWORK AND CHUNK-HOPPING
    Dong, Linhao
    Wang, Feng
    Xu, Bo
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 5656 - 5660
  • [50] ResDeepSurv: A Survival Model for Deep Neural Networks Based on Residual Blocks and Self-attention Mechanism
    Wang, Yuchen
    Kong, Xianchun
    Bi, Xiao
    Cui, Lizhen
    Yu, Hong
    Wu, Hao
    INTERDISCIPLINARY SCIENCES-COMPUTATIONAL LIFE SCIENCES, 2024, 16 (02) : 405 - 417