In-depth Recommendation Model Based on Self-Attention Factorization

Cited by: 4
Authors
Ma, Hongshuang [1 ]
Liu, Qicheng [1 ]
Affiliations
[1] Yantai Univ, Sch Comp & Control Engn, Yantai 264000, Shandong, Peoples R China
Source
KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS | 2023, Vol. 17, No. 03
Funding
National Natural Science Foundation of China;
Keywords
Self-attention network; deep learning; recommendation model; review text;
DOI
10.3837/tiis.2023.03.003
CLC Classification Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Rating prediction is an important issue in recommender systems, and its accuracy affects the experience of the user and the revenue of the company. Traditional recommender systems use Factorization Machines for rating prediction and assign every feature the same weight, which leads to inaccurate ratings and limited data representation. This study proposes a deep recommendation model based on self-attention Factorization (SAFMR) to solve these problems. The model uses Convolutional Neural Networks to extract features from user and item reviews. The obtained features are fed into a self-attention Factorization Machine, where the self-attention network automatically learns the dependencies among features and distinguishes the weights of different features, thereby reducing the prediction error. The model was experimentally evaluated on six categories of datasets, comparing MSE, NDCG, and running time across several real-world datasets. The experiments demonstrated that the SAFMR model achieves excellent rating prediction accuracy and recommendation relevance, verifying the effectiveness of the model.
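The abstract outlines the SAFMR pipeline: CNNs extract latent features from user and item review text, and a self-attention mechanism inside a Factorization Machine re-weights those features before the rating is predicted. Below is a minimal PyTorch sketch of that idea; the framework choice, layer sizes, and all names (SAFMRSketch, embed_dim, num_filters, etc.) are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch of the SAFMR idea described in the abstract (assumed hyper-parameters):
# CNNs extract features from user/item reviews, a self-attention layer learns the
# dependencies between those feature fields and re-weights them, and an FM-style
# interaction produces the predicted rating.
import torch
import torch.nn as nn

class SAFMRSketch(nn.Module):
    def __init__(self, vocab_size=20000, embed_dim=64, num_filters=64, kernel_size=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # One CNN per review source (user reviews, item reviews).
        self.user_cnn = nn.Conv1d(embed_dim, num_filters, kernel_size, padding=1)
        self.item_cnn = nn.Conv1d(embed_dim, num_filters, kernel_size, padding=1)
        # Self-attention over the feature fields assigns different weights
        # instead of treating every feature equally.
        self.attn = nn.MultiheadAttention(num_filters, num_heads=4, batch_first=True)
        self.linear = nn.Linear(num_filters * 2, 1)   # first-order FM term
        self.bias = nn.Parameter(torch.zeros(1))

    def _review_features(self, cnn, tokens):
        x = self.embed(tokens).transpose(1, 2)          # (B, embed_dim, seq_len)
        return torch.relu(cnn(x)).max(dim=2).values     # max-pool over token positions

    def forward(self, user_reviews, item_reviews):
        u = self._review_features(self.user_cnn, user_reviews)   # (B, F)
        i = self._review_features(self.item_cnn, item_reviews)   # (B, F)
        fields = torch.stack([u, i], dim=1)             # (B, 2, F) feature fields
        attended, _ = self.attn(fields, fields, fields)  # self-attention re-weighting
        # Second-order FM interaction over the attended fields.
        square_of_sum = attended.sum(dim=1).pow(2)
        sum_of_square = attended.pow(2).sum(dim=1)
        second_order = 0.5 * (square_of_sum - sum_of_square).sum(dim=1, keepdim=True)
        first_order = self.linear(torch.cat([u, i], dim=1))
        return (self.bias + first_order + second_order).squeeze(-1)  # predicted rating

# Usage: predict ratings for a small random batch of tokenized reviews.
model = SAFMRSketch()
user_tok = torch.randint(1, 20000, (8, 100))
item_tok = torch.randint(1, 20000, (8, 100))
print(model(user_tok, item_tok).shape)   # torch.Size([8])
```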
Pages: 721 - 739
Number of pages: 19
Related Papers
50 records in total
  • [21] Modal parameter estimation of turbulence response based on self-attention generative model
    Duan, Shiqiang
    Zheng, Hua
    Yu, Jinge
    Wu, Yafeng
    JOURNAL OF VIBRATION AND CONTROL, 2024, 30 (15-16) : 3296 - 3309
  • [22] Double Attention: An Optimization Method for the Self-Attention Mechanism Based on Human Attention
    Zhang, Zeyu
    Li, Bin
    Yan, Chenyang
    Furuichi, Kengo
    Todo, Yuki
    BIOMIMETICS, 2025, 10 (01)
  • [23] PC-SAN: Pretraining-Based Contextual Self-Attention Model for Topic Essay Generation
    Lin, Fuqiang
    Ma, Xingkong
    Chen, Yaofeng
    Zhou, Jiajun
    Liu, Bo
    KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS, 2020, 14 (08) : 3168 - 3186
  • [25] Image classification model based on large kernel attention mechanism and relative position self-attention mechanism
    Liu, Siqi
    Wei, Jiangshu
    Liu, Gang
    Zhou, Bei
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [26] Feature-Level Deeper Self-Attention Network With Contrastive Learning for Sequential Recommendation
    Hao, Yongjing
    Zhang, Tingting
    Zhao, Pengpeng
    Liu, Yanchi
    Sheng, Victor S.
    Xu, Jiajie
    Liu, Guanfeng
    Zhou, Xiaofang
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (10) : 10112 - 10124
  • [27] A multimodal travel route recommendation system leveraging visual Transformers and self-attention mechanisms
    Juan, Zhang
    Zhang, Jing
    Gao, Ming
    FRONTIERS IN NEUROROBOTICS, 2024, 18
  • [28] Web service classification based on self-attention mechanism
    Jia, Zhichun
    Zhang, Zhiying
    Dong, Rui
    Yang, Zhongxuan
    Xing, Xing
    2023 35TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2023, : 2164 - 2169
  • [29] Pest Identification Based on Fusion of Self-Attention With ResNet
    Hassan, Sk Mahmudul
    Maji, Arnab Kumar
    IEEE ACCESS, 2024, 12 : 6036 - 6050
  • [30] Finger Vein Recognition Based on ResNet With Self-Attention
    Zhang, Zhibo
    Chen, Guanghua
    Zhang, Weifeng
    Wang, Huiyang
    IEEE ACCESS, 2024, 12 : 1943 - 1951