MulAttenRec: A Multi-level Attention-Based Model for Recommendation

Cited by: 2
Authors
Lin, Zhipeng [1]
Yang, Wenjing [1]
Zhang, Yongjun [2]
Wang, Haotian [1]
Tang, Yuhua [1]
Affiliations
[1] Natl Univ Def Technol, Coll Comp, State Key Lab High Performance Comp, Changsha, Hunan, Peoples R China
[2] Natl Innovat Inst Def Technol, Beijing, Peoples R China
Funding
U.S. National Science Foundation
Keywords
Recommender systems; Attention-based mechanism; Convolutional neural network; Factorization machine
DOI
10.1007/978-3-030-04179-3_21
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
It is common nowadays for online buyers to rate shopping items and write review text. This review text has been proven to be very useful for understanding user preferences and item properties, and thus enhances the capability of Recommender Systems (RS). However, the usefulness of reviews and the significance of words within each review vary. In this paper, we introduce a multi-level attention mechanism to explore the usefulness of reviews and the significance of words, and propose a Multi-level Attention-based Model (MulAttenRec) for recommendation. In addition, we introduce a hybrid prediction layer that models the non-linear interactions between users and items by coupling a Factorization Machine (FM) with a Deep Neural Network (DNN), emphasizing both low-order and high-order feature interactions. Extensive experiments show that our approach provides more accurate recommendations than state-of-the-art approaches including PMF, NMF, LDA, DeepCoNN, and NARRE. Furthermore, visualization and analysis of keywords and useful reviews validate the rationality of our multi-level attention mechanism.
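Note: the record contains no implementation details. The following is a minimal sketch, assuming PyTorch, of the two components the abstract names: an attention-pooling module that can be applied once over words within a review and once over a user's or item's reviews (the two attention levels), and a hybrid prediction layer that sums a second-order FM term with a DNN output so that both low-order and high-order feature interactions are captured. All class names, layer sizes, and input shapes are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn


class AttentionPooling(nn.Module):
    """Score each element of a sequence and return the weighted sum.

    Applied over word embeddings it yields a review vector; applied over
    review vectors it yields a user/item vector (illustrative of the
    multi-level attention idea, not the paper's exact formulation).
    """

    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        weights = torch.softmax(self.score(x), dim=1)  # (batch, seq_len, 1)
        return (weights * x).sum(dim=1)                # (batch, dim)


class HybridFMDNN(nn.Module):
    """Hybrid prediction layer: FM second-order term + deep MLP over the
    same feature-field embeddings (e.g. user and item representations)."""

    def __init__(self, num_fields: int, embed_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.linear = nn.Linear(num_fields * embed_dim, 1)  # first-order term
        self.dnn = nn.Sequential(                            # high-order interactions
            nn.Linear(num_fields * embed_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, field_embeds: torch.Tensor) -> torch.Tensor:
        # field_embeds: (batch, num_fields, embed_dim)
        flat = field_embeds.flatten(start_dim=1)
        # FM second-order term: 0.5 * sum_k[(sum_i v_ik)^2 - sum_i v_ik^2]
        square_of_sum = field_embeds.sum(dim=1).pow(2)
        sum_of_square = field_embeds.pow(2).sum(dim=1)
        fm_term = 0.5 * (square_of_sum - sum_of_square).sum(dim=1, keepdim=True)
        return self.bias + self.linear(flat) + fm_term + self.dnn(flat)


# Usage: predict ratings for 4 user-item pairs, each described by 2 feature
# fields (e.g. a user vector and an item vector) of dimension 8.
model = HybridFMDNN(num_fields=2, embed_dim=8)
scores = model(torch.randn(4, 2, 8))
print(scores.shape)  # torch.Size([4, 1])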
Pages: 240-252
Number of pages: 13
Related Papers
50 records in total
  • [1] Multi-level Attention-based Domain Disentanglement for BCDR
    Zhang, Xinyue
    Li, Jingjing
    Su, Hongzu
    Zhu, Lei
    Shen, Heng Tao
    ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2023, 41 (04)
  • [2] Next Basket Recommendation Model Based on Attribute-Aware Multi-Level Attention
    Liu, Tong
    Yin, Xianrui
    Ni, Weijian
    IEEE ACCESS, 2020, 8 : 153872 - 153880
  • [3] Attention-based Multi-level Feature Fusion for Named Entity Recognition
    Yang, Zhiwei
    Chen, Hechang
    Zhang, Jiawei
    Ma, Jing
    Chang, Yi
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 3594 - 3600
  • [4] Joint Deep Model with Multi-Level Attention and Hybrid-Prediction for Recommendation
    Lin, Zhipeng
    Tang, Yuhua
    Zhang, Yongjun
    ENTROPY, 2019, 21 (02):
  • [5] Attention-based interactive multi-level feature fusion for named entity recognition
    Xu, Yiwu
    Chen, Yun
    SCIENTIFIC REPORTS, 2025, 15 (01):
  • [6] Facial image inpainting using attention-based multi-level generative network
    Liu, Jie
    Jung, Cheolkon
    NEUROCOMPUTING, 2021, 437 : 95 - 106
  • [7] Attention-based Multi-Level Fusion Network for Light Field Depth Estimation
    Chen, Jiaxin
    Zhang, Shuo
    Lin, Youfang
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 1009 - 1017
  • [8] Residual Attention-Based Image Fusion Method with Multi-Level Feature Encoding
    Li, Hao
    Yang, Tiantian
    Wang, Runxiang
    Li, Cuichun
    Zhou, Shuyu
    Guo, Xiqing
    SENSORS, 2025, 25 (03)
  • [9] Uncovering visual attention-based multi-level tampering traces for face forgery detection
    Yadav, Ankit
    Gupta, Dhruv
    Vishwakarma, Dinesh Kumar
    SIGNAL IMAGE AND VIDEO PROCESSING, 2024, 18 (02) : 1259 - 1272