Multi-module Fusion Relevance Attention Network for Multi-label Text Classification

Cited: 0
Authors
Yu, Xinmiao [1 ]
Li, Zhengpeng [1 ]
Wu, Jiansheng [1 ]
Liu, Mingao [1 ]
Affiliations
[1] Univ Sci & Technol Liaoning, Anshan 114051, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
deep learning; neural network; multi-label text classification; attention mechanism;
DOI
Not available
CLC Number
T [Industrial Technology];
Subject Classification Code
08;
Abstract
To solve the multi-label text classification (MLTC) task, we propose a multi-module fusion relevance attention network (MFRAN) that explores the semantic correlation between text and category labels. First, a text feature extraction module captures the text information most strongly correlated with the category labels and uses multi-head self-attention to obtain attention scores over the text. The learned word-level semantic information is then passed through multi-dimensional dilated convolutions to the label attention layer of the category-label feature extraction module. In parallel, attention scores for the category labels are obtained from a bidirectional long short-term memory network and the label attention layer. Finally, an adaptive attention fusion module fuses the text attention scores with the category-label attention scores and selects the text representation carrying the most information. We conducted extensive comparative and ablation experiments on the RCV1-V2 and AAPD datasets; the results show that MFRAN matches or exceeds the baseline models on MLTC tasks.
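The fusion step described in the abstract can be sketched in plain NumPy. This is a minimal illustration, not the authors' implementation: the dimensions, the per-label sigmoid gate, and the weight matrices (`W_text`, `w`, `W_out`) are all hypothetical stand-ins for trained parameters, and the BiLSTM/dilated-convolution encoders are replaced by random hidden states `H`.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

seq_len, d_model, n_labels = 10, 16, 4

H = rng.standard_normal((seq_len, d_model))           # stand-in encoder hidden states
label_emb = rng.standard_normal((n_labels, d_model))  # stand-in label embeddings

# (1) Text-side attention: one distribution over tokens per label slot
W_text = rng.standard_normal((d_model, n_labels))
A_text = softmax(H @ W_text, axis=0).T                # (n_labels, seq_len)

# (2) Label-side attention: similarity of label embeddings to hidden states
A_label = softmax(label_emb @ H.T, axis=-1)           # (n_labels, seq_len)

# (3) Adaptive fusion: a per-label sigmoid gate mixes the two attention views
M_text = A_text @ H                                   # (n_labels, d_model)
M_label = A_label @ H
w = rng.standard_normal(d_model)
alpha = 1.0 / (1.0 + np.exp(-(M_text @ w)))           # gate in (0, 1)
M = alpha[:, None] * M_text + (1 - alpha[:, None]) * M_label

# (4) Independent sigmoid per label, as is standard for multi-label output
W_out = rng.standard_normal(d_model)
probs = 1.0 / (1.0 + np.exp(-(M @ W_out)))
print(probs.shape)  # (4,)
```

Each label receives its own attention distribution over tokens from both views, so the gate `alpha` lets the model lean on whichever view is more informative for that label.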
Pages: 9