An R-Transformer_BiLSTM Model Based on Attention for Multi-label Text Classification

Cited: 13
Authors
Yan, Yaoyao [1 ]
Liu, Fang'ai [1 ]
Zhuang, Xuqiang [1 ]
Ju, Jie [1 ]
Affiliations
[1] Shandong Normal Univ, Sch Informat Sci & Engn, Jinan 250358, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-label text classification; R-Transformer; Label embedding; Self-attention; BiLSTM;
DOI
10.1007/s11063-022-10938-y
CLC number
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Multi-label text classification is a research hotspot in natural language processing. However, most existing multi-label text classification models are suited only to scenarios with few, coarse-grained labels. To address the difficulty of capturing sequence information, and the marked loss of semantic information, as text sequences grow longer, this paper proposes an R-Transformer_BiLSTM model based on label embedding and an attention mechanism for multi-label text classification. First, the R-Transformer model, combined with part-of-speech embedding, captures the global and local information of the text sequence. In parallel, BiLSTM+CRF extracts entity information from the text, and a self-attention mechanism identifies the keywords within that entity information; bidirectional attention and label embedding then jointly produce the text representation and the label representation. Finally, the classifier predicts labels from the text representation and the label representation. To evaluate the model, we conducted extensive experiments on the RCV1-V2 and AAPD datasets. The results show that the model effectively improves both the efficiency and the accuracy of multi-label text classification.
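The encoder-plus-label-attention pipeline described in the abstract can be sketched compactly. Below is a minimal PyTorch sketch, not the authors' implementation: a single BiLSTM stands in for the R-Transformer_BiLSTM encoder stack, and the part-of-speech embeddings, the BiLSTM+CRF entity branch, and the bidirectional attention are omitted; all module names and dimensions (LabelAttentionClassifier, embed_dim, hidden_dim, the vocabulary size) are illustrative assumptions.

```python
# Minimal sketch (assumptions noted above): trainable label embeddings
# attend over encoded token states, giving each label its own context
# vector before an independent sigmoid score (multi-label setting).
import torch
import torch.nn as nn

class LabelAttentionClassifier(nn.Module):
    def __init__(self, vocab_size, num_labels, embed_dim=300, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Stand-in encoder; the paper combines R-Transformer and BiLSTM here.
        self.encoder = nn.LSTM(embed_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        # One trainable embedding vector per label.
        self.label_embed = nn.Parameter(torch.randn(num_labels, 2 * hidden_dim))
        self.out = nn.Linear(2 * hidden_dim, 1)

    def forward(self, token_ids):                      # (B, T) int64
        h, _ = self.encoder(self.embed(token_ids))     # (B, T, 2H)
        # Label-to-text attention: each label scores every token state.
        scores = torch.einsum('ld,btd->blt', self.label_embed, h)
        attn = torch.softmax(scores, dim=-1)           # (B, L, T)
        label_ctx = torch.einsum('blt,btd->bld', attn, h)
        # Independent probability per label, as in multi-label classification.
        return torch.sigmoid(self.out(label_ctx).squeeze(-1))  # (B, L)

# Usage: RCV1-V2 has 103 topic labels; the vocabulary size is hypothetical.
model = LabelAttentionClassifier(vocab_size=30000, num_labels=103)
probs = model(torch.randint(0, 30000, (4, 50)))        # 4 texts, 50 tokens each
```

Training such a model would pair the per-label probabilities with a binary cross-entropy loss, the standard objective when labels are scored independently.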
Pages: 1293-1316
Page count: 24
Related Papers
50 records in total
  • [1] An R-Transformer_BiLSTM Model Based on Attention for Multi-label Text Classification
    Yan, Yaoyao
    Liu, Fang'ai
    Zhuang, Xuqiang
    Ju, Jie
    NEURAL PROCESSING LETTERS, 2023, 55: 1293-1316
  • [2] Multi-label legal text classification with BiLSTM and attention
    Enamoto, Liriam
    Santos, Andre R. A. S.
    Maia, Ricardo
    Weigang, Li
    Rocha Filho, Geraldo P.
    INTERNATIONAL JOURNAL OF COMPUTER APPLICATIONS IN TECHNOLOGY, 2022, 68(04): 369-378
  • [3] Multi-label Text Classification Model Combining BiLSTM and Hypergraph Attention
    Wang, Xing
    Hu, HuiTing
    Zhu, GuoHua
    2024 4TH INTERNATIONAL CONFERENCE ON COMPUTER COMMUNICATION AND ARTIFICIAL INTELLIGENCE, CCAI 2024, 2024: 344-349
  • [4] Multi-Label Text Classification model integrating Label Attention and Historical Attention
    Sun, Guoying
    Cheng, Yanan
    Dong, Fangzhou
    Wang, Luhua
    Zhao, Dong
    Zhang, Zhaoxin
    Tong, Xiaojun
    KNOWLEDGE-BASED SYSTEMS, 2024, 296
  • [5] All is attention for multi-label text classification
    Liu, Zhi
    Huang, Yunjie
    Xia, Xincheng
    Zhang, Yihao
    KNOWLEDGE AND INFORMATION SYSTEMS, 2025, 67(02): 1249-1270
  • [6] Research of multi-label text classification based on label attention and correlation networks
    Yuan, Ling
    Xu, Xinyi
    Sun, Ping
    Yu, Hai ping
    Wei, Yin Zhen
    Zhou, Jun jie
    PLOS ONE, 2024, 19(09)
  • [7] Multi-Label Text Classification Model Based on Multi-Level Constraint Augmentation and Label Association Attention
    Wei, Xiao
    Huang, Jianbao
    Zhao, Rui
    Yu, Hang
    Xu, Zheng
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2024, 23(01)
  • [8] Multi-label text classification based on the label correlation mixture model
    He, Zhiyang
    Wu, Ji
    Lv, Ping
    INTELLIGENT DATA ANALYSIS, 2017, 21(06): 1371-1392
  • [9] Label-text bi-attention capsule networks model for multi-label text classification
    Wang, Gang
    Du, Yajun
    Jiang, Yurui
    Liu, Jia
    Li, Xianyong
    Chen, Xiaoliang
    Gao, Hongmei
    Xie, Chunzhi
    Lee, Yan-li
    NEUROCOMPUTING, 2024, 588
  • [10] Multi-label text classification model based on semantic embedding
    Yan Danfeng
    Ke Nan
    Gu Chao
    Cui Jianfei
    Ding Yiqi
    The Journal of China Universities of Posts and Telecommunications, 2019, 26(01): 95-104