A Dual-branch Learning Model with Gradient-balanced Loss for Long-tailed Multi-label Text Classification

Cited by: 2
Authors
Yao, Yitong [1 ]
Zhang, Jing [1 ]
Zhang, Peng [1 ]
Sun, Yueheng [1 ]
Affiliations
[1] Tianjin Univ, Tianjin, Peoples R China
Keywords
Multi-label text classification; long-tailed learning; dual-branch structure; re-weighting loss function
DOI
10.1145/3597416
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Multi-label text classification has a wide range of applications in the real world. However, real-world data distributions are often imbalanced, which leads to serious long-tailed problems. For multi-label classification, given the vast scale of datasets and the existence of label co-occurrence, effectively improving the prediction accuracy of tail labels without degrading overall precision becomes an important challenge. To address this issue, we propose a Dual-Branch learning model with Gradient-Balanced loss (DBGB) based on the paradigm of existing pre-trained multi-label classification SOTA models. Our model consists of two main long-tailed improvements. First, on top of a shared text representation, a dual classifier is leveraged to process two kinds of label distributions: one is the original data distribution, and the other is an under-sampled distribution of head labels that strengthens prediction for tail labels. Second, the proposed gradient-balanced loss adaptively suppresses the accumulation of negative gradients on labels, especially tail labels. We perform extensive experiments on three multi-label text classification datasets. The results show that the proposed method achieves competitive overall performance compared to state-of-the-art multi-label classification methods, with significant improvement in tail-label accuracy.
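The abstract's second idea, suppressing per-label accumulation of negative gradients, can be illustrated with a minimal numpy sketch. This is our own illustrative construction, not the paper's actual DBGB formulation: the weighting scheme (`neg_w`), the EMA accumulator `neg_grad_accum`, and the decay rate `alpha` are all assumptions made here for clarity. Labels that keep receiving many negative examples (typically tail labels, which are negative in most samples) have their negative-side loss progressively down-weighted:

```python
import numpy as np

def gradient_balanced_bce(logits, targets, neg_grad_accum, alpha=0.9, eps=1e-12):
    """Illustrative sketch (not the paper's exact loss): a binary cross-entropy
    whose negative-sample term is down-weighted per label according to an
    exponential moving average of past negative-gradient magnitudes.

    logits, targets:  (batch, num_labels) arrays; targets are 0/1.
    neg_grad_accum:   (num_labels,) EMA of negative-gradient magnitude per label.
    Returns (scalar loss, updated neg_grad_accum).
    """
    probs = 1.0 / (1.0 + np.exp(-logits))  # sigmoid activation per label

    # Labels with a large accumulated negative gradient get a smaller weight
    # on their negative-sample loss, suppressing further negative pressure.
    neg_w = 1.0 / (1.0 + neg_grad_accum)

    pos_loss = -targets * np.log(probs + eps)
    neg_loss = -(1.0 - targets) * neg_w * np.log(1.0 - probs + eps)
    loss = float((pos_loss + neg_loss).mean())

    # For sigmoid BCE, the gradient magnitude on a negative sample is |p|;
    # accumulate its per-label batch mean into the EMA.
    batch_neg_grad = (probs * (1.0 - targets)).mean(axis=0)
    neg_grad_accum = alpha * neg_grad_accum + (1.0 - alpha) * batch_neg_grad
    return loss, neg_grad_accum
```

With a zero accumulator the weights are all 1 and this reduces to plain BCE; as negatives accumulate on a label, its `neg_w` shrinks toward 0, which mimics the suppression behavior the abstract describes.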
Pages: 24
Related Papers
43 items in total
  • [1] Exploring Contrastive Learning for Long-Tailed Multi-label Text Classification
    Audibert, Alexandre
    Gauffre, Aurelien
    Amini, Massih-Reza
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, PT VII, ECML PKDD 2024, 2024, 14947 : 245 - 261
  • [2] Residual diverse ensemble for long-tailed multi-label text classification
    Shi, Jiangxin
    Wei, Tong
    Li, Yufeng
    SCIENCE CHINA-INFORMATION SCIENCES, 2024, 67 (11)
  • [3] Distributionally Robust Loss for Long-Tailed Multi-label Image Classification
    Lin, Dekun
    Peng, Tailai
    Chen, Rui
    Xie, Xinran
    Qin, Xiaolin
    Cui, Zhe
    COMPUTER VISION - ECCV 2024, PT XXXIII, 2025, 15091 : 417 - 433
  • [4] Triple Alliance Prototype Orthotist Network for Long-Tailed Multi-Label Text Classification
    Xiao, Lin
    Xu, Pengyu
    Song, Mingyang
    Liu, Huafeng
    Jing, Liping
    Zhang, Xiangliang
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2023, 31 : 2616 - 2628
  • [5] Contrastive dual-branch network for long-tailed visual recognition
    Miao, Jie
    Zhai, Junhai
    Han, Ling
    PATTERN ANALYSIS AND APPLICATIONS, 2025, 28 (01)
  • [6] Balanced Gradient Penalty Improves Deep Long-Tailed Learning
    Wang, Dong
    Liu, Yicheng
    Fang, Liangji
    Shang, Fanhua
    Liu, Yuanyuan
    Liu, Hongying
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022, : 5093 - 5101
  • [7] A Multi-Label Text Classification Model with Enhanced Label Information
    Wang, Min
    Gao, Yan
    PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024, 2024, : 329 - 334
  • [8] Contrastive Enhanced Learning for Multi-Label Text Classification
    Wu, Tianxiang
    Yang, Shuqun
    APPLIED SCIENCES-BASEL, 2024, 14 (19):
  • [9] Hierarchical contrastive learning for multi-label text classification
    Zhang, Wei
    Jiang, Yun
    Fang, Yun
    Pan, Shuai
    SCIENTIFIC REPORTS, 15 (1)
  • [10] Variational Continuous Label Distribution Learning for Multi-Label Text Classification
    Zhao, Xingyu
    An, Yuexuan
    Xu, Ning
    Geng, Xin
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (06) : 2716 - 2729