Long-tailed classification based on dynamic class average loss

Times Cited: 0
Authors
Lee, Do Ryun [1 ]
Kim, Chang Ouk [1 ]
Affiliations
[1] Yonsei Univ, Dept Ind Engn, 50 Yonsei Ro, Seoul 03722, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Long-tailed classification; Deep learning; Loss function; Class average loss-based weight;
DOI
10.1016/j.eswa.2025.128292
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Class imbalance is a common issue in real-world data distributions. When deep learning models are trained on class-imbalanced data, performance on classes with fewer samples tends to deteriorate. Numerous studies have addressed this problem, focusing on loss reweighting techniques based on the number of training samples per class. However, because some classes are inherently easier or harder to classify, a larger number of samples in a class does not necessarily ensure lower loss or better learning for that class. Moreover, when the ratio of loss magnitudes differs substantially from the ratio of training samples per class, reweighting based solely on sample size may be inappropriate. This study proposes a method that reweights losses based on dynamic class average losses rather than the number of training samples per class. Specifically, the method evaluates the class average losses for each mini-batch, applies a nonlinear transformation to these values, and dynamically adjusts the class-wise loss weights within the loss function during training to better mitigate class imbalance. Experimental results on various types of datasets, including image and tabular data, demonstrate that the proposed method improves performance by 1%-8% compared to existing methods.
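The mechanism described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the exponential-moving-average update (`momentum`) and the power transform (`gamma`) are assumptions standing in for the unspecified "dynamic" update and "nonlinear transformation"; all function and parameter names are hypothetical.

```python
import numpy as np

def class_average_weights(losses, labels, num_classes,
                          prev_avg=None, momentum=0.9, gamma=1.0):
    """Derive class-wise loss weights from dynamic class average losses.

    Hedged sketch of the abstract's idea: per mini-batch, estimate each
    class's average loss, smooth it across batches (EMA -- an assumption),
    apply a nonlinear transform (a power, `gamma` -- also an assumption),
    and normalize so higher-loss classes receive larger weights.
    """
    losses = np.asarray(losses, dtype=float)
    labels = np.asarray(labels)

    # Per-class average loss within this mini-batch.
    batch_avg = np.zeros(num_classes)
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            batch_avg[c] = losses[mask].mean()

    # Dynamic update across mini-batches (assumed EMA form).
    if prev_avg is None:
        avg = batch_avg
    else:
        avg = momentum * np.asarray(prev_avg) + (1.0 - momentum) * batch_avg

    # Nonlinear transform, then normalize so the weights average to 1.
    w = np.power(np.maximum(avg, 1e-8), gamma)
    w = w * num_classes / w.sum()
    return w, avg

def reweighted_loss(losses, labels, weights):
    """Apply class-wise weights to per-sample losses and average."""
    losses = np.asarray(losses, dtype=float)
    return float((np.asarray(weights)[np.asarray(labels)] * losses).mean())
```

In a training loop, `prev_avg` would be carried between mini-batches so the weights track the evolving class losses rather than a single noisy batch estimate.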
Pages: 10