Balanced complement loss for long-tailed image classification

Times Cited: 0
Authors
Hu, Luyu [1 ,2 ]
Yang, Zhao [1 ,2 ]
Dou, Yamei [1 ,2 ]
Li, Jiahao [1 ,2 ]
Affiliations
[1] Guangzhou Univ, Sch Elect & Commun Engn, Guangzhou 510006, Peoples R China
[2] Guangzhou Univ, Huangpu Res & Grad Sch, Guangzhou 510555, Peoples R China
Keywords
Long-tailed distribution; Class imbalance; Complement classes learning; Image classification
DOI
10.1007/s11042-023-17583-0
CLC (Chinese Library Classification)
TP [Automation & Computer Technology];
Discipline code
0812 ;
Abstract
Long-tailed image classification is an important research direction in the computer vision community, since many real-world image datasets exhibit a pronounced long-tailed distribution. In a long-tailed dataset, head classes contain most of the training samples, while tail classes have only a few. In this scenario, samples of the head classes act as complement samples (all samples except those of the ground-truth class) for the tail classes, generating overwhelming discouraging gradients on the tail classes. Consequently, samples of the tail classes are easily misclassified as head classes. To address this issue, we present a balanced complement (BACL) loss by introducing an adaptive weight for the complement classes in the softmax cross-entropy loss. The adaptive weight mitigates the overwhelming gradient suppression that the complement samples exert on the tail classes, thereby balancing the gradient ratio between the complement classes and the ground-truth class. We further propose a joint training framework that combines our method with normalized complement entropy (NCE) via a novel double-angle sine decay strategy, which adjusts the relative contributions of the BACL and NCE losses at different training stages. Under this framework, the model first learns useful information from the complement samples and then gradually shifts its attention to the classification task. Experiments on long-tailed versions of the CIFAR-10/100, SVHN, and ImageNet-2012 datasets demonstrate the significant effectiveness of the proposed methods.
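The two ideas in the abstract can be illustrated with a minimal PyTorch sketch. The exact weighting formula and decay schedule are not given in this record, so the frequency-based weight `(n_j / n_max)^gamma` and the `sin^2((pi/2)·t)` ramp below are assumed forms chosen only to demonstrate the mechanism: complement (non-ground-truth) terms in the softmax denominator are attenuated for frequent classes, and a double-angle sine schedule shifts the loss mixture from NCE toward BACL over training.

```python
import math
import torch
import torch.nn.functional as F

def bacl_loss(logits, targets, class_counts, gamma=1.0):
    """Sketch of a balanced-complement cross-entropy (assumed form).

    Complement-class terms in the softmax denominator are scaled by a
    frequency-based weight, so head classes contribute weaker suppressing
    gradients to tail-class samples. The paper's exact weight may differ.
    """
    counts = class_counts.to(logits.dtype)
    # Weight in (0, 1]: the rarest class keeps weight ~0, the most
    # frequent class is attenuated least suppressed? No: head classes
    # get the LARGEST count ratio, so we invert by using it to shrink
    # their complement contribution via log-domain scaling below.
    w = (counts / counts.max()).pow(gamma)          # shape (C,)
    weights = w.unsqueeze(0).expand_as(logits).clone()
    rows = torch.arange(logits.size(0))
    weights[rows, targets] = 1.0                    # ground-truth term unweighted
    # log p_y = z_y - log sum_j w_j * exp(z_j)
    log_denom = torch.logsumexp(logits + weights.log(), dim=1)
    return -(logits[rows, targets] - log_denom).mean()

def joint_loss(bacl, nce, epoch, total_epochs):
    """Blend BACL and NCE with a double-angle sine ramp (assumed form).

    lam = sin^2((pi/2)·t) = (1 - cos(pi·t)) / 2 rises from 0 to 1, so the
    model first emphasizes complement learning (NCE), then classification.
    """
    t = epoch / max(total_epochs - 1, 1)
    lam = math.sin(math.pi / 2 * t) ** 2            # 0 -> 1 over training
    return lam * bacl + (1.0 - lam) * nce
```

Because the complement weights only shrink denominator terms, `bacl_loss` lower-bounds the discouraging gradient that head-class logits receive from a tail-class sample relative to plain `F.cross_entropy`; setting `gamma=0` recovers the standard softmax cross-entropy.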
Pages: 52989-53007
Number of pages: 19