Augmenting Features via Contrastive Learning-based Generative Model for Long-Tailed Classification

Cited by: 0
Authors
Park, Minho [1]
Kim, Hyung-Il [1]
Song, Hwa Jeon [1]
Kang, Dong-Oh [1]
Affiliations
[1] Electronics and Telecommunications Research Institute (ETRI), Daejeon, South Korea
Source
2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS, ICCVW | 2023
DOI
10.1109/ICCVW60793.2023.00108
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Thanks to advances in deep learning-based computer vision, image classification has achieved impressive results. However, it still faces severe class imbalance, a common characteristic of real-world datasets. Severe class imbalance easily biases the classifier toward the majority classes and causes overfitting to the minority classes. To address this issue, supplementing minority classes with artificially generated samples has proven effective. In addition, contrastive learning has recently been introduced to improve image classification performance. Motivated by these recent works, we propose feature augmentation via a contrastive learning-based generative model for long-tailed classification. Specifically, features are augmented using a feature dictionary built from real samples together with generated convex weights, and the augmented features are used to train the image classification model. The feature-augmentation model is trained end-to-end with generative adversarial learning and contrastive learning: the generative adversarial learning helps to generate realistic features, and the contrastive learning improves the features' discriminative power. Through extensive experiments on various long-tailed classification datasets, we verify the effectiveness of the proposed method.
Pages: 1010-1019
Page count: 10
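
The abstract describes augmenting features by applying generated convex weights to a feature dictionary built from real samples. Below is a minimal sketch in PyTorch of that convex-combination idea, not the authors' implementation: the ConvexWeightGenerator module, the noise input, and all sizes are illustrative assumptions, and the adversarial and contrastive losses used to train the generator in the paper are omitted.

# Minimal sketch (assumed, not the authors' code): synthetic features as convex
# combinations of a per-class feature dictionary of real features.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvexWeightGenerator(nn.Module):
    """Maps a noise vector to convex weights over K dictionary entries."""
    def __init__(self, noise_dim: int, dict_size: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, 128),
            nn.ReLU(inplace=True),
            nn.Linear(128, dict_size),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # Softmax enforces non-negative weights that sum to one (convexity).
        return F.softmax(self.net(z), dim=-1)

def augment_features(feature_dict: torch.Tensor,
                     generator: ConvexWeightGenerator,
                     num_aug: int,
                     noise_dim: int) -> torch.Tensor:
    """Generate augmented features for one class.

    feature_dict: (K, D) real features of that class collected in a dictionary.
    Returns: (num_aug, D) synthetic features, each a convex combination of the
    dictionary entries.
    """
    z = torch.randn(num_aug, noise_dim)
    weights = generator(z)          # (num_aug, K), each row sums to 1
    return weights @ feature_dict   # (num_aug, D)

if __name__ == "__main__":
    K, D, noise_dim = 16, 512, 64               # assumed sizes
    real_features = torch.randn(K, D)           # stand-in for extracted features
    gen = ConvexWeightGenerator(noise_dim, K)
    aug = augment_features(real_features, gen, num_aug=8, noise_dim=noise_dim)
    print(aug.shape)                            # torch.Size([8, 512])

In the full method as described in the abstract, such augmented features would be mixed with real ones to train the classifier, while the weight generator is trained jointly with a discriminator (adversarial loss) and a contrastive objective.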