Towards Effective Collaborative Learning in Long-Tailed Recognition

Times Cited: 5
Authors
Xu, Zhengzhuo [1 ,2 ]
Chai, Zenghao [3 ]
Xu, Chengyin [2 ]
Yuan, Chun [2 ]
Yang, Haiqin [1 ]
Affiliations
[1] Int Digital Econ Acad, Shenzhen 518045, Peoples R China
[2] Tsinghua Univ, Tsinghua Shenzhen Int Grad Sch, Shenzhen 518055, Peoples R China
[3] Natl Univ Singapore, Singapore 119077, Singapore
Keywords
Tail; Federated learning; Task analysis; Uncertainty; Training; Head; Feature extraction; Image classification; long tail recognition; collaborative learning; knowledge distillation
DOI
10.1109/TMM.2023.3314980
CLC Number
TP [Automation and computer technology]
Subject Classification Code
0812
Abstract
Real-world data usually suffers from severe class imbalance and long-tailed distributions, where minority classes are significantly underrepresented compared to the majority ones. Recent research favors multi-expert architectures to mitigate model uncertainty on the minority classes, employing collaborative learning, i.e., online distillation, to aggregate the experts' knowledge. In this article, we observe that the knowledge transfer between experts is imbalanced across the class distribution, which limits the performance improvement on the minority classes. To address this, we propose a re-weighted distillation loss that compares the predictions of two classifiers, supervised by online distillation and by label annotations, respectively. We also show that feature-level distillation significantly improves model performance and increases feature robustness. Finally, we propose an Effective Collaborative Learning (ECL) framework that integrates a contrastive proxy-task branch to further improve feature quality. Quantitative and qualitative experiments on four standard datasets demonstrate that ECL achieves state-of-the-art performance, and detailed ablation studies confirm the effectiveness of each component of ECL.
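The abstract does not give the exact form of the re-weighted distillation loss, so the following is only an illustrative sketch of the general idea: per-class weights are derived by comparing the predictions of a label-supervised classifier with those of the distilled classifier, and classes where distillation lags behind label supervision receive larger weights in a standard KL distillation term. All function and variable names here (`reweighted_distill_loss`, `tau`, the `1 + max(gap, 0)` weighting) are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def reweighted_distill_loss(student_logits, teacher_logits,
                            label_logits, labels, tau=2.0):
    """Hypothetical re-weighted KD loss.

    student_logits: logits of the classifier supervised by online distillation
    label_logits:   logits of the classifier supervised by label annotations
    teacher_logits: aggregated expert (teacher) logits
    labels:         integer class labels, shape (N,)
    """
    p_t = softmax(teacher_logits / tau)   # teacher distribution
    p_s = softmax(student_logits / tau)   # distilled-student distribution
    p_lab = softmax(label_logits)         # label-supervised predictions

    # Per-class weight: up-weight classes where the distilled classifier's
    # confidence on ground-truth samples falls behind the label-supervised one.
    n_cls = student_logits.shape[1]
    w = np.ones(n_cls)
    for c in range(n_cls):
        mask = labels == c
        if mask.any():
            gap = p_lab[mask, c].mean() - softmax(student_logits)[mask, c].mean()
            w[c] = 1.0 + max(gap, 0.0)

    # Sample-wise KL(teacher || student), weighted by the class of each sample.
    kl = (p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))).sum(axis=1)
    return float((w[labels] * kl).mean())
```

In this sketch the weight is attached per ground-truth class, so minority classes where distillation underperforms contribute more to the gradient; the actual ECL loss may differ in both the gap measure and the weighting function.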
Pages: 3754-3764 (11 pages)