Distilling Virtual Examples for Long-tailed Recognition

Cited by: 57
Authors
He, Yin-Yin [1 ]
Wu, Jianxin [1 ]
Wei, Xiu-Shen [1 ,2 ]
Affiliations
[1] Nanjing Univ, State Key Lab Novel Software Technol, Nanjing, Peoples R China
[2] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Nanjing, Peoples R China
Source
2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021) | 2021
Funding
National Natural Science Foundation of China
Keywords
SMOTE;
DOI
10.1109/ICCV48922.2021.00030
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We tackle the long-tailed visual recognition problem from the knowledge distillation perspective by proposing a Distill the Virtual Examples (DiVE) method. Specifically, by treating the predictions of a teacher model as virtual examples, we prove that distilling from these virtual examples is equivalent to label distribution learning under certain constraints. We show that when the virtual example distribution becomes flatter than the original input distribution, the under-represented tail classes will receive significant improvements, which is crucial in long-tailed recognition. The proposed DiVE method can explicitly tune the virtual example distribution to become flat. Extensive experiments on three benchmark datasets, including the large-scale iNaturalist ones, justify that the proposed DiVE method can significantly outperform state-of-the-art methods. Furthermore, additional analyses and experiments verify the virtual example interpretation, and demonstrate the effectiveness of tailored designs in DiVE for long-tailed problems.
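The flattening described in the abstract can be illustrated with temperature scaling, the standard mechanism in knowledge distillation for smoothing a teacher's output distribution. The sketch below is a minimal illustration of that general idea, not DiVE's exact formulation (the paper's specific tuning scheme is in the full text); the logits are hypothetical:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax: larger T yields a flatter distribution."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()              # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical teacher logits biased toward a head class.
logits = np.array([8.0, 2.0, 1.0, 0.5])

p_sharp = softmax(logits, T=1.0)  # peaked "virtual example" distribution
p_flat  = softmax(logits, T=4.0)  # flatter: non-head classes gain probability mass

# Flattening increases the total probability assigned to the non-head classes,
# which is why tail classes benefit when the student distills from it.
assert p_flat[1:].sum() > p_sharp[1:].sum()
```

A student trained against the flatter targets receives a stronger learning signal on under-represented classes, which is the intuition the abstract states for tail-class improvement.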
Pages: 235-244
Page count: 10