Distilling Virtual Examples for Long-tailed Recognition

Cited by: 57
Authors
He, Yin-Yin [1 ]
Wu, Jianxin [1 ]
Wei, Xiu-Shen [1 ,2 ]
Affiliations
[1] Nanjing Univ, State Key Lab Novel Software Technol, Nanjing, Peoples R China
[2] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Nanjing, Peoples R China
Source
2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021) | 2021
Funding
National Natural Science Foundation of China;
Keywords
SMOTE;
DOI
10.1109/ICCV48922.2021.00030
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We tackle the long-tailed visual recognition problem from the knowledge distillation perspective by proposing a Distill the Virtual Examples (DiVE) method. Specifically, by treating the predictions of a teacher model as virtual examples, we prove that distilling from these virtual examples is equivalent to label distribution learning under certain constraints. We show that when the virtual example distribution becomes flatter than the original input distribution, the under-represented tail classes will receive significant improvements, which is crucial in long-tailed recognition. The proposed DiVE method can explicitly tune the virtual example distribution to become flat. Extensive experiments on three benchmark datasets, including the large-scale iNaturalist ones, justify that the proposed DiVE method can significantly outperform state-of-the-art methods. Furthermore, additional analyses and experiments verify the virtual example interpretation, and demonstrate the effectiveness of tailored designs in DiVE for long-tailed problems.
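The abstract describes distilling from a teacher whose prediction distribution has been made flatter, so that tail classes receive meaningful probability mass as "virtual examples." As a rough illustration only (not the authors' released implementation), the sketch below shows temperature-scaled knowledge distillation in PyTorch, where a larger temperature flattens the teacher's soft targets; the function name, the `alpha` weighting, and the added hard-label term are assumptions, and DiVE's exact flattening and tailored designs are not reproduced here.

```python
# Minimal sketch, assuming a generic KD setup: flatten the teacher's
# predictions with a temperature and distill them into the student.
import torch
import torch.nn.functional as F

def flattened_distillation_loss(student_logits, teacher_logits, labels,
                                temperature=3.0, alpha=0.5):
    """KD loss with a temperature-flattened teacher distribution (illustrative).

    A higher `temperature` makes the teacher's soft targets flatter, giving
    under-represented tail classes non-negligible mass; `alpha` trades off
    the soft (distillation) term against the hard-label cross-entropy term.
    """
    # Flattened teacher targets: higher T -> flatter distribution.
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)

    # KL divergence between flattened teacher and student, scaled by T^2
    # as in standard knowledge distillation.
    kd_term = F.kl_div(log_student, soft_targets,
                       reduction="batchmean") * temperature ** 2

    # Ordinary cross-entropy on the ground-truth (hard) labels.
    ce_term = F.cross_entropy(student_logits, labels)

    return alpha * kd_term + (1.0 - alpha) * ce_term


# Usage sketch with random tensors standing in for real model outputs.
if __name__ == "__main__":
    student_logits = torch.randn(8, 100)
    teacher_logits = torch.randn(8, 100)
    labels = torch.randint(0, 100, (8,))
    print(flattened_distillation_loss(student_logits, teacher_logits, labels))
```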
Pages: 235-244
Number of pages: 10
Related Papers
34 items in total
  • [21] Geometric Prior Guided Feature Representation Learning for Long-Tailed Classification
    Ma, Yanbiao
    Jiao, Licheng
    Liu, Fang
    Yang, Shuyuan
    Liu, Xu
    Chen, Puhua
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2024, 132 (07) : 2493 - 2510
  • [22] Hybrid ResNet based on joint basic and attention modules for long-tailed classification
    Zhao, Wei
    Su, Yuling
    Hu, Minjie
    Zhao, Hong
    INTERNATIONAL JOURNAL OF APPROXIMATE REASONING, 2022, 150 : 83 - 97
  • [23] Class Activation Maps-based Feature Augmentation for long-tailed classification
    Niu, Jiawei
    Zhang, Zuowei
    Liu, Zhunga
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 249
  • [24] Learning Multi-Expert Distribution Calibration for Long-Tailed Video Classification
    Hu, Yufan
    Gao, Junyu
    Xu, Changsheng
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 555 - 567
  • [25] Weight-guided loss for long-tailed object detection and instance segmentation
    Zhao, Xinqiao
    Xiao, Jimin
    Zhang, Bingfeng
    Zhang, Quan
    Al-Nuaimy, Waleed
    SIGNAL PROCESSING-IMAGE COMMUNICATION, 2023, 110
  • [26] Feature Distribution Representation Learning Based on Knowledge Transfer for Long-Tailed Classification
    Ma, Yanbiao
    Jiao, Licheng
    Liu, Fang
    Yang, Shuyuan
    Liu, Xu
    Chen, Puhua
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 2772 - 2784
  • [27] Imbalance fault diagnosis under long-tailed distribution: Challenges, solutions and prospects
    Chen, Zhuohang
    Chen, Jinglong
    Feng, Yong
    Liu, Shen
    Zhang, Tianci
    Zhang, Kaiyu
    Xiao, Wenrong
    KNOWLEDGE-BASED SYSTEMS, 2022, 258
  • [28] Long-Tailed Traffic Sign Detection Using Attentive Fusion and Hierarchical Group Softmax
    Gao, Erfeng
    Huang, Weiguo
    Shi, Juanjuan
    Wang, Xiang
    Zheng, Jianying
    Du, Guifu
    Tao, Yanyun
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (12) : 24105 - 24115
  • [29] Improving long-tailed classification with PixDyMix: a localized pixel-level mixing method
    Zeng, Wu
    Xiao, Zhengying
    SIGNAL IMAGE AND VIDEO PROCESSING, 2024, 18 (10) : 7157 - 7170
  • [30] Long-Tailed Effect Study in Remote Sensing Semantic Segmentation Based on Graph Kernel Principles
    Cui, Wei
    Feng, Zhanyun
    Chen, Jiale
    Xu, Xing
    Tian, Yueling
    Zhao, Huilin
    Wang, Chenglei
    REMOTE SENSING, 2024, 16 (08)