Multiple Contrastive Experts for long-tailed image classification

Cited by: 1
Authors
Wang, Yandan [1 ]
Sun, Kaiyin [1 ]
Guo, Chenqi [1 ]
Zhong, Shiwei [1 ]
Liu, Huili [1 ]
Ma, Yinglong [1 ]
Affiliations
[1] North China Elect Power Univ, Sch Control & Comp Engn, Beijing 102206, Peoples R China
Keywords
Long-tailed image classification; Loosely coupled ensemble model; Multiple contrastive experts; Supervised contrastive learning;
DOI
10.1016/j.eswa.2024.124613
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Real-world image classification data usually exhibits a challenging long-tailed distribution, attributed to the inherent difficulty of data collection. Existing ensemble approaches predominantly prioritize the empirical diversification of the ensemble model, sidelining representation ability. A noticeable gap also exists in theoretical analysis elucidating the relationship between ensemble model diversity and generalization performance. This paper introduces a loosely coupled ensemble framework, Multiple Contrastive Experts (MCE), tailored for long-tailed image classification, aiming to strengthen image representation while preserving diversity within the ensemble. Leveraging skill-diverse classification losses, the experts in the ensemble model specialize in different kinds of classes in the long-tailed distribution. An adapted supervised contrastive learning (SCL) loss guides the training of each feature learning branch to enhance the representation ability of MCE. Through the effective integration of the losses from all experts, each expert model can be optimized in a coordinated manner. Moreover, the relationship between the generalization ability of MCE and its diversity is theoretically analyzed against a single classification model. Finally, extensive experiments were conducted on five widely used long-tailed image classification datasets. The results show that the proposed MCE is highly competitive with state-of-the-art methods while maintaining a relatively lower computational cost. MCE also exhibits superior performance as the number of experts increases.
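The abstract does not spell out the adapted SCL loss; the adaptation presumably builds on the standard supervised contrastive loss, in which each anchor is pulled toward all same-class samples in the batch and pushed away from the rest. As a minimal sketch (assuming the standard formulation, not the paper's per-branch variant), the base loss can be written as:

```python
import numpy as np

def supcon_loss(features, labels, tau=0.1):
    """Standard supervised contrastive loss over one batch (a sketch,
    not the paper's adapted per-branch version).

    features: (N, D) embeddings (L2-normalized internally)
    labels:   (N,) integer class labels
    tau:      temperature hyperparameter
    """
    features = np.asarray(features, dtype=float)
    labels = np.asarray(labels)
    # L2-normalize so dot products are cosine similarities
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = features @ features.T / tau          # pairwise similarity logits
    np.fill_diagonal(sim, -np.inf)             # exclude self-contrast
    log_denom = np.log(np.exp(sim).sum(axis=1))  # per-anchor log partition
    pos_mask = labels[:, None] == labels[None, :]
    np.fill_diagonal(pos_mask, False)          # positives exclude the anchor
    losses = []
    for i in range(len(labels)):
        pos = np.where(pos_mask[i])[0]
        if len(pos) == 0:
            continue                           # anchor with no positives
        # average negative log-likelihood over the anchor's positives
        losses.append(-np.mean(sim[i, pos] - log_denom[i]))
    return float(np.mean(losses))
```

Intuitively, a batch whose same-class embeddings cluster together yields a lower loss than one where the labels conflict with the embedding geometry, which is what drives each feature branch toward discriminative representations.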
Pages: 12