Parameter-Free Loss for Class-Imbalanced Deep Learning in Image Classification

Cited by: 32
Authors
Du, Jie [1 ,2 ,3 ]
Zhou, Yanhong [1 ,2 ,3 ]
Liu, Peng [4 ]
Vong, Chi-Man [4 ]
Wang, Tianfu [1 ,2 ,3 ]
Affiliations
[1] Shenzhen Univ, Sch Biomed Engn, Hlth Sci Ctr, Shenzhen 518060, Peoples R China
[2] Shenzhen Univ, Natl Reg Key Technol Engn Lab Med Ultrasound, Shenzhen 518060, Peoples R China
[3] Shenzhen Univ, Marshall Lab Biomed Engn, Shenzhen 518060, Peoples R China
[4] Univ Macau, Dept Comp & Informat Sci, Macau 999078, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Training; Tuning; Deep learning; Data models; Task analysis; Learning systems; Focusing; Class-imbalanced deep learning; dynamic changes; hyperparameter tuning; loss function; parameter-free;
DOI
10.1109/TNNLS.2021.3110885
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Code
081104; 0812; 0835; 1405;
Abstract
Current state-of-the-art class-imbalanced loss functions for deep models require exhaustive hyperparameter tuning to reach high performance, resulting in low training efficiency and impracticality for nonexpert users. To tackle this issue, a parameter-free loss (PF-loss) function is proposed, which works for both binary and multiclass imbalanced deep learning on image classification tasks. PF-loss provides three advantages: 1) training time is significantly reduced because no hyperparameter tuning is required; 2) it dynamically pays more attention to minority classes (rather than to outliers, as existing loss functions do) with no hyperparameters in the loss function; and 3) higher accuracy can be achieved since it adapts to the changing data distribution in each mini-batch during training, instead of relying on the fixed hyperparameters of existing methods, especially when the data are highly skewed. Experimental results on classical image datasets with different imbalance ratios (IR, up to 200) show that PF-loss reduces training time to as little as 1/148 of that spent by the compared state-of-the-art losses while achieving comparable or even higher accuracy in terms of both G-mean and area under the receiver operating characteristic (ROC) curve (AUC), especially when the data are highly skewed.
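Since the abstract describes a loss that re-weights classes from the data distribution observed in each mini-batch with no tunable hyperparameters, the following is a minimal illustrative sketch of that general idea in PyTorch. It is not the authors' PF-loss formula (which is not given in this record); it only shows a parameter-free, per-mini-batch inverse-frequency re-weighting of cross-entropy, and the function name minibatch_balanced_ce is hypothetical.

import torch
import torch.nn.functional as F

def minibatch_balanced_ce(logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    """Cross-entropy re-weighted by inverse class frequency computed on the
    current mini-batch. Illustrative only: this is NOT the PF-loss formula,
    just a parameter-free, per-batch re-weighting in the same spirit.

    logits:  (N, C) raw class scores
    targets: (N,)   integer class labels in [0, C)
    """
    num_classes = logits.size(1)
    # Per-batch class counts; clamp to 1 so classes absent from the batch
    # do not cause a division by zero.
    counts = torch.bincount(targets, minlength=num_classes).clamp(min=1).float()
    # Inverse-frequency weights, scaled so that every weight equals 1 when
    # the batch is perfectly balanced.
    weights = counts.sum() / (num_classes * counts)
    # Standard weighted cross-entropy; weights are recomputed every batch,
    # so minority classes in the current batch receive larger gradients.
    return F.cross_entropy(logits, targets, weight=weights)

# Example usage with random data (8 samples, 3 classes, class 0 dominant):
if __name__ == "__main__":
    torch.manual_seed(0)
    logits = torch.randn(8, 3, requires_grad=True)
    targets = torch.tensor([0, 0, 0, 0, 0, 1, 1, 2])
    loss = minibatch_balanced_ce(logits, targets)
    loss.backward()
    print(f"loss = {loss.item():.4f}")

Because the weights are derived entirely from the current batch, this kind of scheme adapts as the sampled class distribution changes, which is the property the abstract emphasizes; the actual PF-loss may differ substantially in how it achieves this.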
Pages: 3234-3240
Page count: 7