Central-Diffused Instance Generation Method in Class Incremental Learning

Cited by: 0
Authors
Liu, Mingyu [1 ]
Wang, Yijie [1 ]
Affiliations
[1] Natl Univ Def Technol, Coll Comp, Sci & Technol Parallel & Distributed Lab, Changsha, Peoples R China
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: DEEP LEARNING, PT II | 2019 / Vol. 11728
Funding
Science Foundation of the Ministry of Education of China; National Natural Science Foundation of China;
Keywords
Machine learning; Class incremental learning; Class imbalance learning; Supervised learning; CLASSIFICATION;
DOI
10.1007/978-3-030-30484-3_37
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Class incremental learning is widely applied in classification scenarios because the number of classes usually changes dynamically. Moreover, class imbalance often arises in class incremental learning when a new class emerges. Previous studies have proposed various methods to handle this problem, but these methods focus on classification tasks with a fixed class set and cannot adjust the peripheral contour features of the original instance distribution. As a result, classification performance degrades seriously in an open, dynamic environment, and the synthetic instances always cluster within the original distribution. To address class imbalance effectively in class incremental learning, we propose a Central-diffused Instance Generation Method (CdIGM) that generates instances of the minority class as a new class emerges. The key idea is to randomly shoot direction vectors of fixed length from the center of the new-class instances so as to expand the instance distribution space. The diffused vectors form a distribution that is optimized to satisfy properties yielding a multi-class discriminative classifier with good performance. We conduct experiments on both artificial data streams with different imbalance rates and real-world ones to compare CdIGM with previously proposed methods, e.g., SMOTE, OPCIL, OB, and SDCIL. The results show that CdIGM achieves average performance improvements of more than 4.01%, 4.49%, 8.81%, and 9.76% over SMOTE, OPCIL, OB, and SDCIL, respectively, and outperforms them in both overall and real-time accuracy. Our method is shown to combine the strengths of class incremental learning and class imbalance learning with good accuracy and robustness.
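The core generation step described in the abstract, shooting fixed-length direction vectors at random from the center of the new class's instances, can be sketched as follows. This is a minimal illustration of the central-diffusion idea only; the function name, the `radius` parameter, and the Gaussian-direction sampling are assumptions, and the paper's subsequent optimization of the diffused distribution is not reproduced here.

```python
import numpy as np

def central_diffused_generate(minority_X, n_new, radius, rng=None):
    """Sketch of central-diffused instance generation: synthesize
    minority-class instances by shooting random direction vectors of a
    fixed length from the center of the new-class instances.

    minority_X : (n, d) array of existing minority-class instances
    n_new      : number of synthetic instances to generate
    radius     : fixed length of each diffusion vector (illustrative)
    """
    rng = np.random.default_rng(rng)
    center = minority_X.mean(axis=0)  # center of the new class
    # Sample Gaussian vectors and normalize them to unit length,
    # which gives uniformly random directions on the sphere.
    dirs = rng.normal(size=(n_new, minority_X.shape[1]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    # Diffuse outward from the center by the fixed length.
    return center + radius * dirs

# Example: 5 minority instances in 2-D, 100 synthetic instances
X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
                  [1.0, 1.0], [0.5, 0.5]])
X_syn = central_diffused_generate(X_min, n_new=100, radius=2.0, rng=0)
```

Because every synthetic point lies at distance `radius` from the class center, the generated instances sit on the periphery of the distribution rather than clustering inside it, which is the property the abstract contrasts with SMOTE-style interpolation.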
Pages: 453-465
Page count: 13
References
23 in total
[1] [Anonymous], 2014, IEEE T PARALLEL DIST.
[2] [Anonymous], 2017, IEEE INT C DAT MIN.
[3] Chawla, Nitesh V.; Bowyer, Kevin W.; Hall, Lawrence O.; Kegelmeyer, W. Philip. SMOTE: Synthetic minority over-sampling technique. 2002, American Association for Artificial Intelligence (16).
[4] Huang, Chen; Li, Yining; Loy, Chen Change; Tang, Xiaoou. Learning Deep Representation for Imbalanced Classification. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016: 5375-5384.
[5] Khan, Shehroz S.; Madden, Michael G. One-class classification: taxonomy of study and review of techniques. Knowledge Engineering Review, 2014, 29(03): 345-374.
[6] Khoshgoftaar, Taghi M.; Golawala, Moiz; Van Hulse, Jason. An empirical study of learning from imbalanced data using random forest. 19th IEEE International Conference on Tools with Artificial Intelligence, Vol. II, Proceedings, 2007: 310-317.
[7] Kuzborskij, Ilja; Orabona, Francesco; Caputo, Barbara. From N to N+1: Multiclass Transfer Incremental Learning. 2013 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2013: 3358-3365.
[8] Lake, B. M., 2013, INT C NEUR INF PROC.
[9] Li, F. F.; Fergus, R.; Perona, P. One-shot learning of object categories. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2006, 28(04): 594-611.
[10] Luo, G. K., 2018, IEEE T CONTR SYST T, p. 1. DOI: 10.1109/TCST.2018.2864723.