Class Incremental Learning based on Identically Distributed Parallel One-Class Classifiers

Cited by: 3
Authors
Sun, Wenju [1 ,2 ]
Li, Qingyong [1 ,2 ]
Zhang, Jing [1 ,2 ]
Wang, Wen [1 ,2 ]
Geng, YangLi-ao [1 ,2 ]
Affiliations
[1] Beijing Jiaotong Univ, Key Lab Big Data & Artificial Intelligence Transpo, Minist Educ, Beijing 100044, Peoples R China
[2] Beijing Jiaotong Univ, Sch Comp & Informat Technol, Beijing 100044, Peoples R China
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
Incremental learning; Continual learning; Lifelong learning; One-class learning; Image classification; SUPPORT;
DOI
10.1016/j.neucom.2023.126579
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Class incremental learning requires models to learn new-class knowledge without forgetting old-class information. As a natural solution, the parallel one-class framework (POC) has attracted extensive attention. However, POC is prone to a lack of comparability between its classifiers, because their output distributions are inconsistent. To address this drawback, we propose an incremental learning method based on Identically Distributed Parallel One-class Classifiers (IDPOC). The core of IDPOC is a novel one-class classifier with Gaussian-distributed output, referred to as Deep-SVD2D. Deep-SVD2D encourages the distribution of sample representations to follow the standard multivariate Gaussian. Consequently, the distance between a representation and its class center approximately follows a chi-square distribution with a certain number of degrees of freedom. IDPOC further eliminates the degrees of freedom so that the outputs of all classifiers follow an identical distribution, thus enhancing the comparability between different classifiers. We evaluate IDPOC on four popular benchmarks: MNIST, CIFAR10, CIFAR100, and Tiny-ImageNet. The experimental results show that IDPOC achieves state-of-the-art performance, e.g., it outperforms the best baseline by 1.6% and 2.8% on the two large-scale benchmarks CIFAR100 and Tiny-ImageNet, respectively.
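The chi-square argument in the abstract can be illustrated with a small simulation. This is a hedged sketch, not the authors' code: we assume each classifier maps samples to d-dimensional representations that are standard-Gaussian around the class center, so the squared distance to the center is chi-square distributed with d degrees of freedom. For d = 2 the chi-square CDF has the closed form 1 - exp(-x / 2), which lets us map raw distances onto a common [0, 1) scale, the kind of identically distributed output that makes parallel one-class classifiers comparable. The function names and the choice d = 2 are illustrative assumptions.

```python
import math
import random

def squared_distance(z, center):
    # Squared Euclidean distance between a representation and its class center.
    return sum((a - b) ** 2 for a, b in zip(z, center))

def chi2_cdf_2dof(x):
    # CDF of the chi-square distribution with 2 degrees of freedom.
    return 1.0 - math.exp(-x / 2.0)

random.seed(0)
d = 2
center = [0.0] * d

# Simulate in-class representations drawn from N(center, I).
scores = []
for _ in range(10000):
    z = [random.gauss(0.0, 1.0) for _ in range(d)]
    scores.append(chi2_cdf_2dof(squared_distance(z, center)))

# If the squared distance is truly chi-square(2), the transformed scores
# are approximately Uniform(0, 1), i.e. every classifier's output lives
# on the same scale regardless of its raw distance magnitudes.
mean_score = sum(scores) / len(scores)
print(round(mean_score, 2))  # close to 0.5 for uniform scores
```

The same idea generalizes to any d by replacing the closed-form CDF with the regularized incomplete gamma function; eliminating the dependence on d is, per the abstract, the additional step IDPOC takes.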
Pages: 10
Related Papers (50 total; items [31]-[40] shown)
  • [31] Multilayer one-class extreme learning machine
    Dai, Haozhen
    Cao, Jiuwen
    Wang, Tianlei
    Deng, Muqing
    Yang, Zhixin
    NEURAL NETWORKS, 2019, 115 (11-22) : 11 - 22
  • [32] Visual Object Detection Using Cascades of Binary and One-Class Classifiers
    Cevikalp, Hakan
    Triggs, Bill
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2017, 123 (03) : 334 - 349
  • [33] Interactive definition and tuning of One-Class classifiers for Document Image Classification
    Girard, Nathalie
    Trullo, Roger
    Barrat, Sabine
    Ragot, Nicolas
    Ramel, Jean-Yves
    PROCEEDINGS OF 12TH IAPR WORKSHOP ON DOCUMENT ANALYSIS SYSTEMS, (DAS 2016), 2016, : 358 - 363
  • [34] A novel incremental one-class support vector machine based on low variance direction
    Kefi-Fatteh, Takoua
    Ksantini, Riadh
    Kaaniche, Mohamed-Becha
    Bouhoula, Adel
    PATTERN RECOGNITION, 2019, 91 : 308 - 321
  • [35] Incremental weighted one-class classifier for mining stationary data streams
    Krawczyk, Bartosz
    Wozniak, Michal
    JOURNAL OF COMPUTATIONAL SCIENCE, 2015, 9 : 19 - 25
  • [36] One-Class Convex Hull-Based Algorithm for Classification in Distributed Environments
    Fernandez-Francos, Diego
    Fontenla-Romero, Oscar
    Alonso-Betanzos, Amparo
    IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, 2020, 50 (02): : 386 - 396
  • [37] One-class learning and concept summarization for data streams
    Xingquan Zhu
    Wei Ding
    Philip S. Yu
    Chengqi Zhang
    Knowledge and Information Systems, 2011, 28 : 523 - 553
  • [38] MAHALANOBIS-BASED ONE-CLASS CLASSIFICATION
    Nader, Patric
    Honeine, Paul
    Beauseroy, Pierre
    2014 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2014,
  • [39] One-Class Learning Time-Series Shapelets
    Yamaguchi, Akihiro
    Nishikawa, Takeichiro
    2018 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2018, : 2365 - 2372
  • [40] Multi-task Learning for One-class Classification
    Yang, Haiqin
    King, Irwin
    Lyu, Michael R.
    2010 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS IJCNN 2010, 2010,