Efficient Perturbation Inference and Expandable Network for continual learning

Citations: 9
Authors
Du, Fei [1 ]
Yang, Yun [2 ]
Zhao, Ziyuan [3 ]
Zeng, Zeng [3 ,4 ]
Affiliations
[1] Yunnan Univ, Sch Informat Sci & Engn, Kunming 650091, Peoples R China
[2] Yunnan Univ, Natl Pilot Sch Software, Kunming 650091, Peoples R China
[3] ASTAR, Inst Infocomm Res I2R, Singapore 138632, Singapore
[4] Shanghai Univ, Sch Microelect, Shanghai, Peoples R China
Keywords
Continual learning; Dynamic networks; Class incremental learning; Uncertainty inference;
DOI
10.1016/j.neunet.2022.10.030
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Although humans are capable of learning new tasks without forgetting previous ones, most neural networks fail to do so because learning new tasks can overwrite the knowledge acquired from previous data. In this work, we alleviate this issue by proposing a novel Efficient Perturbation Inference and Expandable Network (EPIE-Net), which dynamically expands lightweight task-specific decoders for new classes and uses a mixed-label uncertainty strategy to improve robustness. Moreover, at inference we average the class probabilities of perturbed samples, which generally improves model performance. Experimental results show that our method consistently outperforms other methods with fewer parameters on class incremental learning benchmarks. For example, on the CIFAR100 10-step setup, our method achieves an average accuracy of 76.33% and a last-step accuracy of 65.93% with an average of only 3.46M parameters. © 2022 Published by Elsevier Ltd.
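The perturbation-averaged inference step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: the function names, the use of Gaussian input noise as the perturbation, and the number of perturbed copies are all assumptions for the sketch.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def perturbation_inference(model, x, n_perturb=8, sigma=0.1, rng=None):
    """Average class probabilities over randomly perturbed copies of x.

    `model` maps an input array to class logits. Gaussian input noise is
    an assumed perturbation here; the paper's scheme may differ.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    probs = [softmax(model(x + sigma * rng.standard_normal(x.shape)))
             for _ in range(n_perturb)]
    return np.mean(probs, axis=0)

# Toy linear "model" standing in for the expandable network's decoders.
W = np.array([[1.0, -1.0],
              [0.5, 0.5]])
model = lambda x: x @ W

avg_p = perturbation_inference(model, np.array([1.0, 0.0]), n_perturb=16)
pred = int(avg_p.argmax())
```

Averaging the probabilities (rather than a single forward pass) smooths out prediction noise near decision boundaries, which is the intuition behind the reported accuracy gain.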
Pages: 97 - 106 (10 pages)