Efficient Perturbation Inference and Expandable Network for continual learning

Cited by: 9
Authors
Du, Fei [1 ]
Yang, Yun [2 ]
Zhao, Ziyuan [3 ]
Zeng, Zeng [3 ,4 ]
Affiliations
[1] Yunnan Univ, Sch Informat Sci & Engn, Kunming 650091, Peoples R China
[2] Yunnan Univ, Natl Pilot Sch Software, Kunming 650091, Peoples R China
[3] ASTAR, Inst Infocomm Res I2R, Singapore 138632, Singapore
[4] Shanghai Univ, Sch Microelect, Shanghai, Peoples R China
Keywords
Continual learning; Dynamic networks; Class incremental learning; Uncertainty inference;
DOI
10.1016/j.neunet.2022.10.030
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Although humans are capable of learning new tasks without forgetting previous ones, most neural networks fail to do so because learning new tasks can override the knowledge acquired from previous data. In this work, we alleviate this issue by proposing a novel Efficient Perturbation Inference and Expandable Network (EPIE-Net), which dynamically expands lightweight task-specific decoders for new classes and utilizes a mixed-label uncertainty strategy to improve robustness. Moreover, we average the predicted probabilities of perturbed samples at inference, which generally improves the performance of the model. Experimental results show that our method consistently outperforms other methods with fewer parameters on class incremental learning benchmarks. For example, on the CIFAR100 10-step setup, our method achieves an average accuracy of 76.33% and a last-step accuracy of 65.93% with only 3.46M parameters on average. (c) 2022 Published by Elsevier Ltd.
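The perturbation-inference idea in the abstract (averaging class probabilities over several perturbed copies of an input) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the perturbation scheme, the noise scale `noise_std`, the count `n_perturb`, and the toy linear `model_fn` are all assumptions made here for demonstration.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def perturbation_inference(model_fn, x, n_perturb=5, noise_std=0.05, seed=0):
    """Average class probabilities over noise-perturbed copies of x.

    model_fn, n_perturb, and noise_std are illustrative stand-ins; the
    paper's exact perturbation strategy may differ.
    """
    rng = np.random.default_rng(seed)
    probs = []
    for _ in range(n_perturb):
        x_noisy = x + rng.normal(0.0, noise_std, size=x.shape)
        probs.append(softmax(model_fn(x_noisy)))
    # Averaging smooths out prediction noise from any single perturbation.
    return np.mean(probs, axis=0)

# Toy two-class linear "model" for demonstration only.
W = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
pred = perturbation_inference(lambda x: x @ W, np.array([1.0, 0.0]))
```

Because the averaged output is a mean of valid probability vectors, it still sums to one, so it can be used directly as the model's final class distribution.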
Pages: 97-106
Number of pages: 10