Distilled Meta-learning for Multi-Class Incremental Learning

Cited by: 2
Authors
Liu, Hao [1 ]
Yan, Zhaoyu [1 ]
Liu, Bing [1 ]
Zhao, Jiaqi [1 ]
Zhou, Yong [1 ]
El Saddik, Abdulmotaleb [2 ]
Affiliations
[1] China Univ Min & Technol, 1 Daxue Rd, Xuzhou 221116, Jiangsu, Peoples R China
[2] Univ Ottawa, Sch Elect Engn & Comp Sci, 800 King Edward, Ottawa, ON K1N 6N5, Canada
Funding
National Natural Science Foundation of China;
Keywords
Incremental learning; meta-learning; knowledge distillation; catastrophic forgetting; stability-plasticity dilemma;
DOI
10.1145/3576045
Chinese Library Classification (CLC)
TP [Automation and Computer Technology];
Subject Classification Code
0812;
Abstract
Meta-learning approaches have recently achieved promising performance in multi-class incremental learning. However, meta-learners still suffer from catastrophic forgetting: they tend to forget knowledge learned from old tasks while rapidly adapting to the new classes of the current task. To address this problem, we propose a novel distilled meta-learning (DML) framework for multi-class incremental learning that seamlessly integrates meta-learning with knowledge distillation at each incremental stage. Specifically, during inner-loop training, knowledge distillation is incorporated into DML to counteract catastrophic forgetting. During outer-loop training, a meta-update rule is designed so that the meta-learner learns across tasks and quickly adapts to new ones. By virtue of this bilevel optimization, the model is encouraged to strike a balance between retaining old knowledge and learning new knowledge. Experimental results on four benchmark datasets demonstrate the effectiveness of the proposed method and show that it significantly outperforms other state-of-the-art incremental learning methods.
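The abstract describes a bilevel scheme: an inner loop that adapts to new classes while a distillation term tethers predictions on old classes to a frozen copy of the previous-stage model, and an outer loop that meta-updates the shared initialization across tasks. Below is a minimal PyTorch-style sketch of one incremental stage under that description; the names (dml_stage, kd_loss), the loss weight lam, the temperature T, and the first-order Reptile-style outer update are illustrative assumptions, not the authors' implementation (the abstract does not specify the exact meta-update rule).

# Hypothetical sketch (not the authors' released code) of one DML incremental
# stage: inner-loop adaptation with knowledge distillation, outer-loop meta-update.
import copy
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=2.0):
    # Hinton-style distillation loss over the old-class logits.
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

def dml_stage(meta_model, old_model, task_loader, meta_opt,
              inner_steps=5, inner_lr=0.01, lam=1.0, n_old=0):
    # old_model is a frozen copy of the network from previous stages (the teacher).
    old_model.eval()
    for x, y in task_loader:  # each batch plays the role of a new-class task
        learner = copy.deepcopy(meta_model)  # fast weights start at the meta-init
        inner_opt = torch.optim.SGD(learner.parameters(), lr=inner_lr)
        for _ in range(inner_steps):  # inner loop: plasticity + stability
            logits = learner(x)
            loss = F.cross_entropy(logits, y)
            if n_old > 0:  # distill only on logits of previously seen classes
                with torch.no_grad():
                    old_logits = old_model(x)[:, :n_old]
                loss = loss + lam * kd_loss(logits[:, :n_old], old_logits)
            inner_opt.zero_grad()
            loss.backward()
            inner_opt.step()
        # Outer loop: a first-order (Reptile-style) meta-update, one plausible
        # instantiation of "learning across tasks". Setting grad = (init - adapted)
        # and taking an SGD step moves the meta-init toward the adapted weights.
        meta_opt.zero_grad()
        for p_meta, p_fast in zip(meta_model.parameters(), learner.parameters()):
            p_meta.grad = p_meta.data - p_fast.data
        meta_opt.step()

With a meta_opt such as torch.optim.SGD(meta_model.parameters(), lr=0.1), repeating dml_stage over successive class increments (refreshing old_model and n_old after each stage) mirrors the stability-plasticity trade-off the abstract describes.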
Pages: 16
Related Papers
50 records in total
  • [1] Meta-learning for fast incremental learning
    Oohira, T
    Yamauchi, K
    Omori, T
    ARTIFICIAL NEURAL NETWORKS AND NEURAL INFORMATION PROCESSING - ICANN/ICONIP 2003, 2003, 2714 : 157 - 164
  • [2] A Novel Incremental Class Learning Technique for Multi-class Classification
    Er, Meng Joo
    Yalavarthi, Vijaya Krishna
    Wang, Ning
    Venkatesan, Rajasekar
    ADVANCES IN NEURAL NETWORKS - ISNN 2016, 2016, 9719 : 474 - 481
  • [3] MetaFSCIL: A Meta-Learning Approach for Few-Shot Class-Incremental Learning
    Chi, Zhixiang
    Gu, Li
    Liu, Huan
    Wang, Yang
    Yu, Yuanhao
    Tang, Jin
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 14146 - 14155
  • [4] MetaZSCIL: A Meta-Learning Approach for Generalized Zero-Shot Class Incremental Learning
    Wu, Yanan
    Liang, Tengfei
    Feng, Songhe
    Jin, Yi
    Lyu, Gengyu
    Fei, Haojun
    Wang, Yang
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37, NO 9, 2023 : 10408 - 10416
  • [5] CILDI: Class Incremental Learning with Distilled Images
    Zacarias, Abel
    Alexandre, Luis A.
    2022 9TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE, ISCMI, 2022, : 157 - 161
  • [6] Incremental Learning and Novelty Detection of Gestures in a Multi-Class System
    Al-Behadili, Husam
    Grumpe, Arne
    Woehler, Christian
    2015 THIRD INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE, MODELLING AND SIMULATION (AIMS 2015), 2015, : 304 - 309
  • [7] Meta-Seg: A Generalized Meta-Learning Framework for Multi-Class Few-Shot Semantic Segmentation
    Cao, Zhiying
    Zhang, Tengfei
    Diao, Wenhui
    Zhang, Yue
    Lyu, Xiaode
    Fu, Kun
    Sun, Xian
    IEEE ACCESS, 2019, 7 : 166109 - 166121
  • [8] Meta-learning for real-world class incremental learning: a transformer-based approach
    Kumar, Sandeep
    Sharma, Amit
    Shokeen, Vikrant
    Azar, Ahmad Taher
    Amin, Syed Umar
    Khan, Zafar Iqbal
    SCIENTIFIC REPORTS, 2024, 14 (01):
  • [9] A meta-learning network method for few-shot multi-class classification problems with numerical data
    Wu, Lang
    COMPLEX & INTELLIGENT SYSTEMS, 2024, 10 (02) : 2639 - 2652