Single-Head Lifelong Learning Based on Distilling Knowledge

Cited: 4
Authors
Wang, Yen-Hsiang [1 ]
Lin, Chih-Yang [2 ]
Thaipisutikul, Tipajin [3 ]
Shih, Timothy K. [1 ]
Affiliations
[1] Natl Cent Univ, Dept Comp Sci & Informat Engn, Taoyuan 320, Taiwan
[2] Yuan Ze Univ, Dept Elect Engn, Taoyuan 32003, Taiwan
[3] Mahidol Univ, Fac Informat & Commun Technol, Salaya 73170, Thailand
Source
IEEE ACCESS, 2022, Vol. 10
Keywords
Task analysis; Neural networks; Training; Knowledge engineering; Data models; Testing; Predictive models; Lifelong learning; continuous learning; incremental learning; knowledge distillation; IMBALANCED DATA;
DOI
10.1109/ACCESS.2022.3155451
CLC number (Chinese Library Classification)
TP [Automation technology, computer technology];
Discipline classification code
0812 ;
Abstract
Within the machine learning field, the main purpose of lifelong learning, also known as continuous learning, is to enable neural networks to learn continuously, as humans do. Lifelong learning accumulates the knowledge learned from previous tasks and transfers it to support the neural network in future tasks. This technique not only avoids the catastrophic forgetting of previous tasks when training new tasks, but also makes the model more robust to temporal evolution. Motivated by recent advances in lifelong learning, this paper presents a novel feature-based knowledge distillation method that differs from existing knowledge distillation methods in lifelong learning. Specifically, the proposed method utilizes the features from intermediate layers and compresses them in a unique way that involves global average pooling and fully connected layers. The authors then use the output of this branch network to deliver information from previous tasks to the model in future tasks. Extensive experiments show that the proposed model consistently outperforms the state-of-the-art baselines on the accuracy metric by at least two percent under different experimental settings.
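The abstract describes a branch network that compresses intermediate feature maps with global average pooling followed by a fully connected layer, and then matches the branch outputs of the old and new models as a distillation signal. The following is a minimal NumPy sketch of that general idea; all shapes, weight names, and the MSE matching loss are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def global_avg_pool(feat):
    # feat: (channels, height, width) -> (channels,)
    return feat.mean(axis=(1, 2))

def branch_forward(feat, W, b):
    # Branch network: global average pooling, then a fully connected projection.
    pooled = global_avg_pool(feat)
    return W @ pooled + b

def distillation_loss(student_out, teacher_out):
    # Feature-based distillation: penalize divergence between the branch
    # outputs of the old-task (teacher) and new-task (student) models.
    return float(np.mean((student_out - teacher_out) ** 2))

# Hypothetical sizes: 64-channel 8x8 feature map, 16-dim branch output.
C, H, W_spatial, D = 64, 8, 8, 16
W = rng.standard_normal((D, C)) * 0.01  # shared FC weights of the branch
b = np.zeros(D)

teacher_feat = rng.standard_normal((C, H, W_spatial))
student_feat = teacher_feat + 0.1 * rng.standard_normal((C, H, W_spatial))

t_out = branch_forward(teacher_feat, W, b)
s_out = branch_forward(student_feat, W, b)
loss = distillation_loss(s_out, t_out)
```

In training, a term like `loss` would be added to the new task's classification loss so the single-head model retains behavior on earlier tasks while learning the new one.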
Pages: 35469-35478
Page count: 10
Related papers
50 records in total
  • [1] Lifelong Language Learning With the Most Forgotten Knowledge
    Choi, Heejeong
    Kang, Pilsung
    IEEE ACCESS, 2021, 9 : 57941 - 57948
  • [2] Light-Weight Deformable Registration Using Adversarial Learning With Distilling Knowledge
    Tran, Minh Q.
    Tuong Do
    Huy Tran
    Tjiputra, Erman
    Tran, Quang D.
    Anh Nguyen
    IEEE TRANSACTIONS ON MEDICAL IMAGING, 2022, 41 (06) : 1443 - 1453
  • [3] Distilling Knowledge in Federated Learning
    Le, Huy Q.
    Shin, Jong Hoon
    Nguyen, Minh N. H.
    Hong, Choong Seon
    2021 22ND ASIA-PACIFIC NETWORK OPERATIONS AND MANAGEMENT SYMPOSIUM (APNOMS), 2021, : 196 - 201
  • [4] Distilling Knowledge Based on Curriculum Learning for Temporal Knowledge Graph Embeddings
    Zhang, Bin
    Li, Jiayin
    Dai, Yuanfei
    PROCEEDINGS OF THE 33RD ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2024, 2024, : 4248 - 4252
  • [5] Single Image Reflection Removal Based on Knowledge-Distilling Content Disentanglement
    Peng, Yan-Tsung
    Cheng, Kai-Han
    Fang, I-Sheng
    Peng, Wen-Yi
    Wu, Jr-Shian
    IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 568 - 572
  • [6] Lifelong Learning With Adaptive Knowledge Fusion and Class Margin Dynamic Adjustment for Hyperspectral Image Classification
    Jiang, Zihui
    Li, Zhaokui
    Wang, Yan
    Li, Wei
    Wang, Ke
    Tian, Jing
    Wang, Chuanyun
    Du, Qian
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2025, 63
  • [7] Causal Inference with Knowledge Distilling and Curriculum Learning for Unbiased VQA
    Pan, Yonghua
    Li, Zechao
    Zhang, Liyan
    Tang, Jinhui
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2022, 18 (03)
  • [8] Rumors Detection Based on Lifelong Machine Learning
    He, Xiaoxia
    Tuerhong, Gulanbaier
    Wushouer, Mairidan
    Xin, Dai
    IEEE ACCESS, 2022, 10 : 25605 - 25620
  • [9] Knowledge distilling based model compression and feature learning in fault diagnosis
    Zhang, Wenfeng
    Biswas, Gautam
    Zhao, Qi
    Zhao, Hongbo
    Feng, Wenquan
    APPLIED SOFT COMPUTING, 2020, 88
  • [10] Steps toward Lifelong Learning and Knowledge
    Bordeianu, Otilia-Maria
    Morosan-Danila, Lucia
    VISION 2020: SUSTAINABLE GROWTH, ECONOMIC DEVELOPMENT, AND GLOBAL COMPETITIVENESS, VOLS 1-5, 2014, : 1758 - 1766