Single-Head Lifelong Learning Based on Distilling Knowledge

Cited by: 4
Authors
Wang, Yen-Hsiang [1 ]
Lin, Chih-Yang [2 ]
Thaipisutikul, Tipajin [3 ]
Shih, Timothy K. [1 ]
Affiliations
[1] Natl Cent Univ, Dept Comp Sci & Informat Engn, Taoyuan 320, Taiwan
[2] Yuan Ze Univ, Dept Elect Engn, Taoyuan 32003, Taiwan
[3] Mahidol Univ, Fac Informat & Commun Technol, Salaya 73170, Thailand
Source
IEEE ACCESS, 2022, Vol. 10
Keywords
Task analysis; Neural networks; Training; Knowledge engineering; Data models; Testing; Predictive models; Lifelong learning; continuous learning; incremental learning; knowledge distillation; IMBALANCED DATA;
DOI
10.1109/ACCESS.2022.3155451
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Within the machine learning field, the main purpose of lifelong learning, also known as continuous learning, is to enable neural networks to learn continuously, as humans do. Lifelong learning accumulates the knowledge learned from previous tasks and transfers it to support the neural network on future tasks. This technique not only avoids catastrophic forgetting of previous tasks when training on new tasks, but also makes the model more robust as tasks evolve over time. Motivated by recent advances in lifelong learning, this paper presents a novel feature-based knowledge distillation method that differs from existing knowledge distillation methods in lifelong learning. Specifically, our proposed method takes the features from intermediate layers and compresses them in a unique way that involves global average pooling and fully connected layers. We then use the output of this branch network to deliver information from previous tasks to the model on future tasks. Extensive experiments show that our proposed model consistently outperforms the state-of-the-art baselines, improving accuracy by at least two percent under different experimental settings.
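The branch described in the abstract (intermediate-layer features compressed by global average pooling and a fully connected layer, whose output carries knowledge from the previous-task model) can be sketched as follows. This is a minimal PyTorch illustration, not the paper's implementation: the class and function names (`DistillBranch`, `feature_distillation_loss`), the embedding width, and the MSE matching loss are all assumptions for the sake of example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DistillBranch(nn.Module):
    """Compress an intermediate feature map with global average pooling
    followed by a fully connected layer (hypothetical dimensions; the
    paper's exact branch architecture may differ)."""
    def __init__(self, in_channels: int, embed_dim: int = 128):
        super().__init__()
        self.fc = nn.Linear(in_channels, embed_dim)

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        # feat: (N, C, H, W) -> global average pool -> (N, C)
        pooled = F.adaptive_avg_pool2d(feat, 1).flatten(1)
        return self.fc(pooled)  # (N, embed_dim)

def feature_distillation_loss(student_feat, teacher_feat, branch):
    """Match the branch output on the current model's features against
    the branch output on the frozen previous-task model's features."""
    with torch.no_grad():
        target = branch(teacher_feat)  # previous-task knowledge
    return F.mse_loss(branch(student_feat), target)

# Toy usage: 4 samples, 64-channel 8x8 intermediate feature maps.
branch = DistillBranch(in_channels=64)
s = torch.randn(4, 64, 8, 8)  # features from the model being trained
t = torch.randn(4, 64, 8, 8)  # features from the frozen old model
loss = feature_distillation_loss(s, t, branch)
```

In training, this distillation loss would be added to the new task's classification loss so that the compressed feature representation of earlier tasks is preserved while new tasks are learned.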
Pages: 35469-35478
Page count: 10
Related Papers
50 items in total
  • [11] Catastrophic Interference in Reinforcement Learning: A Solution Based on Context Division and Knowledge Distillation
    Zhang, Tiantian
    Wang, Xueqian
    Liang, Bin
    Yuan, Bo
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (12) : 9925 - 9939
  • [12] Generous teacher: Good at distilling knowledge for student learning
    Ding, Yifeng
    Yang, Gaoming
    Yin, Shuting
    Zhang, Ji
    Fang, Xianjin
    Yang, Wencheng
    IMAGE AND VISION COMPUTING, 2024, 150
  • [13] The Lifelong Learning in the University: Learning Networks and Knowledge Transferring
    Ardimento, Pasquale
    Boffoli, Nicola
    Convertini, Vito Nicola
    Visaggio, Giuseppe
    JOURNAL OF E-LEARNING AND KNOWLEDGE SOCIETY, 2011, 7 (01): : 21 - +
  • [14] Lifelong Learning in the Knowledge Economy: Considerations on the Lifelong Learning System in Romania from a European Perspective
    Popescu, Alina Irina
    REVISTA DE CERCETARE SI INTERVENTIE SOCIALA, 2012, 37 : 49 - 76
  • [15] Models of Lifelong Learning and the 'Knowledge Society' in Europe
    Green, Andy
    PAPERS-REVISTA DE SOCIOLOGIA, 2011, 96 (04): : 1075 - 1096
  • [16] Academy's Role in Fostering Lifelong Learning and Self-Development in a Knowledge-Based Society
    Dorfman-Furman, Gabriela
    VIII IEEE WORLD ENGINEERING EDUCATION CONFERENCE, EDUNINE 2024, 2024,
  • [17] PSYCHOLOGY, ADVERTISING AND LIFELONG LEARNING IN THE KNOWLEDGE SOCIETY
    Stefanescu, Ramona Olivia
    Mustata, Marinel-Adi
    ANYWHERE, ANYTIME - EDUCATION ON DEMAND, VOL II, 2011, : 160 - 166
  • [18] LD-PA: Distilling Univariate Leakage for Deep Learning-Based Profiling Attacks
    Xiao, Chong
    Tang, Ming
    Karayalcin, Sengim
    Cheng, Wei
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2025, 20 : 17 - 30
  • [19] Rectification-Based Knowledge Retention for Task Incremental Learning
    Mazumder, Pratik
    Singh, Pravendra
    Rai, Piyush
    Namboodiri, Vinay P.
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (03) : 1561 - 1575
  • [20] Task-Based Neuromodulation Architecture for Lifelong Learning
    Daram, Anurag Reddy
    Kudithipudi, Dhireesha
    Yanguas-Gil, Angel
    PROCEEDINGS OF THE 2019 20TH INTERNATIONAL SYMPOSIUM ON QUALITY ELECTRONIC DESIGN (ISQED), 2019, : 191 - 197