PILOT: a pre-trained model-based continual learning toolbox

Cited: 0
Authors
Hai-Long Sun [1,2]
Da-Wei Zhou [1,2]
De-Chuan Zhan [1,2]
Han-Jia Ye [1,2]
Affiliations
[1] School of Artificial Intelligence, Nanjing University
[2] National Key Laboratory for Novel Software Technology, Nanjing
Keywords
DOI
None available
Chinese Library Classification (CLC)
TP18 [theory of artificial intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The rapid advancements in deep learning have resulted in significant achievements across various fields. However, our ever-changing world often presents training data in a streaming format from an open environment. For example, while ChatGPT demonstrates exceptional inference capabilities, it struggles to provide users with the most up-to-date information. This challenge arises from the high cost of retraining a GPT model on new data every day.
Pages: 383-384 (2 pages)
Related papers (50 in total)
  • [1] PILOT: a pre-trained model-based continual learning toolbox
    Sun, Hai-Long
    Zhou, Da-Wei
    Zhan, De-Chuan
    Ye, Han-Jia
    SCIENCE CHINA-INFORMATION SCIENCES, 2025, 68 (04)
  • [2] Continual learning with Bayesian model based on a fixed pre-trained feature extractor
    Yang, Yang
    Cui, Zhiying
    Xu, Junjie
    Zhong, Changhong
    Zheng, Wei-Shi
    Wang, Ruixuan
    VISUAL INTELLIGENCE, 1 (1)
  • [3] Continual Learning with Bayesian Model Based on a Fixed Pre-trained Feature Extractor
    Yang, Yang
    Cui, Zhiying
    Xu, Junjie
    Zhong, Changhong
    Wang, Ruixuan
    Zheng, Wei-Shi
    MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION - MICCAI 2021, PT V, 2021, 12905 : 397 - 406
  • [4] Continual Learning with Pre-Trained Models: A Survey
    Zhou, Da-Wei
    Sun, Hai-Long
    Ning, Jingyi
    Ye, Han-Jia
    Zhan, De-Chuan
    PROCEEDINGS OF THE THIRTY-THIRD INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2024, 2024, : 8363 - 8371
  • [5] SLCA: Slow Learner with Classifier Alignment for Continual Learning on a Pre-trained Model
    Zhang, Gengwei
    Wang, Liyuan
    Kang, Guoliang
    Chen, Ling
    Wei, Yunchao
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 19091 - 19101
  • [6] RanPAC: Random Projections and Pre-trained Models for Continual Learning
    McDonnell, Mark D.
    Gong, Dong
    Parvaneh, Amin
    Abbasnejad, Ehsan
    van den Hengel, Anton
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [7] Do Pre-trained Models Benefit Equally in Continual Learning?
    Lee, Kuan-Ying
    Zhong, Yuanyi
    Wang, Yu-Xiong
    2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 6474 - 6482
  • [8] Pre-Trained Language Model-Based Deep Learning for Sentiment Classification of Vietnamese Feedback
    Loc, Cu Vinh
    Viet, Truong Xuan
    Viet, Tran Hoang
    Thao, Le Hoang
    Viet, Nguyen Hoang
    INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE AND APPLICATIONS, 2023, 22 (03)
  • [9] Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning
    Zhou, Da-Wei
    Sun, Hai-Long
    Ye, Han-Jia
    Zhan, De-Chuan
    2024 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2024, : 23554 - 23564
  • [10] Pre-trained Language Model-based Retrieval and Ranking for Web Search
    Zou, Lixin
    Lu, Weixue
    Liu, Yiding
    Cai, Hengyi
    Chu, Xiaokai
    Ma, Dehong
    Shi, Daiting
    Sun, Yu
    Cheng, Zhicong
    Gu, Simiu
    Wang, Shuaiqiang
    Yin, Dawei
    ACM TRANSACTIONS ON THE WEB, 2023, 17 (01)