Continual learning in the presence of repetition

Citations: 0
Authors
Hemati, Hamed [1 ]
Pellegrini, Lorenzo [2 ]
Duan, Xiaotian [3 ,4 ]
Zhao, Zixuan [3 ,4 ]
Xia, Fangfang [3 ,4 ]
Masana, Marc [5 ,6 ]
Tscheschner, Benedikt [5 ,7 ]
Veas, Eduardo [5 ,7 ]
Zheng, Yuxiang [8 ]
Zhao, Shiji [8 ]
Li, Shao-Yuan [8 ]
Huang, Sheng-Jun [8 ]
Lomonaco, Vincenzo [9 ]
van de Ven, Gido M. [10 ]
Affiliations
[1] Univ St Gallen, Inst Comp Sci, Rosenbergstr 30, CH-9000 St Gallen, Switzerland
[2] Univ Bologna, Dept Comp Sci, Via Univ 50, I-47521 Cesena, Italy
[3] Univ Chicago, 5801 S Ellis Ave, Chicago, IL 60637 USA
[4] Argonne Natl Lab, 9700 S Cass Ave, Lemont, IL 60439 USA
[5] Graz Univ Technol, Rechbauerstr 12, A-8010 Graz, Austria
[6] TU Graz SAL Dependable Embedded Syst Lab, Silicon Austria Labs, A-8010 Graz, Austria
[7] Know Ctr GmbH, Sandgasse 36-4, A-8010 Graz, Austria
[8] Nanjing Univ Aeronaut & Astronaut, MIIT Key Lab Pattern Anal & Machine Intelligence, Nanjing 211106, Peoples R China
[9] Univ Pisa, Dept Comp Sci, Largo Bruno Pontecorvo 3, I-56127 Pisa, Italy
[10] Katholieke Univ Leuven, Dept Elect Engn, Kasteelpark Arenberg 10, B-3001 Leuven, Belgium
Funding
Research Foundation - Flanders (Belgium)
Keywords
Continual learning; Class-incremental learning; Repetition; Competition; Memory
DOI
10.1016/j.neunet.2024.106920
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Continual learning (CL) provides a framework for training models in ever-evolving environments. Although re-occurrence of previously seen objects or tasks is common in real-world problems, the concept of repetition in the data stream is not often considered in standard benchmarks for CL. Unlike with the rehearsal mechanism in buffer-based strategies, where sample repetition is controlled by the strategy, repetition in the data stream naturally stems from the environment. This report provides a summary of the CLVision challenge at CVPR 2023, which focused on the topic of repetition in class-incremental learning. The report initially outlines the challenge objective and then describes three solutions proposed by finalist teams that aim to effectively exploit the repetition in the stream to learn continually. The experimental results from the challenge highlight the effectiveness of ensemble-based solutions that employ multiple versions of similar modules, each trained on different but overlapping subsets of classes. This report underscores the transformative potential of taking a different perspective in CL by employing repetition in the data stream to foster innovative strategy design.
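The abstract contrasts repetition that arises naturally from the environment with the controlled sample repetition of a rehearsal buffer. As a rough illustration of the former, the sketch below generates a toy class-incremental stream in which previously introduced classes can re-occur in later experiences. All names and parameters here (make_stream_with_repetition, p_repeat, and so on) are illustrative assumptions, not the challenge's actual benchmark generator.

```python
import random


def make_stream_with_repetition(num_classes=10, num_experiences=20,
                                classes_per_exp=2, p_repeat=0.5, seed=0):
    """Build a toy class-incremental stream in which already-seen classes
    may re-occur in later experiences (repetition coming from the
    "environment"), rather than being replayed from a rehearsal buffer.

    Purely illustrative: function name and parameters are assumptions,
    not the CLVision challenge's stream generator.
    """
    rng = random.Random(seed)
    seen = []                            # classes introduced in earlier experiences
    unseen = list(range(num_classes))    # classes not yet introduced
    stream = []
    for _ in range(num_experiences):
        exp = set()
        while len(exp) < classes_per_exp:
            if seen and (not unseen or rng.random() < p_repeat):
                # Natural repetition: re-draw a class the learner has seen before.
                exp.add(rng.choice(seen))
            else:
                # First occurrence: introduce a brand-new class.
                c = unseen.pop(rng.randrange(len(unseen)))
                seen.append(c)
                exp.add(c)
        stream.append(sorted(exp))
    return stream


if __name__ == "__main__":
    for i, classes in enumerate(make_stream_with_repetition()):
        print(f"experience {i:2d}: classes {classes}")
```

Under a stream of this kind, an ensemble-style strategy such as those summarized in the report could, for example, keep one lightweight module per overlapping subset of classes and update whichever modules cover the classes that re-occur in the current experience; this is a hypothetical reading of the abstract, not a description of the finalists' implementations.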
Pages: 14