Continual learning in the presence of repetition

Cited: 0
Authors
Hemati, Hamed [1 ]
Pellegrini, Lorenzo [2 ]
Duan, Xiaotian [3 ,4 ]
Zhao, Zixuan [3 ,4 ]
Xia, Fangfang [3 ,4 ]
Masana, Marc [5 ,6 ]
Tscheschner, Benedikt [5 ,7 ]
Veas, Eduardo [5 ,7 ]
Zheng, Yuxiang [8 ]
Zhao, Shiji [8 ]
Li, Shao-Yuan [8 ]
Huang, Sheng-Jun [8 ]
Lomonaco, Vincenzo [9 ]
van de Ven, Gido M. [10 ]
Affiliations
[1] Univ St Gallen, Inst Comp Sci, Rosenbergstr 30, CH-9000 St Gallen, Switzerland
[2] Univ Bologna, Dept Comp Sci, Via Univ 50, I-47521 Cesena, Italy
[3] Univ Chicago, 5801 S Ellis Ave, Chicago, IL 60637 USA
[4] Argonne Natl Lab, 9700 S Cass Ave, Lemont, IL 60439 USA
[5] Graz Univ Technol, Rechbauerstr 12, A-8010 Graz, Austria
[6] TU Graz SAL Dependable Embedded Syst Lab, Silicon Austria Labs, A-8010 Graz, Austria
[7] Know Ctr GmbH, Sandgasse 36-4, A-8010 Graz, Austria
[8] Nanjing Univ Aeronaut & Astronaut, MIIT Key Lab Pattern Anal & Machine Intelligence, Nanjing 211106, Peoples R China
[9] Univ Pisa, Dept Comp Sci, Largo Bruno Pontecorvo 3, I-56127 Pisa, Italy
[10] Katholieke Univ Leuven, Dept Elect Engn, Kasteelpark Arenberg 10, B-3001 Leuven, Belgium
Funding
Research Foundation Flanders (FWO), Belgium;
Keywords
Continual learning; Class-incremental learning; Repetition; Competition; MEMORY;
DOI
10.1016/j.neunet.2024.106920
CLC Classification
TP18 [Artificial intelligence theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Continual learning (CL) provides a framework for training models in ever-evolving environments. Although the re-occurrence of previously seen objects or tasks is common in real-world problems, the concept of repetition in the data stream is rarely considered in standard benchmarks for CL. Unlike the rehearsal mechanism in buffer-based strategies, where sample repetition is controlled by the strategy, repetition in the data stream stems naturally from the environment. This report summarizes the CLVision challenge at CVPR 2023, which focused on the topic of repetition in class-incremental learning. The report first outlines the challenge objective and then describes three solutions proposed by finalist teams that aim to effectively exploit repetition in the stream to learn continually. The experimental results from the challenge highlight the effectiveness of ensemble-based solutions that employ multiple versions of similar modules, each trained on different but overlapping subsets of classes. The report underscores the potential of taking a different perspective in CL, using repetition in the data stream to foster innovative strategy design.
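As a rough illustration of the setting described in the abstract, the sketch below builds a class-incremental stream in which already-seen classes re-occur in later experiences. This is not the challenge's official stream generator; the function name and all parameters (num_experiences, new_per_exp, repeat_prob) are illustrative assumptions.

```python
import random

# Minimal sketch (assumed names/parameters, not the official CLVision generator):
# build a class-incremental stream where repetition comes from the stream itself,
# i.e., previously seen classes may re-occur in later experiences.
def make_stream_with_repetition(num_classes=10, num_experiences=8,
                                new_per_exp=2, repeat_prob=0.3, seed=0):
    rng = random.Random(seed)
    unseen = list(range(num_classes))
    seen = []
    stream = []
    for _ in range(num_experiences):
        # Introduce a few genuinely new classes, if any remain.
        new = [unseen.pop(0) for _ in range(min(new_per_exp, len(unseen)))]
        # Repetition: each already-seen class may re-occur in this experience.
        repeats = [c for c in seen if rng.random() < repeat_prob]
        stream.append(sorted(new + repeats))
        seen.extend(new)
    return stream

if __name__ == "__main__":
    for i, classes in enumerate(make_stream_with_repetition()):
        print(f"experience {i}: classes {classes}")
```

The point of the sketch is only the contrast drawn in the abstract: here repetition originates from the environment (the stream), not from a replay buffer managed by the learning strategy.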
Pages: 14