Generative Models from the perspective of Continual Learning

Cited by: 16
Authors
Lesort, Timothee [1,2,3]
Caselles-Dupre, Hugo [1,2,4]
Garcia-Ortiz, Michael [4]
Stoian, Andrei [3]
Filliat, David [1,2]
Affiliations
[1] ENSTA ParisTech, Flowers Team, Palaiseau, France
[2] INRIA, Paris, France
[3] Thales, Theresis Laboratory, La Defense, France
[4] Softbank Robot Europe, Paris, France
Source
2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019
DOI
10.1109/ijcnn.2019.8851986
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Which generative model is the most suitable for Continual Learning? This paper evaluates and compares generative models on disjoint sequential image generation tasks. We investigate how several models learn and forget under various strategies: rehearsal, regularization, generative replay and fine-tuning. We use two quantitative metrics to estimate generation quality and memory ability. We experiment with sequential tasks on three benchmarks commonly used for Continual Learning (MNIST, Fashion MNIST and CIFAR10). We find that, among all models, the original GAN performs best and that, among Continual Learning strategies, generative replay outperforms all other methods. Although we found satisfactory combinations on MNIST and Fashion MNIST, training generative models sequentially on CIFAR10 is particularly unstable and remains a challenge. Our code is available online.
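The generative replay strategy that the abstract reports as the strongest can be illustrated with a minimal sketch: before training on a new task, a frozen copy of the current generator produces pseudo-samples of the earlier tasks, and these are mixed with the new task's real data before retraining. The `GaussianGenerator` below is a hypothetical stand-in for the paper's GANs/VAEs (it just fits a diagonal Gaussian), not the authors' implementation; the training loop names are likewise illustrative.

```python
import numpy as np

class GaussianGenerator:
    """Toy stand-in for a GAN/VAE: fits a diagonal Gaussian and samples from it."""
    def fit(self, x):
        self.mean = x.mean(axis=0)
        self.std = x.std(axis=0) + 1e-8  # avoid zero scale on degenerate data
        return self
    def sample(self, n, rng):
        return rng.normal(self.mean, self.std, size=(n, self.mean.shape[0]))

def train_with_generative_replay(task_datasets, replay_ratio=1.0, seed=0):
    rng = np.random.default_rng(seed)
    generator = None
    for x_task in task_datasets:
        if generator is None:
            x_train = x_task
        else:
            # Generative replay: a frozen copy of the previous generator supplies
            # pseudo-samples of all earlier tasks, mixed with the current task's
            # real data before retraining.
            n_replay = int(replay_ratio * len(x_task))
            x_replay = generator.sample(n_replay, rng)
            x_train = np.vstack([x_task, x_replay])
        generator = GaussianGenerator().fit(x_train)
    return generator

# Two disjoint 1-D "tasks": all samples at -2, then all samples at +2.
tasks = [np.full((500, 1), -2.0), np.full((500, 1), 2.0)]
g = train_with_generative_replay(tasks)
# After task 2 the generator's mean sits between both tasks instead of
# collapsing onto the last one, which is what plain fine-tuning would do.
```

This captures why replay helps against catastrophic forgetting: the mixed dataset keeps the old tasks' statistics in play even though no real data from them is stored.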
Pages: 8
Related Papers (50 in total)
  • [41] Continual Learning with Diffusion-based Generative Replay for Industrial Streaming Data
    He, Jiayi; Chen, Jiao; Liu, Qianmiao; Dai, Suyan; Tang, Jianhua; Liu, Dongpo
    2024 IEEE/CIC INTERNATIONAL CONFERENCE ON COMMUNICATIONS IN CHINA, ICCC, 2024
  • [42] Looking Through the Past: Better Knowledge Retention for Generative Replay in Continual Learning
    Khan, Valeriya; Cygert, Sebastian; Deja, Kamil; Trzcinski, Tomasz; Twardowski, Bartlomiej
    IEEE ACCESS, 2024, 12 : 45309 - 45317
  • [43] Evaluating and Explaining Generative Adversarial Networks for Continual Learning under Concept Drift
    Guzy, Filip; Wozniak, Michal; Krawczyk, Bartosz
    21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS ICDMW 2021, 2021, : 295 - 303
  • [44] Recall-Oriented Continual Learning with Generative Adversarial Meta-Model
    Kang, Haneol; Choi, Dong-Wan
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 12, 2024, : 13040 - 13048
  • [45] Looking Through the Past: Better Knowledge Retention for Generative Replay in Continual Learning
    Khan, Valeriya; Cygert, Sebastian; Twardowski, Bartlomiej; Trzcinski, Tomasz
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS, ICCVW, 2023, : 3488 - 3492
  • [46] Analysis of Continual Learning Models for Intrusion Detection System
    Prasath, Sai; Sethi, Kamalakanta; Mohanty, Dinesh; Bera, Padmalochan; Samantaray, Subhransu Ranjan
    IEEE ACCESS, 2022, 10 : 121444 - 121464
  • [47] Explicit Disentanglement of Appearance and Perspective in Generative Models
    Detlefsen, Nicki S.; Hauberg, Soren
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [48] Continual Learning with Pre-Trained Models: A Survey
    Zhou, Da-Wei; Sun, Hai-Long; Ning, Jingyi; Ye, Han-Jia; Zhan, De-Chuan
    PROCEEDINGS OF THE THIRTY-THIRD INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2024, 2024, : 8363 - 8371
  • [49] Continual Learning for Multi-Dialect Acoustic Models
    Houston, Brady; Kirchhoff, Katrin
    INTERSPEECH 2020, 2020, : 576 - 580
  • [50] Generative chemistry: drug discovery with deep learning generative models
    Bian, Yuemin; Xie, Xiang-Qun
    JOURNAL OF MOLECULAR MODELING, 2021, 27