CLeaR: An adaptive continual learning framework for regression tasks

Cited by: 5
Authors
Yujiang He
Bernhard Sick
Affiliation
[1] University of Kassel, Intelligent Embedded Systems (IES) Group
Source
AI Perspectives, Volume 3, Issue 1
Keywords
Continual learning; Renewable energy forecasts; Regression tasks; Deep neural networks
DOI
10.1186/s42467-021-00009-8
Abstract
Catastrophic forgetting means that a trained neural network gradually forgets previously learned tasks when it is retrained on new tasks. Overcoming this forgetting is a major challenge in machine learning. Numerous continual learning algorithms have been very successful at incremental learning of classification tasks, where new samples and their labels appear frequently. However, to the best of our knowledge, no existing research addresses the catastrophic forgetting problem in regression tasks. This problem has emerged as one of the primary constraints in applications such as renewable energy forecasting. This article clarifies the problem-related definitions and proposes a new methodological framework that can forecast targets and update itself by means of continual learning. The framework consists of forecasting neural networks and buffers that store newly collected data from a non-stationary data stream. Once the framework identifies a change in the probability distribution of the data stream, it learns the changed distribution sequentially. The framework is called CLeaR (Continual Learning for Regression Tasks), and its components can be flexibly customized for a specific application scenario. We design two sets of experiments to evaluate the CLeaR framework with respect to fitting error (training), prediction error (test), and forgetting ratio. The first is based on an artificial time series and explores how hyperparameters affect the CLeaR framework. The second uses data collected from European wind farms to evaluate the framework's performance in a real-world application. The experimental results demonstrate that the CLeaR framework can continually acquire knowledge from the data stream and improve prediction accuracy. The article concludes with further research issues arising from requirements to extend the framework.
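Based only on the description in the abstract, the following is a minimal sketch of the update loop attributed to CLeaR: forecasting incoming samples, buffering newly collected data, flagging a changed probability distribution, and then updating the network sequentially. The class and method names (CLeaRBuffer, run_stream, model.predict, model.partial_fit), the error-based drift test, and the threshold value are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a CLeaR-style stream loop, assuming a regressor that exposes
# predict() and partial_fit(); all names and the drift test are hypothetical.
import numpy as np
from collections import deque


class CLeaRBuffer:
    """Holds newly collected samples from the non-stationary data stream."""

    def __init__(self, capacity=256, drift_threshold=0.1):
        self.capacity = capacity
        self.drift_threshold = drift_threshold
        self.samples = deque(maxlen=capacity)

    def add(self, x, y):
        self.samples.append((x, y))

    def full(self):
        return len(self.samples) == self.capacity

    def drift_detected(self, model):
        # Assumption: a distribution change is flagged when the model's mean
        # squared error on the buffered data exceeds a fixed threshold.
        xs = np.stack([x for x, _ in self.samples])
        ys = np.stack([y for _, y in self.samples])
        mse = float(np.mean((model.predict(xs) - ys) ** 2))
        return mse > self.drift_threshold


def run_stream(model, stream, buffer):
    """Forecast each incoming sample, buffer it, and trigger a continual
    update whenever the buffered data indicate a changed distribution."""
    for x, y in stream:
        prediction = model.predict(x[None, :])  # forecast before the label is used
        buffer.add(x, y)
        if buffer.full() and buffer.drift_detected(model):
            xs = np.stack([x for x, _ in buffer.samples])
            ys = np.stack([y for _, y in buffer.samples])
            # A regularization-based continual-learning step (e.g. an
            # EWC-style penalty) would go here so that previously learned
            # tasks are not forgotten during the update.
            model.partial_fit(xs, ys)
            buffer.samples.clear()
        yield prediction
```

Any regressor wrapped to expose predict and partial_fit, for example a small feed-forward network, could be plugged into run_stream under these assumptions.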