Instance-based meta-learning for conditionally dependent univariate multi-step forecasting

Cited by: 0
Authors
Cerqueira, Vitor [1 ]
Torgo, Luis [1 ]
Bontempi, Gianluca [2 ]
Affiliations
[1] Dalhousie Univ, Fac Comp Sci, 6050 Univ Ave, Halifax, NS B3H 1W5, Canada
[2] Univ Libre Bruxelles, Dept Informat, Machine Learning Grp, Brussels, Belgium
Keywords
Time series; Multi-step forecasting; Meta-learning; Gradient boosting; k-nearest neighbors
DOI
10.1016/j.ijforecast.2023.12.010
Chinese Library Classification
F [Economics];
Subject Classification Code
02;
Abstract
Multi-step prediction is a key challenge in univariate forecasting: accuracy decreases as predictions extend further into the future, owing to diminishing predictability and error propagation along the horizon. In this paper, we propose a novel method called Forecasted Trajectory Neighbors (FTN) for multi-step forecasting with univariate time series. FTN is a meta-learning strategy that can be integrated with any state-of-the-art multi-step forecasting approach. It uses the training observations to correct the errors made across the forecasting horizon: the nearest neighbors of the multi-step forecast are retrieved from the training data and averaged to form the final prediction. The motivation is to introduce, in a lightweight manner, a conditionally dependent constraint across the forecasting horizon. Such a constraint, often overlooked by standard strategies, acts as a form of regularization. We carried out extensive experiments using 7795 time series from different application domains and found that FTN improves the performance of several state-of-the-art multi-step forecasting methods. An implementation of the proposed method is publicly available online, and the experiments are reproducible. Crown Copyright (c) 2024 Published by Elsevier B.V. on behalf of the International Institute of Forecasters. All rights reserved.
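As an illustration of the FTN correction step described in the abstract, the following is a minimal sketch in Python using NumPy and scikit-learn. The function names ftn_correct and sliding_windows, the window construction, and the choice of k are illustrative assumptions for exposition, not the authors' published implementation.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors


def sliding_windows(series, h):
    """All windows of h consecutive observations from the training series."""
    return np.lib.stride_tricks.sliding_window_view(np.asarray(series), h)


def ftn_correct(forecast, train_series, h, k=5):
    """Correct an h-step forecast by averaging its k nearest training trajectories.

    forecast: array of shape (h,) produced by any base multi-step model.
    train_series: the univariate training series.
    """
    trajectories = sliding_windows(train_series, h)
    nn = NearestNeighbors(n_neighbors=k).fit(trajectories)
    _, idx = nn.kneighbors(np.asarray(forecast).reshape(1, -1))
    # Averaging the neighbouring training trajectories imposes a soft
    # conditional-dependence constraint across the horizon (a regularizer).
    return trajectories[idx[0]].mean(axis=0)


# Example: post-process a 12-step forecast from any base model (stand-in data).
rng = np.random.default_rng(0)
train = np.sin(np.arange(300) / 10) + rng.normal(scale=0.1, size=300)
base_forecast = np.sin(np.arange(300, 312) / 10)  # placeholder base forecast
corrected = ftn_correct(base_forecast, train, h=12, k=5)
```

In this sketch, the base forecast can come from any strategy (recursive, direct, or multi-output); FTN only post-processes it, so it can be layered on top of an existing forecasting pipeline.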
Pages: 1507-1520
Page count: 14