Forecasting performance of machine learning, time series, and hybrid methods for low- and high-frequency time series

Cited by: 2
Authors
Ozdemir, Ozancan [1 ]
Yozgatligil, Ceylan [1 ]
Affiliations
[1] Middle East Tech Univ, Dept Stat, TR-06800 Ankara, Turkiye
Keywords
forecasting; hybrid method; machine learning; time series analysis;
DOI
10.1111/stan.12326
Chinese Library Classification (CLC)
O21 [Probability theory and mathematical statistics]; C8 [Statistics];
Discipline codes
020208 ; 070103 ; 0714 ;
Abstract
Forecasting is one of the main objectives of time series analysis, and both machine learning and statistical methods have been proposed for this task in the literature. In this study, we compare the forecasting performance of several of these approaches. In addition to traditional forecasting methods (the Naive and Seasonal Naive methods, S/ARIMA, Exponential Smoothing, TBATS, Bayesian Exponential Smoothing models with trend modifications, and STL decomposition), forecasts are obtained using seven machine learning methods (Random Forest, Support Vector Regression, XGBoost, BNN, RNN, LSTM, and FFNN) as well as hybrids of the statistical and machine learning methods. The data set is selected proportionally from the various time frequencies in the M4 Competition data set. We thereby aim to create a forecasting guide that considers different preprocessing approaches, methods, and data sets spanning several time frequencies. After the experiment, the performance and impact of all methods are discussed. The results show that the best-performing models are mostly machine learning methods, and that forecasting performance is affected by both the time frequency of the series and the forecast horizon. Lastly, the study suggests that a hybrid approach is not always the best model for forecasting. The study thus provides guidelines on which methods perform better at different time series frequencies.
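As a minimal illustration of the kind of comparison the abstract describes, the sketch below implements the Seasonal Naive baseline and the sMAPE accuracy measure used in the M4 competition. The series values and the seasonal period are toy assumptions for illustration only, not data from the paper.

```python
# Hedged sketch: Seasonal Naive baseline + sMAPE, one of the accuracy
# measures used in the M4 competition. Toy data, not from the study.

def seasonal_naive(history, period, horizon):
    """Forecast by repeating the last observed seasonal cycle."""
    last_cycle = history[-period:]
    return [last_cycle[h % period] for h in range(horizon)]

def smape(actual, forecast):
    """Symmetric mean absolute percentage error, in percent."""
    terms = [
        200.0 * abs(f - a) / (abs(a) + abs(f))
        for a, f in zip(actual, forecast)
        if (abs(a) + abs(f)) > 0
    ]
    return sum(terms) / len(terms)

# Toy series with an assumed seasonal period of 4
history = [10, 20, 30, 40, 12, 22, 32, 42]
actual = [14, 24, 34, 44]

fc = seasonal_naive(history, period=4, horizon=4)
print(fc)                        # repeats the last cycle: [12, 22, 32, 42]
print(round(smape(actual, fc), 2))
```

Any competing method (S/ARIMA, Random Forest, a hybrid, and so on) would be scored against such a baseline with the same metric over the same forecast horizon.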
Pages: 441-474 (34 pages)