Transformer-Based Models for Probabilistic Time Series Forecasting with Explanatory Variables

Cited by: 0
Authors
Caetano, Ricardo [1 ]
Oliveira, Jose Manuel [2 ,3 ]
Ramos, Patricia [2 ,4 ]
Affiliations
[1] Polytech Porto, ISCAP, Rua Jaime Lopes Amorim S-N, P-4465004 Sao Mamede De Infesta, Portugal
[2] Inst Syst & Comp Engn Technol & Sci, Campus FEUP,Rua Dr Roberto Frias, P-4200465 Porto, Portugal
[3] Univ Porto, Fac Econ, Rua Dr Roberto Frias, P-4200464 Porto, Portugal
[4] Polytech Porto, CEOS PP, ISCAP, Rua Jaime Lopes Amorim S-N, P-4465004 Sao Mamede De Infesta, Portugal
Keywords
transformers; time series; probabilistic forecasting; retail; covariates; deep learning; data-driven decision making; sales; fashion
DOI
10.3390/math13050814
Chinese Library Classification
O1 [Mathematics]
Subject Classification
0701; 070101
Abstract
Accurate demand forecasting is essential for retail operations as it directly impacts supply chain efficiency, inventory management, and financial performance. However, forecasting retail time series presents significant challenges due to their irregular patterns, hierarchical structures, and strong dependence on external factors such as promotions, pricing strategies, and socio-economic conditions. This study evaluates the effectiveness of Transformer-based architectures, specifically the Vanilla Transformer, Informer, Autoformer, ETSformer, NSTransformer, and Reformer, for probabilistic time series forecasting in retail. A key focus is the integration of explanatory variables, such as calendar-related indicators, selling prices, and socio-economic factors, which play a crucial role in capturing demand fluctuations. This study assesses how incorporating these variables enhances forecast accuracy, addressing a research gap in the comprehensive evaluation of explanatory variables across multiple Transformer-based models. Empirical results, based on the M5 dataset, show that incorporating explanatory variables generally improves forecasting performance. Models leveraging these variables achieve up to a 12.4% reduction in Normalized Root Mean Squared Error (NRMSE) and a 2.9% improvement in Mean Absolute Scaled Error (MASE) compared to models that rely solely on past sales. Furthermore, probabilistic forecasting enhances decision making by quantifying uncertainty, providing more reliable demand predictions for risk management. These findings underscore the effectiveness of Transformer-based models in retail forecasting and emphasize the importance of integrating domain-specific explanatory variables to achieve more accurate, context-aware predictions in dynamic retail environments.
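To make the accuracy measures quoted in the abstract concrete, the sketch below shows one common way to compute NRMSE and MASE for a single series, together with the quantile (pinball) loss often used to score probabilistic forecasts. This is a minimal illustration, not the paper's evaluation code: the normalization chosen for NRMSE and the seasonal period m are assumptions, and the sales figures are made up.

```python
import numpy as np

def nrmse(y_true, y_pred):
    # RMSE scaled by the mean of the actuals (one common normalization;
    # the paper may normalize differently, e.g., by the range or std).
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse / np.mean(y_true)

def mase(y_true, y_pred, y_train, m=1):
    # MAE scaled by the in-sample MAE of the seasonal-naive forecast
    # with period m (m=1 is the plain naive forecast).
    mae = np.mean(np.abs(y_true - y_pred))
    scale = np.mean(np.abs(y_train[m:] - y_train[:-m]))
    return mae / scale

def pinball_loss(y_true, y_quantile, q):
    # Quantile (pinball) loss at level q; averaging it over several
    # quantile levels gives a simple score for probabilistic forecasts.
    diff = y_true - y_quantile
    return np.mean(np.maximum(q * diff, (q - 1) * diff))

# Hypothetical daily sales: 8 in-sample points, 3 held-out points.
y_train = np.array([12.0, 15.0, 14.0, 16.0, 18.0, 17.0, 20.0, 19.0])
y_true = np.array([21.0, 22.0, 20.0])
y_pred = np.array([20.0, 23.0, 19.0])   # point forecast (e.g., median)
y_p90 = np.array([25.0, 27.0, 24.0])    # forecast of the 90th percentile

print(f"NRMSE: {nrmse(y_true, y_pred):.3f}")
print(f"MASE : {mase(y_true, y_pred, y_train, m=1):.3f}")
print(f"Pinball(q=0.9): {pinball_loss(y_true, y_p90, 0.9):.3f}")
```

Lower is better for all three scores; MASE below 1 means the model beats the naive benchmark on the evaluation window.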
Pages: 29