Graph Deep Factors for Probabilistic Time-series Forecasting

Cited by: 3
Authors
Chen, Hongjie [1 ]
Rossi, Ryan A. [2 ]
Mahadik, Kanak [2 ]
Kim, Sungchul [2 ]
Eldardiry, Hoda [1 ]
Affiliations
[1] Virginia Tech, Blacksburg, VA 24061 USA
[2] Adobe Res, San Jose, CA USA
Keywords
Incremental online learning; graph neural network; time-series forecasting; workload prediction; neural network
DOI
10.1145/3543511
CLC Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Effective time-series forecasting methods are of significant importance for a broad spectrum of research problems. Deep probabilistic forecasting techniques have recently been proposed for modeling large collections of time-series. However, these techniques explicitly assume either complete independence (local model) or complete dependence (global model) between time-series in the collection. This corresponds to the two extreme cases where every time-series is disconnected from every other time-series, or conversely, every time-series is related to every other time-series, resulting in a completely connected graph. In this work, we propose a deep hybrid probabilistic graph-based forecasting framework called Graph Deep Factors (GraphDF) that goes beyond these two extremes by allowing nodes and their time-series to be connected to others in an arbitrary fashion. GraphDF is a hybrid forecasting framework that consists of a relational global and a relational local model. In particular, the relational global model learns complex non-linear time-series patterns globally using the structure of the graph to improve both forecasting accuracy and computational efficiency. Similarly, instead of modeling every time-series independently, the relational local model considers not only a node's individual time-series but also the time-series of nodes connected to it in the graph. Experiments demonstrate the effectiveness of the proposed deep hybrid graph-based forecasting model compared to state-of-the-art methods in terms of forecasting accuracy, runtime, and scalability. Our case study reveals that GraphDF can successfully generate cloud usage forecasts and opportunistically schedule workloads to increase cloud cluster utilization by 47.5% on average. Furthermore, we address the streaming setting common to many time-series forecasting applications, in which new observations arrive continuously; most methods fail to leverage these newly arriving values and degrade in performance over time. In this article, we propose an online incremental learning framework for probabilistic forecasting. We theoretically show that the framework has lower time and space complexity, and it can be applied universally to many other machine learning-based methods.
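To make the hybrid design concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of a GraphDF-style relational probabilistic forecaster in PyTorch. The class name RelationalHybridForecaster, the GRU encoder, the hidden size, and the row-normalized adjacency mixing are all illustrative assumptions; the sketch only shows the core idea that each node's probabilistic forecast (here, a Gaussian over the next horizon steps) depends on its own history and on the histories of the nodes it is connected to in the graph.

```python
# Illustrative sketch only: a relational forecaster in which each node's
# forecast distribution is conditioned on its own encoded history and on the
# encoded histories of its graph neighbors. Names and sizes are assumptions.
import torch
import torch.nn as nn

class RelationalHybridForecaster(nn.Module):
    def __init__(self, num_nodes, hidden=32, horizon=1):
        super().__init__()
        self.encoder = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.mix = nn.Linear(hidden, hidden)          # shared transform over neighbor-mixed states
        self.mu_head = nn.Linear(hidden, horizon)     # per-node forecast mean
        self.sigma_head = nn.Linear(hidden, horizon)  # per-node forecast scale

    def forward(self, x, adj):
        # x: (num_nodes, seq_len) past values; adj: (num_nodes, num_nodes) adjacency
        _, h = self.encoder(x.unsqueeze(-1))          # encode each node's history
        h = h.squeeze(0)                              # (num_nodes, hidden)
        # Relational step: average neighbor embeddings (row-normalized adjacency),
        # so a node's forecast depends only on nodes it is connected to.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        h = torch.relu(self.mix((adj @ h) / deg) + h)
        mu = self.mu_head(h)
        sigma = torch.nn.functional.softplus(self.sigma_head(h)) + 1e-6
        return torch.distributions.Normal(mu, sigma)  # probabilistic forecast per node

# Example: 4 connected nodes, 24 past steps, sample a 1-step-ahead forecast.
if __name__ == "__main__":
    model = RelationalHybridForecaster(num_nodes=4)
    x = torch.randn(4, 24)
    adj = torch.tensor([[0, 1, 0, 0],
                        [1, 0, 1, 0],
                        [0, 1, 0, 1],
                        [0, 0, 1, 0]], dtype=torch.float32)
    dist = model(x, adj)
    print(dist.sample().shape)  # torch.Size([4, 1])
```

In this sketch, the shared encoder and mixing layer loosely correspond to a relational global component (parameters shared across all nodes), while conditioning each node's output distribution on its own and its neighbors' encoded histories loosely mirrors the relational local idea described in the abstract.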
Pages: 30