WorkloadDiff: Conditional Denoising Diffusion Probabilistic Models for Cloud Workload Prediction

Cited by: 2
Authors
Zheng, Weiping [1 ]
Chen, Zongxiao [1 ]
Zheng, Kaiyuan [1 ]
Zheng, Weijian [1 ]
Chen, Yiqi [1 ]
Fan, Xiaomao [2 ]
Affiliations
[1] South China Normal Univ, Sch Comp Sci, Guangzhou 510630, Peoples R China
[2] Shenzhen Technol Univ, Coll Big Data & Internet, Shenzhen 518122, Peoples R China
Funding
National Natural Science Foundation of China; National Key R&D Program of China;
Keywords
Predictive models; Cloud computing; Diffusion models; Time series analysis; Data models; Hidden Markov models; Forecasting; Cloud workload prediction; diffusion models; resource management; resampling; ARIMA;
DOI
10.1109/TCC.2024.3461649
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
Accurate workload forecasting plays a crucial role in optimizing resource allocation, enhancing performance, and reducing energy consumption in cloud data centers. Deep learning-based methods have emerged as the dominant approach in this field, exhibiting exceptional performance. However, most existing methods lack the ability to quantify confidence, limiting their practical decision-making utility. To address this limitation, we propose a novel denoising diffusion probabilistic model (DDPM)-based method, termed WorkloadDiff, for multivariate probabilistic workload prediction. WorkloadDiff leverages both original and noisy signals from input conditions using a two-path neural network. Additionally, we introduce a multi-scale feature extraction method and an adaptive fusion approach to capture diverse temporal patterns within the workload. To enhance consistency between conditions and predicted values, we incorporate a resampling strategy into the inference of WorkloadDiff. Extensive experiments conducted on four public datasets demonstrate the superior performance of WorkloadDiff over all baseline models, establishing it as a robust tool for resource management in cloud data centers.
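The abstract notes that inference incorporates a resampling strategy (in the style of RePaint, cited as reference [24] below) to keep the predicted values consistent with the conditioning window. A minimal sketch of that idea, assuming a toy linear noise schedule and a placeholder noise predictor; `eps_model`, `repaint_infer`, and all parameter values here are illustrative and not the paper's actual WorkloadDiff implementation, which uses a trained two-path network:

```python
import numpy as np

T = 50                                   # number of diffusion steps (illustrative)
betas = np.linspace(1e-4, 0.1, T)        # linear noise schedule
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)

def q_sample(x0, t, rng):
    """Forward process: diffuse clean data x0 to noise level t."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps

def eps_model(x, t):
    """Placeholder noise predictor (a trained denoising network in practice)."""
    return np.zeros_like(x)

def p_sample(x, t, rng):
    """One reverse (denoising) step from x_t to x_{t-1}."""
    eps_hat = eps_model(x, t)
    mean = (x - betas[t] / np.sqrt(1.0 - alpha_bar[t]) * eps_hat) / np.sqrt(alphas[t])
    if t == 0:
        return mean
    return mean + np.sqrt(betas[t]) * rng.standard_normal(x.shape)

def repaint_infer(cond, mask, n_resample=3, rng=None):
    """mask==1 marks observed (condition) entries; mask==0 marks the forecast horizon."""
    rng = rng or np.random.default_rng(0)
    x = rng.standard_normal(cond.shape)              # start from pure noise
    for t in range(T - 1, -1, -1):
        for u in range(n_resample):
            x_unknown = p_sample(x, t, rng)          # denoise the unknown part to level t-1
            # diffuse the observed condition to the same noise level
            x_known = q_sample(cond, t - 1, rng) if t > 0 else cond
            x = mask * x_known + (1 - mask) * x_unknown
            if t == 0:
                break
            if u < n_resample - 1:
                # re-noise one step back to level t, then resample (RePaint loop):
                # this lets the unknown region re-harmonize with the condition
                x = np.sqrt(alphas[t]) * x + np.sqrt(betas[t]) * rng.standard_normal(x.shape)
    return x

history = np.sin(np.linspace(0, 4, 24))              # observed workload window (toy data)
mask = np.concatenate([np.ones(16), np.zeros(8)])    # last 8 points are the forecast horizon
sample = repaint_infer(history, mask)
```

The key design point is that each reverse step overwrites the observed entries with a forward-diffused copy of the ground-truth condition, while the re-noising between resample iterations gives the model repeated chances to make the generated horizon coherent with that condition.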
Pages: 1291-1304 (14 pages)
References (51 total)
[21] Kong, Zhifeng. 2021. Proceedings of ICLR.
[22] Lai, Guokun; Chang, Wei-Cheng; Yang, Yiming; Liu, Hanxiao. Modeling Long- and Short-Term Temporal Patterns with Deep Neural Networks. Proceedings of ACM SIGIR 2018, pp. 95-104.
[23] Liu, S. T. 2019. arXiv:1911.09516.
[24] Lugmayr, Andreas; Danelljan, Martin; Romero, Andres; Yu, Fisher; Timofte, Radu; Van Gool, Luc. RePaint: Inpainting Using Denoising Diffusion Probabilistic Models. Proceedings of IEEE/CVF CVPR 2022, pp. 11451-11461.
[25] Luo, Shutian; Xu, Huanle; Lu, Chengzhi; Ye, Kejiang; Xu, Guoyao; Zhang, Liping; Ding, Yu; He, Jian; Xu, Chengzhong. Characterizing Microservice Dependency and Performance: Alibaba Trace Analysis. Proceedings of the 2021 ACM Symposium on Cloud Computing (SoCC '21), pp. 412-426.
[26] Masdari, Mohammad; Khezri, Hemn. Efficient VM Migrations Using Forecasting Techniques in Cloud Computing: A Comprehensive Review. Cluster Computing, 2020, 23(4): 2629-2658.
[27] Masdari, Mohammad; Khoshnevis, Afsane. A Survey and Classification of the Workload Forecasting Methods in Cloud Computing. Cluster Computing, 2020, 23(4): 2399-2424.
[28] Oreshkin, Boris N. 2020. Proceedings of ICLR. DOI: 10.48550/arXiv.1905.10437.
[29] Rasul, K. 2021. Proceedings of Machine Learning Research, vol. 139.
[30] Reiss, C. 2011. Google cluster usage.