Host load prediction in cloud computing with Discrete Wavelet Transformation (DWT) and Bidirectional Gated Recurrent Unit (BiGRU) network

Cited by: 35
Authors
Dogani, Javad [1 ]
Khunjush, Farshad [1 ]
Seydali, Mehdi [1 ]
Affiliations
[1] Shiraz Univ, Sch Elect & Comp Engn, Dept Comp Sci & Engn & IT, Mollasadara St, Shiraz 7134851154, Iran
Keywords
Cloud computing; Host load prediction; Deep learning; Discrete Wavelet Transformation (DWT); Bidirectional Gated Recurrent Unit (BiGRU); Resource allocation; Neural network; Workload; Optimization; Algorithm; Model
DOI
10.1016/j.comcom.2022.11.018
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Subject Classification Code
0812
Abstract
Providing pay-as-you-go storage and computing services has contributed to the widespread adoption of cloud computing. Using virtualization technology, cloud service providers can execute several instances on a single physical server, maximizing resource utilization. A challenging issue in cloud data centers is that available resources are rarely fully utilized; server utilization is poor, often below 30%. Accurate host workload prediction enhances resource allocation, resulting in more efficient resource utilization. Recently, numerous deep-learning-based methods for predicting cloud computing workload have been developed. An effective strategy must capture long-term dependencies in nonstationary host workload data and be quick enough to respond to incoming requests. This study employs a Bidirectional Gated Recurrent Unit (BiGRU), Discrete Wavelet Transformation (DWT), and an attention mechanism to improve host load prediction accuracy. DWT decomposes the input data into sub-bands of different frequencies and extracts patterns from nonlinear and nonstationary data, improving prediction accuracy. The extracted features are fed into the BiGRU to predict future workload, and the attention mechanism extracts temporal correlation features. The hybrid model was evaluated on cluster datasets from Google and Alibaba. Experimental results reveal that our method improves prediction accuracy by 3% to 56% compared with a variety of state-of-the-art methods.
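The following is a minimal, illustrative sketch (not the authors' published code) of the pipeline summarized in the abstract: DWT decomposes a host-load series into sub-band signals, which are fed to a bidirectional GRU whose hidden states are pooled over time by an attention layer before the final prediction. The use of PyWavelets and PyTorch, the db4 wavelet, the decomposition level, the window length, and all layer sizes are assumptions made for illustration only.

import numpy as np
import pywt
import torch
import torch.nn as nn

def dwt_subbands(series, wavelet="db4", level=2):
    # Decompose the 1-D load series, then reconstruct each sub-band at full
    # length so the bands can be stacked as per-time-step input channels.
    coeffs = pywt.wavedec(series, wavelet, level=level)
    bands = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        bands.append(pywt.waverec(kept, wavelet)[: len(series)])
    return np.stack(bands, axis=-1)            # shape: (T, level + 1)

class BiGRUAttention(nn.Module):
    def __init__(self, n_bands, hidden=64):
        super().__init__()
        self.bigru = nn.GRU(n_bands, hidden, batch_first=True, bidirectional=True)
        self.score = nn.Linear(2 * hidden, 1)  # attention score per time step
        self.out = nn.Linear(2 * hidden, 1)    # next-step host load

    def forward(self, x):                      # x: (batch, T, n_bands)
        h, _ = self.bigru(x)                   # (batch, T, 2 * hidden)
        w = torch.softmax(self.score(h), dim=1)
        context = (w * h).sum(dim=1)           # attention-weighted temporal pooling
        return self.out(context).squeeze(-1)

# Toy usage on a random stand-in for a Google/Alibaba utilization trace.
load = np.random.rand(512).astype(np.float32)
bands = dwt_subbands(load)                                    # (512, 3)
window = torch.tensor(bands[:32], dtype=torch.float32)[None]  # (1, 32, 3)
model = BiGRUAttention(n_bands=bands.shape[-1])
print(model(window))                                          # predicted next load value

Reconstructing each sub-band back to the full series length, rather than feeding variable-length coefficient vectors, keeps every frequency band aligned with the original time axis, so the BiGRU receives one channel per band at each time step.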
Pages: 157-174
Number of pages: 18