Host load prediction in cloud computing with Discrete Wavelet Transformation (DWT) and Bidirectional Gated Recurrent Unit (BiGRU) network

Cited by: 30
Authors
Dogani, Javad [1 ]
Khunjush, Farshad [1 ]
Seydali, Mehdi [1 ]
Affiliation
[1] Shiraz Univ, Sch Elect & Comp Engn, Dept Comp Sci & Engn & IT, Mollasadara St, Shiraz 7134851154, Iran
Keywords
Cloud computing; Host load prediction; Deep learning; Discrete Wavelet Transformation (DWT); Bidirectional Gated-Recurrent Unit (BiGRU); RESOURCE-ALLOCATION; NEURAL-NETWORK; WORKLOAD; OPTIMIZATION; ALGORITHM; MODEL;
DOI
10.1016/j.comcom.2022.11.018
Chinese Library Classification (CLC)
TP [automation technology, computer technology];
Discipline code
0812 ;
Abstract
Providing pay-as-you-go storage and computing services has contributed to the widespread adoption of cloud computing. Using virtualization technology, cloud service providers can execute several instances on a single physical server, maximizing resource utilization. A challenging issue in cloud data centers is that available resources are rarely fully utilized: server utilization is poor and often below 30%. Accurate host workload prediction enhances resource allocation, resulting in more efficient resource utilization. Recently, numerous deep-learning-based methods for predicting cloud computing workload have been developed. An efficient strategy must capture long-term dependencies in nonstationary host workload data and be quick enough to respond to incoming requests. This study employs a Bidirectional Gated Recurrent Unit (BiGRU), Discrete Wavelet Transformation (DWT), and an attention mechanism to improve host load prediction accuracy. DWT decomposes the input data into sub-bands of different frequencies and extracts patterns from nonlinear and nonstationary data in order to improve prediction accuracy. The extracted features are fed into the BiGRU to predict future workload, and the attention mechanism extracts temporal correlation features. This hybrid model was evaluated on cluster datasets from Google and Alibaba. Experimental results reveal that our method improves prediction accuracy by 3% to 56% compared to a variety of state-of-the-art methods.
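The decompose-then-predict pipeline described in the abstract can be sketched with a minimal one-level and multi-level Haar DWT in plain Python. This is an illustrative sketch only: the function names and the toy CPU-load trace are assumptions, not from the paper, and the BiGRU/attention prediction stage is indicated only in comments.

```python
import math

def haar_dwt(signal):
    """One-level Haar DWT: split a series into a low-frequency
    approximation sub-band and a high-frequency detail sub-band."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse one-level Haar DWT (perfect reconstruction)."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / s)
        out.append((a - d) / s)
    return out

def decompose(signal, levels):
    """Multi-level decomposition: recursively split the approximation.
    Returns [detail_1, ..., detail_L, approx_L]; in the paper's scheme
    each sub-band would then be fed to a BiGRU-based predictor."""
    bands = []
    current = list(signal)
    for _ in range(levels):
        current, detail = haar_dwt(current)
        bands.append(detail)
    bands.append(current)
    return bands

# Hypothetical host CPU-load trace (trend plus fluctuation).
load = [0.30, 0.34, 0.31, 0.40, 0.55, 0.52, 0.60, 0.58]
bands = decompose(load, levels=2)
print([len(b) for b in bands])  # sub-band lengths: [4, 2, 2]
```

Separating the slow-moving approximation from the high-frequency detail is what lets the downstream recurrent predictor model the nonstationary trend and the bursty fluctuations with different dynamics.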
Pages: 157-174
Page count: 18