Energy-Efficient Ultra-Dense Network With Deep Reinforcement Learning

Cited by: 32
Authors
Ju, Hyungyu [1 ,2 ]
Kim, Seungnyun [1 ,2 ]
Kim, Youngjoon [3 ]
Shim, Byonghyo [1 ,2 ]
Affiliations
[1] Seoul Natl Univ, Inst New Media & Commun, Seoul 08826, South Korea
[2] Seoul Natl Univ, Dept Elect & Comp Engn, Seoul 08826, South Korea
[3] Samsung Res, Adv Commun Res Ctr, Seoul 06765, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Energy consumption; Power demand; Fading channels; Ultra-dense networks; Throughput; Reinforcement learning; Downlink; Wireless resource management; deep reinforcement learning (DRL); ultra-dense networks (UDNs); energy efficiency;
DOI
10.1109/TWC.2022.3150425
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline classification code
0808; 0809;
Abstract
With the explosive growth in mobile data traffic, the ultra-dense network (UDN), in which a large number of small cells are densely deployed on top of macro cells, has received a great deal of attention in recent years. While UDN offers a number of benefits, the upsurge of energy consumption caused by the intensive deployment of small cells has become a major bottleneck in achieving the primary goal of 5G+ and 6G, viz., a 100-fold increase in throughput. In recent years, an approach that reduces the energy consumption of base stations (BSs) by selectively turning off lightly loaded BSs, referred to as the sleep mode technique, has been suggested. However, determining the appropriate active/sleep modes of the BSs is a difficult task due to the huge computational overhead and the inefficiency caused by frequent BS mode conversion. The aim of this paper is to propose a deep reinforcement learning (DRL)-based approach to reduce the energy consumption of UDN. A key ingredient of the proposed scheme is the use of a decision selection network to reduce the size of the action space. Numerical results show that the proposed scheme can significantly reduce the energy consumption of UDN while satisfying the rate requirement of the network.
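The abstract's central idea is that a decision selection network narrows the BS on/off action space before the DRL agent chooses a sleep pattern. The snippet below is a minimal illustrative sketch of that action-space reduction only, not the authors' implementation: the scoring heuristic, the toy values, and the names N_BS, K, and selection_network are assumptions introduced purely to show how pre-selecting K candidate BSs shrinks the search from 2^N to 2^K on/off patterns.

```python
import numpy as np

# Illustrative sketch (hypothetical, not the paper's code): a DRL agent must
# choose an active/sleep pattern for N small-cell BSs. The naive action space
# has 2**N combinations; a "decision selection network" first scores each BS,
# and only the top-K sleep candidates are kept, giving 2**K actions instead.

rng = np.random.default_rng(0)

N_BS = 16   # number of small-cell BSs (assumed toy value)
K = 4       # candidates kept by the selection network (assumed toy value)

def selection_network(load, channel_gain):
    """Hypothetical stand-in for the decision selection network:
    score each BS by how attractive it is to switch off
    (lower load and weaker channel contribution -> higher score)."""
    return -(load + channel_gain)

def reduced_action_space(scores, k):
    """Enumerate on/off patterns over only the k highest-scoring BSs."""
    candidates = np.argsort(scores)[-k:]      # indices of the k best sleep candidates
    actions = []
    for mask in range(2 ** k):                # 2**k patterns instead of 2**N_BS
        mode = np.ones(N_BS, dtype=int)       # 1 = active, 0 = sleep
        for bit, bs in enumerate(candidates):
            if (mask >> bit) & 1:
                mode[bs] = 0
        actions.append(mode)
    return actions

load = rng.uniform(0.0, 1.0, N_BS)            # per-BS traffic load (toy values)
gain = rng.uniform(0.0, 1.0, N_BS)            # per-BS channel quality (toy values)

actions = reduced_action_space(selection_network(load, gain), K)
print(f"full action space: {2 ** N_BS}, reduced: {len(actions)}")
```

In a full DRL pipeline, the reduced action set would feed a value or policy network that weighs the energy saved against the network's rate requirement; the sketch stops at the action-space reduction the abstract highlights.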
Pages: 6539-6552
Number of pages: 14