Energy-Efficient Carrier Aggregation in 5G Using Constrained Multi-Agent MDP

Cited by: 0
Authors
Elsayed, Medhat [1 ,3 ]
Joda, Roghayeh [1 ,3 ]
Khoramnejad, Fahime [2 ]
Chan, David [1 ]
Sediq, Akram Bin [3 ]
Boudreau, Gary [3 ]
Erol-Kantarci, Melike [2 ]
Affiliations
[1] Univ Ottawa, Sch Elect Engn & Comp Sci, Ottawa, ON K1N 6N5, Canada
[2] Univ Manitoba, Dept Elect & Comp Engn, Winnipeg, MB R3T 2N2, Canada
[3] Ericsson Canada, Dept Res & Dev, Ottawa, ON K2K 2V6, Canada
Source
IEEE TRANSACTIONS ON GREEN COMMUNICATIONS AND NETWORKING | 2024, Vol. 8, No. 4
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
Throughput; Power demand; 5G mobile communication; Energy consumption; Q-learning; Heuristic algorithms; Energy efficiency; 5G; carrier activation/deactivation; component carrier manager; traffic splitting; reinforcement learning; constrained multi-agent MDP; RADIO RESOURCE-MANAGEMENT; DESIGN;
DOI
10.1109/TGCN.2024.3386066
CLC Number
TN [Electronic Technology, Communication Technology];
Subject Classification Code
0809;
Abstract
Carrier Aggregation (CA) is a promising technology in LTE and 5G networks that enhances user throughput. However, because each User Equipment (UE) must continuously monitor its activated Component Carriers (CCs) under CA, UE energy consumption increases. To reduce energy consumption while maximizing UE throughput, we propose a dynamic and proactive CC management scheme for 5G based on Q-Learning. We first formulate the problem as a Constrained Multi-agent Markov Decision Process (CMMDP) and then solve it with a Q-Learning algorithm. The inter-arrival time and size of the next incoming data burst are proactively predicted and, together with the data remaining in the buffer, are incorporated into the state space and the reward function of the learning model. The proposed scheme is compared against three baselines. In the first and second baselines, all CCs or only a single CC, respectively, are activated for each UE. The third baseline is a simplified version of our Reinforcement Learning (RL) algorithm that ignores the data remaining in the users' scheduling buffers and balances throughput against the number of activated CCs only at low traffic load. Simulation results show that the proposed Q-Learning algorithm outperforms all baselines: it achieves the same throughput as the all-CC-activation baseline while reducing UE power consumption by about 20%. These gains come from dynamically activating and deactivating CCs according to each UE's traffic pattern.
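The abstract outlines a per-UE Q-Learning agent whose state combines the predicted inter-arrival time and size of the next burst with the buffered data, and whose reward trades throughput against the number of activated CCs. The sketch below is a minimal illustration of that idea, not the authors' implementation: the state discretization, reward weights, and the simplistic traffic and per-CC throughput model are assumptions made only for this example.

```python
import random
from collections import defaultdict

# Hypothetical constants for the sketch; the paper's actual parameters differ.
NUM_CCS = 4            # component carriers available to one UE
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1
ENERGY_WEIGHT = 0.2    # penalty per activated CC in the reward

def discretize(buffer_bits, pred_burst_bits, pred_interarrival_ms):
    """Coarse state: (buffer level, predicted burst size, predicted inter-arrival)."""
    return (min(buffer_bits // 50_000, 3),
            min(pred_burst_bits // 50_000, 3),
            min(int(pred_interarrival_ms // 10), 3))

class CCAgent:
    """One Q-Learning agent per UE choosing how many CCs to keep activated."""
    def __init__(self):
        # Q-table over (state, action); actions = number of active CCs (1..NUM_CCS)
        self.q = defaultdict(float)

    def act(self, state):
        if random.random() < EPS:                      # epsilon-greedy exploration
            return random.randint(1, NUM_CCS)
        return max(range(1, NUM_CCS + 1), key=lambda a: self.q[(state, a)])

    def update(self, state, action, reward, next_state):
        best_next = max(self.q[(next_state, a)] for a in range(1, NUM_CCS + 1))
        td_target = reward + GAMMA * best_next
        self.q[(state, action)] += ALPHA * (td_target - self.q[(state, action)])

def reward(served_bits, active_ccs):
    """Throughput gain penalized by the energy cost of keeping CCs active."""
    return served_bits / 1e6 - ENERGY_WEIGHT * active_ccs

# Toy interaction loop with an assumed traffic model and per-CC capacity.
agent = CCAgent()
buffer_bits = 0
for step in range(5_000):
    pred_burst = random.choice([0, 100_000, 200_000])   # predicted next burst size (bits)
    pred_gap = random.choice([5, 20, 50])                # predicted inter-arrival (ms)
    state = discretize(buffer_bits, pred_burst, pred_gap)
    ccs = agent.act(state)
    served = min(buffer_bits, ccs * 60_000)              # crude per-CC capacity model
    buffer_bits = buffer_bits - served + pred_burst
    next_state = discretize(buffer_bits, pred_burst, pred_gap)
    agent.update(state, ccs, reward(served, ccs), next_state)
```

In this toy version the energy constraint is folded into the reward as a fixed per-CC penalty, whereas the paper formulates it as a constrained multi-agent MDP; the sketch is only meant to show how predicted traffic and buffer occupancy can drive CC activation decisions.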
Pages: 1595-1606 (12 pages)