Energy-Efficient Multi-Agent Deep Reinforcement Learning Task Offloading and Resource Allocation for UAV Edge Computing

Times Cited: 0
Authors
Xu, Shu [1 ]
Liu, Qingjie [1 ]
Gong, Chengye [1 ]
Wen, Xupeng [1 ]
Affiliations
[1] China Nanhu Acad Elect & Informat Technol, Jiaxing 314001, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
energy-efficient; unmanned aerial vehicles; task offloading; resource allocation; deep reinforcement learning; multi-agent systems; DELAY OPTIMIZATION; MINIMIZATION; MANAGEMENT; MEC; NETWORKS;
DOI
10.3390/s25113403
CLC Number
O65 [Analytical Chemistry];
Discipline Codes
070302 ; 081704 ;
Abstract
The integration of Unmanned Aerial Vehicles (UAVs) into Mobile Edge Computing (MEC) systems has emerged as a transformative solution for latency-sensitive applications, leveraging UAVs' unique advantages in mobility, flexible deployment, and on-demand service provisioning. This paper proposes a novel multi-agent reinforcement learning framework, termed Multi-Agent Twin Delayed Deep Deterministic Policy Gradient for Task Offloading and Resource Allocation (MATD3-TORA), to optimize task offloading and resource allocation in UAV-assisted MEC networks. The framework enables collaborative decision-making among multiple UAVs to efficiently serve sparsely distributed ground mobile devices (MDs), and establishes an integrated mobility, communication, and computation-offloading model that formulates a joint optimization problem minimizing the weighted sum of task-processing latency and UAV energy consumption. Extensive experiments demonstrate that the algorithm improves system latency and energy efficiency over conventional approaches. The results highlight MATD3-TORA's effectiveness in addressing UAV-MEC challenges, including mobility-energy tradeoffs, distributed decision-making, and real-time resource allocation.
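The abstract's objective (a weighted sum of task latency and UAV energy) and the TD3 component of MATD3 can be illustrated with a minimal sketch. All function names, the weight `w`, and the numeric values below are illustrative assumptions, not details taken from the paper; the only ingredient drawn from TD3 itself is the twin-critic minimum used in the update target.

```python
# Hypothetical sketch of the weighted latency-energy cost and the
# TD3-style update target referenced in the abstract. Names and numbers
# are illustrative assumptions, not values from the paper.

def weighted_cost(latency: float, energy: float, w: float = 0.5) -> float:
    """Weighted sum of task-processing latency and UAV energy consumption
    (the form of the joint objective described in the abstract)."""
    return w * latency + (1.0 - w) * energy

def td3_target(reward: float, q1_next: float, q2_next: float,
               gamma: float = 0.99, done: bool = False) -> float:
    """TD3 critic target: take the minimum of the twin critics' next-state
    estimates to curb value overestimation (Twin Delayed DDPG)."""
    q_min = min(q1_next, q2_next)
    return reward + (0.0 if done else gamma * q_min)

# Each UAV agent would minimize the weighted cost, i.e. maximize -cost.
cost = weighted_cost(latency=0.2, energy=1.5, w=0.7)
target = td3_target(reward=-cost, q1_next=1.0, q2_next=0.8)
```

In a full MATD3 setup each UAV agent would hold its own actor and twin critics, with critics conditioned on the joint observations and actions of all agents (centralized training, decentralized execution); this sketch only shows the scalar arithmetic of the objective and target.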
Pages: 24
References
47 total
[1]   A Survey on Mobility of Edge Computing Networks in IoT: State-of-the-Art, Architectures, and Challenges [J].
Abkenar, Forough Shirin ;
Ramezani, Parisa ;
Iranmanesh, Saeid ;
Murali, Sarumathi ;
Chulerttiyawong, Donpiti ;
Wan, Xinyu ;
Jamalipour, Abbas ;
Raad, Raad .
IEEE COMMUNICATIONS SURVEYS AND TUTORIALS, 2022, 24 (04) :2329-2365
[2]   Deep-EERA: DRL-Based Energy-Efficient Resource Allocation in UAV-Empowered Beyond 5G Networks [J].
Ahmad, Shabeer ;
Zhang, Jinling ;
Nauman, Ali ;
Khan, Adil ;
Abbas, Khizar ;
Hayat, Babar .
TSINGHUA SCIENCE AND TECHNOLOGY, 2025, 30 (01) :418-432
[3]   Internet of Things: A Survey on Enabling Technologies, Protocols, and Applications [J].
Al-Fuqaha, Ala ;
Guizani, Mohsen ;
Mohammadi, Mehdi ;
Aledhari, Mohammed ;
Ayyash, Moussa .
IEEE COMMUNICATIONS SURVEYS AND TUTORIALS, 2015, 17 (04) :2347-2376
[4]   Delay-Optimal Task Offloading for UAV-Enabled Edge-Cloud Computing Systems [J].
Almutairi, Jaber ;
Aldossary, Mohammad ;
Alharbi, Hatem A. ;
Yosuf, Barzan A. ;
Elmirghani, Jaafar M. H. .
IEEE ACCESS, 2022, 10 :51575-51586
[5]  
Amin M.R., 2023, Int. J. Inf. Syst. Comput. Technol, V2, P44, DOI 10.58325/ijisct.002.02.0050
[6]   Deadline-aware and energy-efficient IoT task scheduling in fog computing systems: A semi-greedy approach [J].
Azizi, Sadoon ;
Shojafar, Mohammad ;
Abawajy, Jemal ;
Buyya, Rajkumar .
JOURNAL OF NETWORK AND COMPUTER APPLICATIONS, 2022, 201
[7]  
Cao X., 2018, P 2018 IEEE 19 INT W, P1
[8]   A game-based deep reinforcement learning approach for energy-efficient computation in MEC systems [J].
Chen, Miaojiang ;
Liu, Wei ;
Wang, Tian ;
Zhang, Shaobo ;
Liu, Anfeng .
KNOWLEDGE-BASED SYSTEMS, 2022, 235
[9]   Energy-Efficient Resource Allocation in Multi-UAV-Assisted Two-Stage Edge Computing for Beyond 5G Networks [J].
Ei, Nway Nway ;
Alsenwi, Madyan ;
Tun, Yan Kyaw ;
Han, Zhu ;
Hong, Choong Seon .
IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (09) :16421-16432
[10]   A Distributed Deep Reinforcement Learning Technique for Application Placement in Edge and Fog Computing Environments [J].
Goudarzi, Mohammad ;
Palaniswami, Marimuthu ;
Buyya, Rajkumar .
IEEE TRANSACTIONS ON MOBILE COMPUTING, 2023, 22 (05) :2491-2505