QoS-Aware Joint Offloading and Power Control Using Deep Reinforcement Learning in MEC

Cited: 2
|
Authors
Li, Xiang [1 ]
Chen, Yu [1 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Natl Engn Lab Mobile Network Technol, Beijing, Peoples R China
Source
2020 23RD INTERNATIONAL SYMPOSIUM ON WIRELESS PERSONAL MULTIMEDIA COMMUNICATIONS (WPMC 2020) | 2020
Funding
National Natural Science Foundation of China
Keywords
EFFECTIVE CAPACITY; MOBILE;
DOI
10.1109/wpmc50192.2020.9309513
CLC Number (Chinese Library Classification)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Mobile edge computing (MEC) relieves resource-constrained mobile devices of computation-intensive tasks. However, it is difficult to design a joint offloading and power control method that minimizes both the delay and the power consumption (including the local execution power and the transmission power). In this paper, we propose a two-step method to solve this problem. In the first step, we propose a QoS-driven offloading strategy to minimize the queueing delay. In the second step, we apply a deep deterministic policy gradient (DDPG) method for power control. Simulation results show that the proposed framework achieves lower overall delay and energy consumption than existing methods.
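As a rough illustration of the second step, the shape of a DDPG-style actor-critic loop for continuous power control is sketched below. This is a toy NumPy sketch only: linear function approximators stand in for the paper's deep networks, and the state definition (queue length, channel gain), reward weights, and hyperparameters are assumptions, not the authors' design.

```python
import numpy as np

# Toy DDPG-style power control sketch (illustrative assumptions only).
# State s = (queue length, channel gain); action a = transmit power in [0, p_max].
rng = np.random.default_rng(0)
p_max = 1.0

# Linear actor mu(s) = clip(w_a . s) and linear critic Q(s, a) = w_c . [s, a]
# stand in for the deep actor and critic networks used in DDPG.
w_a = rng.normal(scale=0.1, size=2)
w_c = rng.normal(scale=0.1, size=3)

def actor(s):
    return float(np.clip(w_a @ s, 0.0, p_max))

def critic(s, a):
    return float(w_c @ np.append(s, a))

def reward(s, a):
    queue, gain = s
    delay = queue / (1e-3 + gain * a)  # service rate grows with transmit power
    energy = a                         # transmission energy cost
    return -(delay + 0.5 * energy)     # trade off delay against energy

alpha_a, alpha_c, gamma = 1e-3, 1e-2, 0.9

for step in range(2000):
    s = rng.uniform(0.1, 1.0, size=2)
    a = float(np.clip(actor(s) + 0.1 * rng.normal(), 0.0, p_max))  # exploration noise
    r = reward(s, a)
    s2 = rng.uniform(0.1, 1.0, size=2)
    # Critic: one-step TD update toward r + gamma * Q(s', mu(s')).
    td = r + gamma * critic(s2, actor(s2)) - critic(s, a)
    w_c += alpha_c * td * np.append(s, a)
    # Actor: deterministic policy gradient, dQ/da * dmu/dtheta.
    dq_da = w_c[-1]
    w_a += alpha_a * dq_da * s

print("learned power at s=(0.5, 0.5):", actor(np.array([0.5, 0.5])))
```

The actor update follows the deterministic policy gradient theorem (chain rule through the critic's action input), which is the core idea DDPG scales up with neural networks, target networks, and a replay buffer.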
Pages: 6
Related Papers
50 records
  • [1] QoS-Aware Task Offloading in Fog Environment Using Multi-agent Deep Reinforcement Learning
    Vibha Jain
    Bijendra Kumar
    Journal of Network and Systems Management, 2023, 31
  • [2] QoS-Aware Task Offloading in Fog Environment Using Multi-agent Deep Reinforcement Learning
    Jain, Vibha
    Kumar, Bijendra
    JOURNAL OF NETWORK AND SYSTEMS MANAGEMENT, 2023, 31 (01)
  • [3] QoS-Aware Machine Learning Task Offloading and Power Control in Internet of Drones
    Yao, Jingjing
    Ansari, Nirwan
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (07) : 6100 - 6110
  • [4] QoS-Aware Scheduling in New Radio Using Deep Reinforcement Learning
    Stigenberg, Jakob
    Saxena, Vidit
    Tayamon, Soma
    Ghadimi, Euhanna
    2021 IEEE 32ND ANNUAL INTERNATIONAL SYMPOSIUM ON PERSONAL, INDOOR AND MOBILE RADIO COMMUNICATIONS (PIMRC), 2021,
  • [5] QoS-Aware Power Management with Deep Learning
    Zhou, Junxiu
    Liu, Xian
    Tao, Yangyang
    Yu, Shucheng
    2019 IFIP/IEEE SYMPOSIUM ON INTEGRATED NETWORK AND SERVICE MANAGEMENT (IM), 2019, : 289 - 294
  • [6] QOS-AWARE FLOW CONTROL FOR POWER-EFFICIENT DATA CENTER NETWORKS WITH DEEP REINFORCEMENT LEARNING
    Sun, Penghao
    Guo, Zehua
    Liu, Sen
    Lan, Julong
    Hu, Yuxiang
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 3552 - 3556
  • [7] μ-DDRL: A QoS-Aware Distributed Deep Reinforcement Learning Technique for Service Offloading in Fog Computing Environments
    Goudarzi M.
    Rodriguez M.A.
    Sarvi M.
    Buyya R.
    IEEE Transactions on Services Computing, 2024, 17 (01): : 47 - 59
  • [8] Joint Offloading, Communication and Collaborative Computation Using Deep Reinforcement Learning in MEC Networks
    Nie, Xuefang
    Chen, Xingbang
    Zhang, DingDing
    Zhou, Tianqing
    Zhang, Jiliang
    2023 IEEE/CIC International Conference on Communications in China, ICCC Workshops 2023, 2023,
  • [9] IQoR: An Intelligent QoS-aware Routing Mechanism with Deep Reinforcement Learning
    Cao, Yuanyuan
    Dai, Bin
    Mo, Yijun
    Xu, Yang
    PROCEEDINGS OF THE 2020 IEEE 45TH CONFERENCE ON LOCAL COMPUTER NETWORKS (LCN 2020), 2020, : 329 - 332
  • [10] Deep Adversarial Imitation Reinforcement Learning for QoS-Aware Cloud Job Scheduling
    Huang, Yifeng
    Cheng, Long
    Xue, Lianting
    Liu, Cong
    Li, Yuancheng
    Li, Jianbin
    Ward, Tomas
    IEEE SYSTEMS JOURNAL, 2022, 16 (03): : 4232 - 4242