Energy-efficient Incremental Offloading of Neural Network Computations in Mobile Edge Computing

Cited by: 2
Authors
Guo, Guangfeng [1 ,2 ]
Zhang, Junxing [1 ]
Affiliations
[1] Inner Mongolia Univ, Coll Comp Sci, Hohhot, Peoples R China
[2] Inner Mongolia Univ Sci & Technol, Baotou Teachers Coll, Baotou, Peoples R China
Source
2020 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM) | 2020
Funding
National Natural Science Foundation of China;
Keywords
Mobile Edge Computing; Deep Neural Network; Computation Offloading; Energy Efficient;
DOI
10.1109/GLOBECOM42002.2020.9322504
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Deep Neural Network (DNN) has shown remarkable success in Computer Vision and Augmented Reality. However, battery-powered devices still cannot afford to run state-of-the-art DNNs. Mobile Edge Computing (MEC) is a promising approach to run the DNNs on energy-constrained mobile devices. It uploads the DNN model partitions of the devices to the nearest edge servers on demand, and then offloads DNN computations to the servers to save the energy of the devices. Nevertheless, the existing all-at-once computation offloading faces two major challenges. The first one is how to find the most energy-efficient model partition scheme under different wireless network bandwidths in MEC. The second challenge is how to reduce the time and energy cost of the devices waiting for the servers, since uploading all DNN layers of the optimal partition often takes time. To meet these challenges, we propose the following solution. First, we build regression-based energy consumption prediction models by profiling the energy consumption of mobile devices under varied wireless network bandwidths. Then, we present an algorithm that finds the most energy-efficient DNN partition scheme based on the established prediction models and performs incremental computation offloading upon the completion of uploading each DNN partition. The experimental results show that our solution improves energy efficiency compared to the current all-at-once approach. At a bandwidth of 100 Mbps, when the model uploading takes 1/3 of the total uploading time, the proposed solution can reduce the energy consumption by around 40%.
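The abstract describes a two-step approach: fit regression models that predict device energy from data size and wireless bandwidth, then search for the DNN split point with the lowest predicted total energy and upload the remote partition incrementally. The paper's own algorithm, profiles, and coefficients are not reproduced here; the Python sketch below only illustrates the partition-selection step under assumed per-layer profiles (local_j, output_bytes, model_bytes) and made-up regression coefficients, all of which are hypothetical.

# Minimal sketch of energy-aware DNN partition selection.
# Everything below (profiles, coefficients, function names) is hypothetical,
# not the authors' implementation.

def transmission_energy(bytes_to_send, bandwidth_mbps, a=0.02, b=0.5):
    """Regression-style estimate of the device energy (J) spent uploading data.
    'a' (J per MB) and 'b' (J per second of radio-on time) stand in for
    coefficients that would be fitted by profiling at each bandwidth."""
    seconds = (bytes_to_send * 8) / (bandwidth_mbps * 1e6)
    return a * (bytes_to_send / 1e6) + b * seconds

def best_partition(layers, input_bytes, bandwidth_mbps):
    """Choose the split point k that minimizes estimated device energy when
    layers [0, k) run on the device and layers [k, n) are offloaded.
    Each layer dict carries hypothetical profiled values:
      'local_j'      - energy to compute the layer on the device (J)
      'output_bytes' - size of the layer's output activation (bytes)
      'model_bytes'  - size of the layer's weights to upload (bytes)"""
    n = len(layers)
    best_k = n
    best_j = sum(l["local_j"] for l in layers)  # all-local baseline
    for k in range(n):
        local_j = sum(l["local_j"] for l in layers[:k])
        # Data sent to the edge: the activation crossing the split point
        # plus the weights of the offloaded partition (uploaded on demand).
        activation = layers[k - 1]["output_bytes"] if k > 0 else input_bytes
        weights = sum(l["model_bytes"] for l in layers[k:])
        total_j = local_j + transmission_energy(activation + weights, bandwidth_mbps)
        if total_j < best_j:
            best_k, best_j = k, total_j
    return best_k, best_j

# Example: a toy 3-layer profile evaluated at 100 Mbps.
layers = [
    {"local_j": 0.8, "output_bytes": 2e6, "model_bytes": 5e6},
    {"local_j": 1.2, "output_bytes": 5e5, "model_bytes": 20e6},
    {"local_j": 0.4, "output_bytes": 4e3, "model_bytes": 1e6},
]
print(best_partition(layers, input_bytes=6e5, bandwidth_mbps=100))

In practice such a search would be re-run as the measured bandwidth changes, and the incremental part of the paper's approach would begin offloading computation as soon as each uploaded partition becomes available rather than waiting for the full model.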
Pages: 6
Related Papers
50 records in total
  • [11] Energy-Efficient NOMA-Based Mobile Edge Computing Offloading
    Pan, Yijin
    Chen, Ming
    Yang, Zhaohui
    Huang, Nuo
    Shikh-Bahaei, Mohammad
    IEEE COMMUNICATIONS LETTERS, 2019, 23 (02) : 310 - 313
  • [12] Energy-efficient offloading decision-making for mobile edge computing in vehicular networks
    Huang, Xiaoge
    Xu, Ke
    Lai, Chenbin
    Chen, Qianbin
    Zhang, Jie
    EURASIP JOURNAL ON WIRELESS COMMUNICATIONS AND NETWORKING, 2020
  • [13] Energy-Efficient Task Offloading Using Dynamic Voltage Scaling in Mobile Edge Computing
    Li, Song
    Sun, Weibin
    Sun, Yanjing
    Huo, Yu
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2021, 8 (01): : 588 - 598
  • [14] Energy-efficient offloading decision-making for mobile edge computing in vehicular networks
    Huang, Xiaoge
    Xu, Ke
    Lai, Chenbin
    Chen, Qianbin
    Zhang, Jie
    EURASIP JOURNAL ON WIRELESS COMMUNICATIONS AND NETWORKING, 2020, 2020 (01)
  • [15] Joint Task Offloading and Cache Placement for Energy-Efficient Mobile Edge Computing Systems
    Liang, Jingxuan
    Xing, Hong
    Wang, Feng
    Lau, Vincent K. N.
    IEEE WIRELESS COMMUNICATIONS LETTERS, 2023, 12 (04) : 694 - 698
  • [16] Energy-efficient computation offloading strategy with tasks scheduling in edge computing
    Zhang, Yue
    Fu, Jingqi
    WIRELESS NETWORKS, 2021, 27 (01) : 609 - 620
  • [17] Energy-efficient computation offloading strategy with tasks scheduling in edge computing
    Zhang, Yue
    Fu, Jingqi
    WIRELESS NETWORKS, 2021, 27 : 609 - 620
  • [18] Energy-Efficient Task Caching and Offloading Strategy in Mobile Edge Computing Systems
    Chen, Qian
    Liu, Zhoubin
    Ruan, Linna
    Wang, Zixiang
    Shao, Sujie
    Qi, Feng
    SECURITY WITH INTELLIGENT COMPUTING AND BIG-DATA SERVICES, 2020, 895 : 824 - 837
  • [19] Energy-efficient Offloading Policy for Resource Allocation in Distributed Mobile Edge Computing
    Wang, Chang
    Dong, Chongwu
    Qin, Jinghui
    Yang, Xiaoxing
    Wen, Wushao
    2018 IEEE SYMPOSIUM ON COMPUTERS AND COMMUNICATIONS (ISCC), 2018, : 371 - 377
  • [20] Energy-Efficient Heuristic Computation Offloading With Delay Constraints in Mobile Edge Computing
    Mei, Jing
    Tong, Zhao
    Li, Kenli
    Zhang, Lianming
    Li, Keqin
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2023, 16 (06) : 4404 - 4417