Energy-efficient Incremental Offloading of Neural Network Computations in Mobile Edge Computing

Cited by: 2
Authors
Guo, Guangfeng [1,2]
Zhang, Junxing [1 ]
Affiliations
[1] Inner Mongolia Univ, Coll Comp Sci, Hohhot, Peoples R China
[2] Inner Mongolia Univ Sci & Technol, Baotou Teachers Coll, Baotou, Peoples R China
Source
2020 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM) | 2020
Funding
National Natural Science Foundation of China;
Keywords
Mobile Edge Computing; Deep Neural Network; Computation Offloading; Energy Efficient;
DOI
10.1109/GLOBECOM42002.2020.9322504
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Deep Neural Networks (DNNs) have shown remarkable success in Computer Vision and Augmented Reality. However, battery-powered devices still cannot afford to run state-of-the-art DNNs. Mobile Edge Computing (MEC) is a promising approach to running DNNs on energy-constrained mobile devices: it uploads the device's DNN model partitions to the nearest edge server on demand and then offloads DNN computations to the server to save device energy. Nevertheless, the existing all-at-once computation offloading faces two major challenges. The first is how to find the most energy-efficient model partition scheme under different wireless network bandwidths in MEC. The second is how to reduce the time and energy the device spends waiting for the server, since uploading all DNN layers of the optimal partition often takes a long time. To meet these challenges, we propose the following solution. First, we build regression-based energy consumption prediction models by profiling the energy consumption of mobile devices under varied wireless network bandwidths. Then, we present an algorithm that finds the most energy-efficient DNN partition scheme based on the established prediction models and performs incremental computation offloading upon the completion of uploading each DNN partition. Experimental results show that our solution improves energy efficiency compared to the current all-at-once approach. Under 100 Mbps bandwidth, when model uploading takes 1/3 of the total uploading time, the proposed solution reduces energy consumption by around 40%.
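To make the two-step approach described in the abstract concrete, the sketch below illustrates the general idea, not the authors' actual implementation, and it omits the incremental uploading of model partitions: a regression model is fit to hypothetical energy-profiling samples, and the partition point minimizing the predicted device energy is found by exhaustive search over split points. All profiling numbers, layer sizes, and helper names (predict_tx_energy, best_partition) are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's code) of regression-based energy
# prediction plus energy-optimal DNN partition selection.
import numpy as np
from sklearn.linear_model import LinearRegression

# --- Step 1: regression model for transmission energy -----------------------
# Hypothetical profiling samples: (bandwidth in Mbps, payload in MB) -> Joules.
profile_X = np.array([[25, 1.0], [25, 4.0], [50, 1.0], [50, 4.0],
                      [100, 1.0], [100, 4.0]], dtype=float)
profile_y = np.array([0.9, 3.2, 0.5, 1.8, 0.3, 1.0])
tx_energy_model = LinearRegression().fit(profile_X, profile_y)

def predict_tx_energy(bandwidth_mbps: float, size_mb: float) -> float:
    """Predicted device energy (J) to upload `size_mb` at `bandwidth_mbps`."""
    return float(tx_energy_model.predict([[bandwidth_mbps, size_mb]])[0])

# --- Step 2: choose the most energy-efficient partition point ---------------
# Hypothetical per-layer profile: (on-device compute energy in J,
# output activation size in MB) for each DNN layer, front to back.
layers = [(0.20, 3.0), (0.35, 1.5), (0.50, 0.8), (0.60, 0.4), (0.40, 0.1)]

def best_partition(bandwidth_mbps: float):
    """Return (split point k, predicted energy): the first k layers run on
    the device, the output activation of layer k is uploaded, and the rest
    runs on the edge server. k = 0 means uploading the raw input."""
    input_size_mb = 6.0  # hypothetical raw input size
    best_k, best_energy = 0, predict_tx_energy(bandwidth_mbps, input_size_mb)
    local_energy = 0.0
    for k, (compute_j, activation_mb) in enumerate(layers, start=1):
        local_energy += compute_j
        total = local_energy + predict_tx_energy(bandwidth_mbps, activation_mb)
        if total < best_energy:
            best_k, best_energy = k, total
    return best_k, best_energy

print(best_partition(100.0))  # best split under 100 Mbps
```

Under this sketch, higher bandwidth lowers predicted transmission energy and so shifts the best split toward earlier layers (more offloading), which mirrors the bandwidth dependence the abstract highlights.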
Pages: 6
Related Papers
50 records in total
[21]  Energy-Efficient Mobile Edge Hosts for Mobile Edge Computing System [J]. Thananjeyan, Shanmuganathan; Chan, Chien Aun; Wong, Elaine; Nirmalathas, Ampalavanapillai. 2018 IEEE 9TH INTERNATIONAL CONFERENCE ON INFORMATION AND AUTOMATION FOR SUSTAINABILITY (ICIAFS 2018), 2018.
[22]  Energy-efficient offloading framework for mobile edge/cloud computing based on convex optimization and Deep Q-Network [J]. Madiyev, Askar; Bulegenov, Daulet; Karzhaubayev, Anuar; Murzabulatov, Meiram; Bui, Dinh Mao. JOURNAL OF SUPERCOMPUTING, 2025, 81 (11).
[23]  Energy-efficient allocation for multiple tasks in mobile edge computing [J]. Liu, Jun; Liu, Xi. Journal of Cloud Computing, 11.
[24]  Energy-efficient computation offloading and resource allocation for delay-sensitive mobile edge computing [J]. Wang, Quyuan; Guo, Songtao; Liu, Jiadi; Yan, Yuanyuan. SUSTAINABLE COMPUTING-INFORMATICS & SYSTEMS, 2019, 21: 154-164.
[25]  Energy-efficient allocation for multiple tasks in mobile edge computing [J]. Liu, Jun; Liu, Xi. JOURNAL OF CLOUD COMPUTING-ADVANCES SYSTEMS AND APPLICATIONS, 2022, 11 (01).
[26]  A Q-learning based Method for Energy-Efficient Computation Offloading in Mobile Edge Computing [J]. Jiang, Kai; Zhou, Huan; Li, Dawei; Liu, Xuxun; Xu, Shouzhi. 2020 29TH INTERNATIONAL CONFERENCE ON COMPUTER COMMUNICATIONS AND NETWORKS (ICCCN 2020), 2020.
[27]  Energy-Efficient Computation Offloading in Collaborative Edge Computing [J]. Lin, Rongping; Xie, Tianze; Luo, Shan; Zhang, Xiaoning; Xiao, Yong; Moran, Bill; Zukerman, Moshe. IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (21): 21305-21322.
[28]  Energy-Efficient Secure Computation Offloading in Wireless Powered Mobile Edge Computing Systems [J]. Wu, Mengru; Song, Qingyang; Guo, Lei; Lee, Inkyu. IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2023, 72 (05): 6907-6912.
[29]  Energy-Efficient Offloading for Mobile Edge Computing in 5G Heterogeneous Networks [J]. Zhang, Ke; Mao, Yuming; Leng, Supeng; Zhao, Quanxin; Li, Longjiang; Peng, Xin; Pan, Li; Maharjan, Sabita; Zhang, Yan. IEEE ACCESS, 2016, 4: 5896-5907.
[30]  Deep Reinforcement Learning for Energy-Efficient Computation Offloading in Mobile-Edge Computing [J]. Zhou, Huan; Jiang, Kai; Liu, Xuxun; Li, Xiuhua; Leung, Victor C. M. IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (02): 1517-1530.