Energy-efficient Federated Learning via Stabilization-aware On-device Update Scaling

Cited by: 1
Authors
Xu, Suwei [1 ]
Jin, Yibo [1 ]
Qian, Zhuzhong [1 ]
Zhang, Sheng [1 ]
Zhao, Ming [2 ]
Lin, Zhenjie [2 ]
Lin, Qiang [2 ]
Wang, Liming [2 ]
Affiliations
[1] Nanjing Univ, State Key Lab Novel Software Technol, Nanjing, Peoples R China
[2] China Southern Power Grid Shenzhen Digital Power, Guangzhou, Peoples R China
Source
2022 19TH ANNUAL IEEE INTERNATIONAL CONFERENCE ON SENSING, COMMUNICATION, AND NETWORKING (SECON) | 2022
Funding
US National Science Foundation;
Keywords
OPTIMIZATION; EDGE;
DOI
10.1109/SECON55815.2022.9918541
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
Federated learning is emerging as a major learning paradigm, enabling multiple devices to train a model collaboratively while keeping their data private. However, devices perform substantial computation-intensive iterations before training completes, which incurs heavy energy consumption. As the model parameters stabilize over the course of training, these on-device iterations gradually become redundant. We therefore propose to scale the update results obtained from a reduced number of iterations as a substitute for full on-device training, based on the current model status and device heterogeneity. We formulate a time-varying integer program that minimizes cumulative energy consumption across devices, subject to a long-term constraint on model convergence. We then design a polynomial-time online algorithm that adapts to system dynamics and balances energy consumption against the quality of the trained model. Via rigorous proofs, we show that our approach incurs only sublinear regret compared with the optimum and ensures model convergence. Extensive testbed experiments on real training workloads confirm the superiority of our approach over multiple alternatives under various scenarios, reducing energy consumption by at least 30.2% while preserving model accuracy.
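The core idea described in the abstract — substituting a scaled short run of local training for the full run once parameters stabilize — can be illustrated with a minimal sketch. This is not the paper's actual algorithm; the function name, the simple linear scaling factor, and the toy objective are all illustrative assumptions:

```python
import numpy as np

def scaled_local_update(w_global, grad_fn, lr=0.1, full_steps=10, reduced_steps=2):
    """Illustrative sketch: run only `reduced_steps` local SGD iterations,
    then scale the resulting update to stand in for `full_steps` of
    on-device training, saving the energy of the skipped iterations."""
    w = w_global.copy()
    for _ in range(reduced_steps):
        w -= lr * grad_fn(w)          # short local SGD run
    delta = w - w_global              # update produced by the short run
    scale = full_steps / reduced_steps  # naive linear extrapolation factor
    return w_global + scale * delta

# Toy quadratic objective f(w) = 0.5 * ||w||^2 with gradient grad(w) = w.
grad = lambda w: w
w0 = np.array([1.0, -2.0])
w_scaled = scaled_local_update(w0, grad)  # moves toward the minimum at 0
```

On this toy objective, two steps shrink the iterate to 0.81·w0, and scaling that update by 5 lands at 0.05·w0, close to where ten full steps would. In the paper's setting the scaling factor is chosen online per device from the model's stabilization status and device heterogeneity, rather than fixed as above.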
Pages: 190-198 (9 pages)