Asymptotic properties of dual averaging algorithm for constrained distributed stochastic optimization

Cited by: 1
Authors
Zhao, Shengchao [1 ]
Chen, Xing-Min [1 ]
Liu, Yongchao [1 ]
Affiliations
[1] Dalian Univ Technol, Sch Math Sci, Dalian 116024, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Constrained distributed stochastic optimization; Distributed dual averaging method; Almost sure convergence; Asymptotic normality; Asymptotic efficiency; MULTIAGENT OPTIMIZATION; RESOURCE-ALLOCATION; RANDOM NETWORKS; APPROXIMATION; CONVERGENCE;
DOI
10.1016/j.sysconle.2022.105252
CLC classification
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
Considering the constrained stochastic optimization problem over a time-varying random network, where the agents collectively minimize a sum of objective functions subject to a common constraint set, we investigate asymptotic properties of a distributed algorithm based on dual averaging of gradients. Unlike most existing work on distributed dual averaging algorithms, which focuses mainly on non-asymptotic properties, we prove not only almost sure convergence and the rate of almost sure convergence, but also asymptotic normality and asymptotic efficiency of the algorithm. First, for a general constrained convex optimization problem distributed over a random network, we prove that almost sure consensus is achieved and that the agents' estimates converge to the same optimal point. For the case of linearly constrained convex optimization, we show that the mirror map of the averaged dual sequence identifies the active constraints at the optimal solution with probability 1, which allows us to prove the almost sure convergence rate and then establish asymptotic normality of the algorithm. Furthermore, we verify that the algorithm is asymptotically optimal. To the best of our knowledge, this is the first asymptotic normality result for constrained distributed optimization algorithms. Finally, a numerical example is provided to support the theoretical analysis. (C) 2022 Elsevier B.V. All rights reserved.
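The abstract describes a distributed dual averaging scheme: each agent mixes its neighbors' accumulated-gradient (dual) variables through a doubly stochastic matrix, adds its own new gradient, and recovers a feasible primal iterate through a mirror map. The following is a minimal sketch of that pattern under simplifying assumptions not taken from the paper: a Euclidean mirror map (so the mirror step is a projection), deterministic gradients on a toy quadratic problem, and a fixed complete communication graph rather than a time-varying random network. All function names and the toy problem are illustrative.

```python
import numpy as np

def distributed_dual_averaging(grads, proj, W, d, T,
                               step=lambda t: 1.0 / np.sqrt(t + 1)):
    """Euclidean distributed dual averaging sketch.

    grads : list of per-agent gradient oracles g_i(x)
    proj  : Euclidean projection onto the common constraint set X
    W     : n-by-n doubly stochastic mixing matrix
    """
    n = len(grads)
    z = np.zeros((n, d))                    # dual variables (gradient sums)
    x = np.stack([proj(np.zeros(d))] * n)   # feasible primal starting points
    for t in range(T):
        g = np.stack([grads[i](x[i]) for i in range(n)])
        z = W @ z + g                       # mix neighbors' duals, add gradient
        a = step(t)
        # Euclidean mirror map: x_i = argmin_{x in X} <z_i, x> + ||x||^2 / (2a)
        x = np.stack([proj(-a * z[i]) for i in range(n)])
    return x

# Toy problem: f_i(x) = 0.5 * ||x - a_i||^2 over the box X = [0, 1]^2,
# so the optimum is the projected mean of the targets a_i.
rng = np.random.default_rng(0)
targets = rng.uniform(-0.5, 1.5, size=(4, 2))
grads = [lambda x, a=a: x - a for a in targets]
proj = lambda v: np.clip(v, 0.0, 1.0)
W = np.full((4, 4), 0.25)                   # complete graph, uniform weights
x = distributed_dual_averaging(grads, proj, W, d=2, T=2000)
x_star = np.clip(targets.mean(axis=0), 0.0, 1.0)
```

With the diminishing step size above, the agents' iterates reach approximate consensus and approach the common optimizer, consistent with the O(1/sqrt(t)) behavior typical of dual averaging; the paper's stochastic, time-varying-network setting replaces the fixed W with random mixing matrices and the exact gradients with noisy ones.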
Pages: 14