Parallel computing subgradient method for nonsmooth convex optimization over the intersection of fixed point sets of nonexpansive mappings

Cited by: 0
Authors
Hideaki Iiduka
Affiliations
[1] Meiji University, Department of Computer Science
Source
Fixed Point Theory and Applications, Volume 2015
Keywords
fixed point; Krasnosel’skiĭ-Mann algorithm; nonexpansive mapping; nonsmooth convex optimization; parallel algorithm; subgradient; 65K05; 90C25; 90C90;
DOI
Not available
Abstract
Nonsmooth convex optimization problems over the fixed point sets of nonexpansive mappings are solved by using a distributed optimization technique. This is done for a networked system consisting of an operator, who manages the system, and a finite number of users: the task is to minimize the sum of the operator's and users' nondifferentiable, convex objective functions over the intersection of the operator's and users' convex constraint sets in a real Hilbert space. We assume that each of these constraint sets can be expressed as the fixed point set of an implementable nonexpansive mapping. This setting allows us to treat nonsmooth convex optimization problems in which the metric projection onto the constraint set cannot be computed explicitly. We propose a parallel subgradient algorithm for solving the problem that exploits the operator's ability to communicate with all users. In contrast to conventional parallel algorithms for nonsmooth convex optimization, the proposed algorithm does not use any proximity operators. We first study its convergence for a constant step-size rule; the analysis indicates that the algorithm with a small constant step size approximates a solution to the problem. We next consider a diminishing step-size sequence and prove that the sequence generated by the algorithm has a subsequence that converges weakly to a solution to the problem. We also give numerical examples to support the convergence analyses.
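The paper's exact iteration and assumptions are given in the article itself; the following is only a minimal Python sketch of the kind of scheme the abstract describes, under illustrative assumptions: each user's nonsmooth objective is an l1-distance to an assumed anchor point, each constraint is encoded by the metric projection onto an assumed Euclidean ball (an implementable nonexpansive mapping whose fixed point set is the ball), and the operator combines the users' subgradient/fixed-point updates with an assumed Krasnosel'skii-Mann-style average. All data and parameters below are hypothetical, not taken from the paper.

```python
# Minimal illustrative sketch (not the paper's exact algorithm):
# minimize sum_i f_i(x) over the intersection of Fix(T_i),
# where f_i(x) = ||x - anchors[i]||_1 and T_i is projection onto a ball.
import numpy as np

rng = np.random.default_rng(0)
dim, num_users = 5, 3
anchors = rng.normal(size=(num_users, dim))             # assumed data defining f_i
centers = rng.normal(scale=0.1, size=(num_users, dim))  # assumed ball centers
radius = 2.0                                            # assumed ball radius

def subgrad(i, x):
    """A subgradient of f_i(x) = ||x - anchors[i]||_1."""
    return np.sign(x - anchors[i])

def T(i, x):
    """Metric projection onto the ball B(centers[i], radius):
    a nonexpansive mapping whose fixed point set is the ball itself."""
    d = x - centers[i]
    nrm = np.linalg.norm(d)
    return x if nrm <= radius else centers[i] + radius * d / nrm

x = np.zeros(dim)
for n in range(1, 2001):
    step = 1.0 / n  # diminishing step-size sequence
    # Each user i can form its update independently (in parallel);
    # the operator then averages them in a Krasnosel'skii-Mann-style relaxation.
    updates = [T(i, x - step * subgrad(i, x)) for i in range(num_users)]
    x = 0.5 * x + 0.5 * np.mean(updates, axis=0)

print("approximate solution:", np.round(x, 3))
print("objective value:", sum(np.abs(x - anchors[i]).sum() for i in range(num_users)))
```

The diminishing step size 1/n mirrors the second step-size rule mentioned in the abstract; with a small constant step size instead, the iterate would only be expected to approximate a solution rather than converge to one.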