A FAST DUAL GRADIENT METHOD FOR SEPARABLE CONVEX OPTIMIZATION VIA SMOOTHING
Cited by: 0
Authors:
Li, Jueyou [1]; Wu, Zhiyou [1]; Wu, Changzhi [2]; Long, Qiang [3]; Wang, Xiangyu [2]; Lee, Jae-Myung [4,5,6]; Jung, Kwang-Hyo [4]
Affiliations:
[1] Chongqing Normal Univ, Sch Math Sci, Chongqing 400047, Peoples R China
[2] Curtin Univ, Sch Built Environm, Australasian Joint Res Ctr Bldg Informat Modellin, Perth, WA 6845, Australia
[3] Southwest Univ Sci & Technol, Sch Sci, Mianyang 621010, Sichuan, Peoples R China
[4] Pusan Natl Univ, Dept Naval Architecture & Ocean Engn, Busan, South Korea
[5] Pusan Natl Univ, Integrat Grad Program Ship & Offshore Plant Techn, BK21Plus, Busan, South Korea
[6] Pusan Natl Univ, Cryogen Mat Res Inst, Busan, South Korea
Source:
PACIFIC JOURNAL OF OPTIMIZATION | 2016 / Vol. 12 / No. 2
Funding:
National Research Foundation of Singapore;
Australian Research Council;
Keywords:
convex optimization;
dual decomposition;
smoothing technique;
fast gradient method;
parallel computation;
LAGRANGIAN DECOMPOSITION;
NEURAL-NETWORKS;
MINIMIZATION;
STABILITY;
DOI:
Not available
Chinese Library Classification:
C93 [Management]; O22 [Operations Research];
Discipline codes:
070105; 12; 1201; 1202; 120202;
Abstract:
This paper considers a class of separable convex optimization problems with linearly coupled constraints that arise in many applications. Based on a novel smoothing technique, a simple fast dual gradient method is presented for solving this class of problems. The convergence of the proposed method is proved, and an explicit computational complexity bound for reaching an approximately optimal solution is derived. An improved iteration complexity bound is obtained when the objective function is strongly convex. The algorithm is simple and fast and can be implemented in a parallel fashion. Numerical experiments on a network utility maximization problem illustrate the effectiveness of the proposed algorithm.
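The abstract does not spell out the algorithmic details, but the general dual-smoothing scheme it refers to can be illustrated with a short sketch: decompose the problem over blocks, add a small quadratic smoothing term so the dual gradient becomes Lipschitz, solve the block subproblems independently (hence in parallel), and update the multipliers with a Nesterov-type fast gradient step. The sketch below is a minimal illustration under assumed data and parameters (the quadratic blocks, matrices A_i, vector b, smoothing parameter mu, and step size are all hypothetical); it is not the authors' algorithm or code.

```python
# Minimal sketch (not the authors' code) of a fast dual gradient method with
# smoothing for a separable problem
#     min  sum_i f_i(x_i)   s.t.  sum_i A_i x_i = b,
# illustrated with quadratic blocks f_i(x_i) = 0.5*||x_i - c_i||^2.
# All data and parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
m, n_blocks, n_i = 5, 4, 3                      # coupling rows, number of blocks, block size
A = [rng.standard_normal((m, n_i)) for _ in range(n_blocks)]
c = [rng.standard_normal(n_i) for _ in range(n_blocks)]
b = rng.standard_normal(m)

mu = 1e-3                                       # smoothing parameter (assumed value)

def argmin_block(i, lam):
    """Closed-form minimizer of 0.5||x - c_i||^2 + lam^T A_i x + (mu/2)||x||^2.
    Each block is solved independently, so this step parallelizes."""
    return (c[i] - A[i].T @ lam) / (1.0 + mu)

def smoothed_dual_grad(lam):
    """Gradient of the smoothed dual function: the coupling-constraint residual."""
    x = [argmin_block(i, lam) for i in range(n_blocks)]
    return sum(A[i] @ x[i] for i in range(n_blocks)) - b, x

# Nesterov-type fast gradient ascent on the smoothed dual.
L = sum(np.linalg.norm(A[i], 2) ** 2 for i in range(n_blocks)) / (1.0 + mu)
lam = y = np.zeros(m)
t = 1.0
for _ in range(500):
    g, _x = smoothed_dual_grad(y)
    lam_new = y + g / L                         # ascent step with step size 1/L
    t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    y = lam_new + (t - 1.0) / t_new * (lam_new - lam)
    lam, t = lam_new, t_new

g_final, x = smoothed_dual_grad(lam)
print("coupling-constraint residual at the final multiplier:", np.linalg.norm(g_final))
```

As in smoothing-based analyses generally, the choice of mu trades off smoothness of the dual (and hence the usable step size) against the approximation error in the recovered primal solution; in practice it is tied to the target accuracy.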
Pages: 289+
Number of pages: 17