Tight global linear convergence rate bounds for Douglas-Rachford splitting

Cited by: 47
Authors
Giselsson, Pontus [1 ]
Affiliations
[1] Lund Univ, Dept Automat Control, Box 118, SE-22100 Lund, Sweden
Keywords
Douglas-Rachford splitting; Linear convergence; Monotone operators; Fixed-point iterations; ALTERNATING DIRECTION METHOD; PROJECTIONS; MULTIPLIERS; ALGORITHMS; ADMM;
DOI
10.1007/s11784-017-0417-1
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
Recently, several authors have shown local and global convergence rate results for Douglas-Rachford splitting under strong monotonicity, Lipschitz continuity, and cocoercivity assumptions. Most of these focus on the convex optimization setting. In the more general monotone inclusion setting, Lions and Mercier showed a linear convergence rate bound under the assumption that one of the two operators is strongly monotone and Lipschitz continuous. We show that this bound is not tight, meaning that no problem from the considered class converges exactly with that rate. In this paper, we present tight global linear convergence rate bounds for that class of problems. We also provide tight linear convergence rate bounds under the assumptions that one of the operators is strongly monotone and cocoercive, and that one of the operators is strongly monotone and the other is cocoercive. All our linear convergence results are obtained by proving the stronger property that the Douglas-Rachford operator is contractive.
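The abstract's key claim is that, under the stated assumptions, the Douglas-Rachford operator is contractive, which immediately gives linear convergence of the fixed-point iteration. As a minimal numerical sketch (not from the paper), the snippet below runs unrelaxed Douglas-Rachford splitting on two one-dimensional quadratics, where the subdifferential of `f(x) = (a/2)x^2` is strongly monotone and Lipschitz; the problem instance and parameter choices are illustrative assumptions.

```python
def prox_f(z, gamma, a):
    # Proximal operator of f(x) = (a/2) x^2; its gradient a*x is
    # a-strongly monotone and a-Lipschitz, matching the abstract's setting.
    return z / (1 + gamma * a)

def prox_g(z, gamma, b):
    # Proximal operator of g(x) = (b/2) (x - 1)^2.
    return (z + gamma * b) / (1 + gamma * b)

def douglas_rachford(z0, gamma=1.0, a=2.0, b=1.0, iters=50):
    # Unrelaxed Douglas-Rachford iteration: z <- z + prox_g(2x - z) - x,
    # with x = prox_f(z). Contractivity of the DR operator makes the
    # distance to the fixed point shrink by a constant factor per step.
    z = z0
    for _ in range(iters):
        x = prox_f(z, gamma, a)
        y = prox_g(2 * x - z, gamma, b)
        z = z + (y - x)
    return prox_f(z, gamma, a)  # recover the primal solution

# Minimizer of f + g solves a*x + b*(x - 1) = 0, i.e. x* = b / (a + b).
x_star = douglas_rachford(10.0)
```

For these parameters the DR map reduces to `z -> z/2 + 1/2`, a contraction with factor 1/2, so the iterates converge linearly to the fixed point `z* = 1` and the recovered solution is `x* = 1/3`.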
Pages: 2241-2270
Page count: 30
Cited References
33 items
[1]  
[Anonymous], 1989, THESIS
[2]   Optimal Rates of Linear Convergence of Relaxed Alternating Projections and Generalized Douglas-Rachford Methods for Two Subspaces [J].
Bauschke, Heinz H. ;
Bello Cruz, J. Y. ;
Nghia, Tran T. A. ;
Phan, Hung M. ;
Wang, Xianfu .
NUMERICAL ALGORITHMS, 2016, 73 (01) :33-76
[3]   The rate of linear convergence of the Douglas-Rachford algorithm for subspaces is the cosine of the Friedrichs angle [J].
Bauschke, Heinz H. ;
Bello Cruz, J. Y. ;
Nghia, Tran T. A. ;
Phan, Hung M. ;
Wang, Xianfu .
JOURNAL OF APPROXIMATION THEORY, 2014, 185 :63-79
[4]  
Bauschke HH, 2011, CMS BOOKS MATH, P1, DOI 10.1007/978-1-4419-9467-7
[5]   LOCAL LINEAR CONVERGENCE OF THE ALTERNATING DIRECTION METHOD OF MULTIPLIERS ON QUADRATIC OR LINEAR PROGRAMS [J].
Boley, Daniel .
SIAM JOURNAL ON OPTIMIZATION, 2013, 23 (04) :2183-2207
[6]   Distributed optimization and statistical learning via the alternating direction method of multipliers [J].
Boyd S. ;
Parikh N. ;
Chu E. ;
Peleato B. ;
Eckstein J. .
Foundations and Trends in Machine Learning, 2010, 3 (01) :1-122
[7]   Compositions and convex combinations of averaged nonexpansive operators [J].
Combettes, Patrick L. ;
Yamada, Isao .
JOURNAL OF MATHEMATICAL ANALYSIS AND APPLICATIONS, 2015, 425 (01) :55-70
[8]   Proximal Splitting Methods in Signal Processing [J].
Combettes, Patrick L. ;
Pesquet, Jean-Christophe .
FIXED-POINT ALGORITHMS FOR INVERSE PROBLEMS IN SCIENCE AND ENGINEERING, 2011, 49 :185+
[9]  
Davis D., 2014, FASTER CONVERGENCE R
[10]  
Davis D, 2016, SCI COMPUT, P115, DOI 10.1007/978-3-319-41589-5_4