Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods

Cited: 146
Authors
Drusvyatskiy, Dmitriy [1 ]
Lewis, Adrian S. [2 ]
Affiliations
[1] Univ Washington, Dept Math, Seattle, WA 98195 USA
[2] Cornell Univ, Sch Operat Res & Informat Engn, Ithaca, NY 14853 USA
Funding
US National Science Foundation
Keywords
proximal algorithm; error bound; quadratic growth; linear convergence; subregularity; subdifferential; tilt-stability; 2ND-ORDER VARIATIONAL ANALYSIS; TILT STABILITY; METRIC REGULARITY; ALGORITHM; OPTIMALITY; CALCULUS;
DOI
10.1287/moor.2017.0889
CLC classification codes
C93 [Management]; O22 [Operations Research]
Discipline codes
070105; 12; 1201; 1202; 120202
Abstract
The proximal gradient algorithm for minimizing the sum of a smooth and a nonsmooth convex function often converges linearly even without strong convexity. One common reason is that a multiple of the step length at each iteration may linearly bound the "error" (the distance to the solution set). We explain the observed linear convergence intuitively by proving the equivalence of such an error bound to a natural quadratic growth condition. Our approach generalizes to linear and quadratic convergence analysis for proximal methods (of Gauss-Newton type) for minimizing compositions of nonsmooth functions with smooth mappings. We observe incidentally that short step lengths in the algorithm indicate near-stationarity, suggesting a reliable termination criterion.
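The composite setting described in the abstract can be made concrete with a minimal sketch. The following is an illustrative implementation (not the authors' code) of proximal gradient descent on the l1-regularized least-squares problem, a standard instance of minimizing a smooth term plus a nonsmooth convex term; the step-length-based stopping rule mirrors the termination criterion the abstract suggests. All function names here are the sketch's own.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, step, max_iter=5000, tol=1e-8):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient.

    Terminates when the step length ||x_{k+1} - x_k|| drops below `tol`,
    following the observation that a short step certifies near-stationarity.
    """
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)                      # gradient of the smooth part
        x_new = soft_threshold(x - step * grad, step * lam)
        if np.linalg.norm(x_new - x) < tol:           # short step => near-stationary
            return x_new
        x = x_new
    return x
```

With `step = 1 / ||A||_2**2` (the reciprocal of the Lipschitz constant of the smooth part's gradient), this iteration converges linearly on such problems despite the absence of strong convexity, which is exactly the phenomenon the error-bound/quadratic-growth equivalence explains.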
Pages: 919-948
Page count: 30