From Halpern's fixed-point iterations to Nesterov's accelerated interpretations for root-finding problems

Cited by: 6
Authors
Tran-Dinh, Quoc [1]
Affiliations
[1] Univ North Carolina Chapel Hill, Dept Stat & Operat Res, 333 Hanes Hall, CB 3260, Chapel Hill, NC 27599 USA
Funding
U.S. National Science Foundation;
Keywords
Halpern's fixed-point iteration; Nesterov's accelerated method; Co-coercive equation; Maximally monotone inclusion; Extra-anchored gradient method; Monotone operators; Convergence; Algorithms; Approximation;
DOI
10.1007/s10589-023-00518-8
Chinese Library Classification
C93 [Management Science]; O22 [Operations Research];
Subject Classification
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
We derive an equivalent form of Halpern's fixed-point iteration scheme for solving a co-coercive equation (also called a root-finding problem), which can be viewed as a Nesterov accelerated interpretation. We show that one method is equivalent to the other via a simple transformation, leading to a straightforward convergence proof for Nesterov's accelerated scheme. Alternatively, we directly establish convergence rates of the Nesterov accelerated variant and, as a consequence, obtain a new convergence rate for Halpern's fixed-point iteration. Next, we apply our results to different methods for solving monotone inclusions, where our convergence guarantees carry over. Since the gradient/forward scheme requires co-coerciveness of the underlying operator, we derive new Nesterov accelerated variants of both the recent extra-anchored gradient and past-extra-anchored gradient methods from the literature. These variants relax the co-coerciveness condition, assuming only monotonicity and Lipschitz continuity of the underlying operator. Interestingly, our new Nesterov accelerated interpretation of the past-extra-anchored gradient method involves two past-iterate correction terms. This formulation is expected to guide the development of new Nesterov accelerated methods for minimax problems and their continuous-time views without co-coerciveness. We test our theoretical results on two numerical examples, where the actual convergence rates match the theoretical ones well, up to a constant factor.
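As a rough illustration of the scheme named in the abstract, the following Python sketch runs the classical Halpern iteration x^{k+1} = beta_k * x^0 + (1 - beta_k) * T(x^k), with the standard anchoring weights beta_k = 1/(k+2), on a toy co-coercive equation G(x) = Ax - b = 0. The operator G, the matrix A, and the helper halpern are illustrative assumptions for this sketch, not the paper's specific accelerated variant.

import numpy as np

def halpern(T, x0, iters=5000):
    # Classical Halpern iteration for a nonexpansive map T:
    #   x_{k+1} = beta_k * x0 + (1 - beta_k) * T(x_k),
    # with the standard anchoring weights beta_k = 1/(k+2).
    x = x0.copy()
    for k in range(iters):
        beta = 1.0 / (k + 2)
        x = beta * x0 + (1.0 - beta) * T(x)
    return x

# Toy co-coercive equation G(x) = A x - b = 0, with A symmetric positive
# definite. Then G is (1/L)-co-coercive for L = ||A||_2, and the map
# T = I - G/L is nonexpansive, so Halpern's scheme applies.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([2.0, 1.0])
L = np.linalg.norm(A, 2)
T = lambda x: x - (A @ x - b) / L

print(halpern(T, np.zeros(2)))  # converges toward the root [1., 1.]

Under these assumptions the residual norm ||G(x^k)|| is known to decay at an O(1/k) rate, which is the type of guarantee the paper sharpens via its Nesterov-style reformulation.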
Pages: 181-218
Page count: 38