ε-subgradient algorithms for bilevel convex optimization

Cited: 12
Authors
Helou, Elias S. [1 ]
Simoes, Lucas E. A. [2 ]
Affiliations
[1] Univ Sao Paulo, Dept Appl Math & Stat, Inst Math Sci & Computat, Sao Carlos, SP, Brazil
[2] Univ Estadual Campinas, Inst Math Stat & Sci Comp, Campinas, SP, Brazil
Funding
Sao Paulo Research Foundation (FAPESP), Brazil;
Keywords
epsilon-subgradient methods; bilevel optimization; tomographic image reconstruction; LARGE UNDERDETERMINED SYSTEMS; PROXIMAL POINT ALGORITHM; GRADIENT METHODS; MINIMIZATION; CONVERGENCE; TOMOGRAPHY; EQUATIONS; NORM;
DOI
10.1088/1361-6420/aa6136
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
This paper introduces and studies the convergence properties of a new class of explicit ε-subgradient methods for the task of minimizing a convex function over the set of minimizers of another convex minimization problem. The general algorithm specializes to some important cases, such as first-order methods applied to a varying objective function, which have computationally cheap iterations. We present numerical experiments on applications where the theoretical framework encompasses efficient algorithmic techniques, enabling the use of the resulting methods to solve very large practical problems arising in tomographic image reconstruction.
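The hierarchical setting described in the abstract can be illustrated with a common penalization scheme: take subgradient steps on the inner objective plus a slowly vanishing multiple of the outer objective. The sketch below is a minimal toy illustration of that idea, not the paper's algorithm; the problem instance (an underdetermined least-squares inner problem with a minimal-norm outer objective), the step-size and penalty schedules, and the iteration count are all illustrative assumptions.

```python
import numpy as np

# Toy bilevel problem: among the least-squares solutions of an
# underdetermined system A x = b (inner problem), find the one of
# minimal Euclidean norm (outer problem).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 6))    # 3 equations, 6 unknowns
b = rng.standard_normal(3)

x = np.zeros(6)
for k in range(1, 20001):
    sigma = 1.0 / np.sqrt(k)       # slowly vanishing weight on the outer objective
    alpha = 0.05 / np.sqrt(k)      # diminishing step size
    # (Sub)gradient of 0.5*||A x - b||^2 + 0.5*sigma*||x||^2
    g = A.T @ (A @ x - b) + sigma * x
    x = x - alpha * g

x_star = np.linalg.pinv(A) @ b     # exact minimal-norm least-squares solution
print(np.linalg.norm(x - x_star))
```

Because the penalty weight vanishes more slowly than the step size, the iterates first approach the inner solution set and then drift along it toward the outer-optimal point, here the pseudoinverse solution.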
Pages: 33